Compare commits

...

113 Commits

Author SHA1 Message Date
fd56768fdc Bump version to 0.88.1 (#11303) 2023-12-14 18:14:47 +01:00
070afa4528 Revert lock file changes due to openssl build fail (#11328)
A theoretically semver-compatible version bump of dependencies in the
`Cargo.lock` was part of #11302. The particular upstream OpenSSL version
pulled in by that change (3.2.0) fails to build on our RISC-V build config:

- https://github.com/nushell/nushell/pull/11302#issuecomment-1856011363
-
https://github.com/nushell/nightly/actions/runs/7210339119/job/19643253780

So far, no updated openssl release is available, and it is unclear how
the runner needs to be configured to avoid the issue described in:

- https://github.com/openssl/openssl/issues/22871

Out of an abundance of caution, let's revert this version bump and try
to figure out a solution later, so we can ship our `0.88.1` patch
release.
2023-12-14 18:09:28 +01:00
23fec8eb0d Fix piping output logic (#11317)
# Description
Fixes: #11295

Sorry for introducing this issue. It was caused by wrongly setting
`redirect_stdout` and `redirect_stderr` during eval. Take the following
as an example:
```nushell
ls | bat --paging always
```
When running `bat --paging always`, `redirect_stdout` should be `false`.
But before this PR it was set to `true` because of the `ls` command, and
that `true` value then carried over to all remaining commands.

# User-Facing Changes
N/A

# Tests + Formatting
Sorry, I don't think we have a way to test this, because it needs to be
tested with an interactive command like `nvim`.

# After Submitting
N/A
2023-12-13 13:16:23 -06:00
c9841f67e9 fix-open-is-ambiguous (#11302)
# Description
Running `cargo install nu` on rustc 1.71.1 (eb26296b5 2023-08-03)
produced the following error message when reaching `nu-command v0.88.0`:
```sh
error[E0659]: `open` is ambiguous
  --> /home/antoine/.cargo/registry/src/index.crates.io-6f17d22bba15001f/nu-command-0.88.0/src/stor/mod.rs:30:9
   |
30 | pub use open::StorOpen;
   |         ^^^^ ambiguous name
   |
   = note: ambiguous because of multiple potential import sources
   = note: `open` could refer to a crate passed with `--extern`
   = help: use `::open` to refer to this crate unambiguously
note: `open` could also refer to the module defined here
  --> /home/antoine/.cargo/registry/src/index.crates.io-6f17d22bba15001f/nu-command-0.88.0/src/stor/mod.rs:12:1
   |
12 | mod open;
   | ^^^^^^^^^
   = help: use `self::open` to refer to this module unambiguously

   Compiling nu-cli v0.88.0
   Compiling nu-lsp v0.88.0
For more information about this error, try `rustc --explain E0659`.
error: could not compile `nu-command` (lib) due to previous error
warning: build failed, waiting for other jobs to finish...
error: failed to compile `nu v0.88.0`, intermediate artifacts can be found at `/tmp/cargo-installEUsfIr`
```

# User-Facing Changes
No user-facing changes.
2023-12-13 13:08:00 -06:00
76482cc1b2 Move stor commands to category Database (#11315)
Fixes #11309
2023-12-13 16:24:16 +01:00
d43f4253e8 Bump version for 0.88.0 release (#11298)
- [x] reedline
  - [x] released
  - [x] pinned
- [x] git dependency check
- [x] release notes
2023-12-13 06:31:14 +13:00
0fba08808c bump reedline dep to 0.27 (#11299)
2023-12-13 06:30:58 +13:00
b3a52a247f 📝 Fix logical error in help glob (#11286) 2023-12-11 06:42:55 -06:00
4763801cb2 add nothing -> table to format date (#11290)
This will allow running
```nushell
format date --list | get 0
```
and get
```
─────────────┬───────────────────────────────────────────────────────────
Specification│%Y
Example      │2023
Description  │The full proleptic Gregorian year, zero-padded to 4 digits.
─────────────┴───────────────────────────────────────────────────────────
```
instead of the current error:
```
Error: nu::parser::input_type_mismatch

  × Command does not support string input.
   ╭─[entry #2:1:1]
 1 │ format date --list | get 0
   ·                      ─┬─
   ·                       ╰── command doesn't support string input
   ╰────
```
2023-12-11 13:21:17 +01:00
ecb3b3a364 Ensure that command usage starts uppercase and ends period (#11278)
# Description

This repeats #8268 to make all command usage strings start with an
uppercase letter and end with a period per #5056

Adds a test to ensure that commands won't regress

Part of #5066

# User-Facing Changes

Command usage is now consistent

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

Automatic documentation updates
2023-12-10 08:28:54 -06:00
3e5f81ae14 Convert remainder of ShellError variants to named fields (#11276)
# Description

Removed variants that are no longer in use:
* `NoFile*`
* `UnexpectedAbbrComponent`

Converted:
* `OutsideSpannedLabeledError`
* `EvalBlockWithInput`
* `Break`
* `Continue`
* `Return`
* `NotAConstant`
* `NotAConstCommand`
* `NotAConstHelp`
* `InvalidGlobPattern`
* `ErrorExpandingGlob`

Fixes #10700 

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-12-09 18:46:21 -06:00
ca05553fc6 Simplify clear implementation (#11273)
# Description
This PR uses the `crossterm` API to reimplement the `clear` command,
since `crossterm` is cross-platform.
This seems to work on Linux and Windows.

# User-Facing Changes
N/A

# Tests + Formatting
- [x] `cargo fmt --all -- --check` to check standard code formatting
(`cargo fmt --all` applies these changes)
- [x] `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used`
to check that you're using the standard code style
- [x] `cargo test --workspace` to check that all tests pass (on Windows
make sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- [x] `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library

# After Submitting
N/A
2023-12-09 15:24:19 -06:00
fa5d7babb9 Fix replacement closures for update, insert, and upsert (#11258)
# Description
This PR addresses #11204 which points out that using a closure for the
replacement value with `update`, `insert`, or `upsert` does not work for
lists.

# User-Facing Changes
- Replacement closures should now work for lists in `upsert`, `insert`,
and `update`. E.g., `[0] | update 0 {|i| $i + 1 }` now gives `[1]`
instead of an unhelpful error.
- `[1 2] | insert 4 20` no longer works. Before, this would give `[1, 2,
null, null, 20]`, but now it gives an error. This was done to match the
intended behavior in `Value::insert_data_at_cell_path`, whereas the
behavior before was probably unintentional. Following
`Value::insert_data_at_cell_path`, inserting at the end of a list is
also fine, so the valid indices for `upsert` and `insert` are
`0..=length` just like `Vec::insert` or list inserts in other languages.

# Tests + Formatting
Added tests for `upsert`, `insert`, and `update`:
- Replacement closures for lists, list streams, records, and tables
- Other list stream tests
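
As a quick illustration of the behaviors described above, here is a
hypothetical sketch (outputs paraphrased from the description, not taken
from the PR itself):

```nushell
# replacement closures now work on lists (example from the description)
[0] | update 0 {|i| $i + 1 }   # => [1]

# inserting at index == length (the end of the list) is still valid
[1 2] | insert 2 3             # => [1, 2, 3]

# inserting past the end now errors instead of padding with nulls
[1 2] | insert 4 20            # error (previously [1, 2, null, null, 20])
```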
2023-12-09 15:22:45 -06:00
94b27267fd 🐛 Fixes markdown formatting on LSP hover (#11253)

# Description
Hi! I was playing around and I fixed the formatting in the LSP hover.
I _only tested in VS Code on Windows_; if anyone is able, can you test
whether it works properly on nvim or Linux? I think markdown shouldn't
cause any problems.

The LSP meta issue is linked here just for reference: #10941

# User-Facing Changes
The LSP hover now renders markdown properly:

![image](https://github.com/nushell/nushell/assets/30557287/7e824331-d9b1-40dd-957f-da77a21e97a2)


2023-12-08 12:30:13 -06:00
d1390ac95b Fix overlay_use_main_not_exported hanging when an external spam command exists (#11261)
# Description
The `spam` command is provided by
[opensp](https://openjade.sourceforge.net/) which causes
`overlay_use_main_not_exported` to hang. `opensp` is pulled by
`gnome-control-center` on my system.

[opensp package
list](https://archlinux.org/packages/extra/x86_64/opensp/files/)

```
opensp 1.5.2-10 File List

Package has 224 files and 14 directories.

[Back to Package](https://archlinux.org/packages/extra/x86_64/opensp/)

    usr/
    usr/bin/
    usr/bin/nsgmls
    usr/bin/onsgmls
    usr/bin/osgmlnorm
    usr/bin/ospam
    usr/bin/ospcat
    usr/bin/ospent
    usr/bin/osx
    usr/bin/sgml2xml
    usr/bin/sgmlnorm
    usr/bin/spam
    ...snip...
```

`cargo test` output
```
...snip...
test shell::pipeline::commands::internal::unlet_variable_in_parent_scope ... ok
test shell::pipeline::commands::internal::unlet_env_variable ... ok
test shell::pipeline::doesnt_break_on_utf8 ... ok
test shell::run_export_extern ... ok
test shell::run_script_that_looks_like_module ... ok
test shell::pipeline::commands::internal::can_process_one_row_from_internal_and_pipes_it_to_stdin_of_external ... ok
test shell::pipeline::commands::internal::variable_scoping::access_variables_in_scopes ... ok
test shell::run_in_login_mode ... ok
test shell::run_in_interactive_mode ... ok
test shell::run_in_noninteractive_mode ... ok
test shell::run_in_not_login_mode ... ok
test shell::pipeline::commands::internal::subexpression_properly_redirects ... ok
test shell::pipeline::commands::internal::subexpression_handles_dot ... ok
test shell::pipeline::commands::internal::takes_rows_of_nu_value_strings_and_pipes_it_to_stdin_of_external ... ok
test overlays::overlay_use_main_not_exported has been running for over 60 seconds
```
# User-Facing Changes
N/A

# Tests + Formatting
Make sure you've run and fixed any issues with these commands:

- [x] `cargo fmt --all -- --check` to check standard code formatting
(`cargo fmt --all` applies these changes)
- [x] `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used`
to check that you're using the standard code style
- [x] `cargo test --workspace` to check that all tests pass (on Windows
make sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- [x] `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library



# After Submitting
N/A
2023-12-08 06:08:38 -06:00
d717e8faeb Add nu lib dirs default (#11248)
# Description

This PR is kind of two PRs in one because they were dependent on each
other.

PR1 -
3de58d4dc2
with update
7fcdb242d9
- This follows our mantra of having all defaults written in the nushell
Rust code, so that if you run without a config, you get the same
behavior as with the default config/env files. This sets NU_LIB_DIRS to
$nu.config-path/scripts and sets NU_PLUGIN_DIRS to
$nu.config-path/plugins.

PR2 -
0e8ac876fd
- The benchmarks have been broken for some time and we didn't notice.
This PR fixes that. It depends on PR1 because the benchmarks were
throwing errors: PWD needed to be set to a valid folder and `$nu` did
not exist, given how the benchmark was set up.

I've tested the benchmarks and they run without error now and I've also
launched nushell as `nu -n --no-std-lib` and the env vars exist.

closes #11236

# User-Facing Changes
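A rough sketch of what the described defaults look like (not from the
PR; the exact paths depend on the platform's config directory):

```nushell
# when launched without user config files, the defaults point into the
# config directory, e.g. on Linux:
$env.NU_LIB_DIRS      # e.g. [ ~/.config/nushell/scripts ]
$env.NU_PLUGIN_DIRS   # e.g. [ ~/.config/nushell/plugins ]
```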

2023-12-07 08:13:50 -06:00
a95a4505ef Convert Shellerror::GenericError to named fields (#11230)
# Description

Replace `.to_string()` used in `GenericError` with `.into()`, as
`.into()` seems more popular.

Replace `Vec::new()` used in `GenericError` with `vec![]`, as `vec![]`
seems more popular.

(There are so, so many.)
2023-12-07 00:40:03 +01:00
b03f1efac4 Upgrade lsp-server Dependency (#11252)
The lsp-server crate has been released, so it is now possible to depend
on this version rather than on the git dependency of the crate.
2023-12-06 17:19:03 -06:00
5d5088b5d5 Match ++= capabilities with ++ (#11130)
Allow `++=` to work in all situations `++` does, namely for appending
single elements: `$list ++= 1`.

Resolve #11087

# Description

Bring `++=` to parity with `++`.

# User-Facing Changes

It is now possible to do `$list ++= 1` (appending a single element).
Similarly, this can be done:

```Nushell
~> mut a = [1]
~> $a ++= 2
~> $a
╭───┬───╮
│ 0 │ 1 │
│ 1 │ 2 │
╰───┴───╯
```

# Tests + Formatting

Added two tests:

- `commands::assignment::append_assign::append_assign_single_element`
- `commands::assignment::append_assign::append_assign_to_single_element`
2023-12-07 05:46:37 +08:00
c1c73811d5 fix nu-std README (#11244)
related to
-
https://github.com/nushell/nushell/issues/10676#issuecomment-1842472941
from @suimong

# Description
The command in the `README.md` of `nu-std` should use `scope commands`
instead of `help commands`, which returns an empty list.
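
For illustration, here is one hypothetical way to inspect the std
commands via `scope commands` (the filter is made up here, not taken
from the README):

```nushell
# `scope commands` lists commands known to the engine, including std ones,
# whereas `help commands` returned an empty list in this context
scope commands | where name =~ '^std '
```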

2023-12-06 16:26:02 +01:00
51bf8d9f6a Remove unnecessary boxing of Stack::recursion_count (#11238) 2023-12-06 10:48:56 +02:00
858c93d2e5 Fix highlighting of spread subexpressions in records (#11202)

# Description

It turns out that I left a bug in
[#11144](https://github.com/nushell/nushell/pull/11144/), which
introduced a spread operator in record literals. When highlighting
subexpressions that are spread inside records, the spread operator and
the token before it were inserted twice. Currently, when you type
`{ ...() }`, this is what you'll see:


![image](https://github.com/nushell/nushell/assets/45539777/9a76647a-6bbe-426e-95bc-50becf2fa537)

With the PR, the behavior is as expected:


![image](https://github.com/nushell/nushell/assets/45539777/36bdab23-3252-4500-8317-51278da0e869)

I'm still not sure how `FlatShape` works; I just copied the existing
logic for flattening key-value pairs in records, so it's possible
there are still issues, but I haven't found any yet (I tried spreading
subexpressions, variables, and records).

# User-Facing Changes

Highlighting for subexpressions spread inside records should no longer
be screwed up.

# Tests + Formatting

Is there any way to test flattening/syntax highlighting?

2023-12-06 08:56:35 +08:00
31146a7591 Upgrading to polars 0.35 (#11241)
Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-12-05 18:09:34 -06:00
05d7d6d6ad Do not create help for wrapped command (#11235)
Pretty self-explanatory.  The commit is only one `if`.

Fix #11096
2023-12-05 13:04:36 -06:00
fb3350ebc3 Error on use path item1 item2, if item1 is not a module (#11183)
# Description
Fixes: #11143

# User-Facing Changes
Take the following as example:
```nushell
module foo { export def bar [] {}; export def baz [] {} }
```

`use foo bar baz` will now error:
```
❯ use foo c d
Error: nu::parser::wrong_import_pattern

  × Wrong import pattern structure.
   ╭─[entry #2:1:1]
 1 │ use foo c d
   ·           ┬
   ·           ╰── Trying to import something but the parent `c` is not a module, maybe you want to try `use <module> [<name1>, <name2>]`
   ╰────
```

# Tests + Formatting
Done
2023-12-05 11:38:45 +01:00
2ffe30ecf0 Respect non-zero exit code in subexpressions and blocks (#8984)
# Description

This PR changes the way we handle non-zero exit codes so that they
cause an early exit between `foo; bar`: if `foo` in the example has a
non-zero exit code, `bar` won't be run.

This also affects subexpressions.
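
A minimal illustration of the new behavior (hypothetical session;
`^false` stands in for any external command that exits non-zero):

```nushell
# before this change both statements ran; now the second one is skipped
# because the external `false` exits with a non-zero code
^false; print "this is no longer printed"

# the same early exit applies inside subexpressions
(^false; print "also skipped")
```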
2023-12-05 14:42:55 +08:00
f8dc3421b0 Bump actions-rust-lang/setup-rust-toolchain from 1.5.0 to 1.6.0 (#11223)
Bumps
[actions-rust-lang/setup-rust-toolchain](https://github.com/actions-rust-lang/setup-rust-toolchain)
from 1.5.0 to 1.6.0.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/actions-rust-lang/setup-rust-toolchain/blob/main/CHANGELOG.md">actions-rust-lang/setup-rust-toolchain's
changelog</a>.</em></p>
<blockquote>
<h2>[1.6.0] - 2023-12-04</h2>
<h3>Added</h3>
<ul>
<li>Allow disabling problem matchers (<a
href="https://redirect.github.com/actions-rust-lang/setup-rust-toolchain/issues/27">#27</a>)
This can be useful when having a matrix of jobs, that produce the same
errors.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c7e1de2846"><code>c7e1de2</code></a>
Update CHANGELOG.md</li>
<li><a
href="24c9dd087b"><code>24c9dd0</code></a>
Merge pull request <a
href="https://redirect.github.com/actions-rust-lang/setup-rust-toolchain/issues/24">#24</a>
from obi1kenobi/patch-1</li>
<li><a
href="74a4154991"><code>74a4154</code></a>
Merge pull request <a
href="https://redirect.github.com/actions-rust-lang/setup-rust-toolchain/issues/27">#27</a>
from oxideai/feature/config-matcher</li>
<li><a
href="84ba0c9d1b"><code>84ba0c9</code></a>
Update README</li>
<li><a
href="51173b3da4"><code>51173b3</code></a>
feature(matcher): allow disabling problem matcher</li>
<li><a
href="33678a48c0"><code>33678a4</code></a>
Add docs for the <code>cachekey</code> output to the README</li>
<li><a
href="317ed62323"><code>317ed62</code></a>
Update example workflow in readme</li>
<li><a
href="8cb8f77172"><code>8cb8f77</code></a>
Merge pull request <a
href="https://redirect.github.com/actions-rust-lang/setup-rust-toolchain/issues/23">#23</a>
from actions-rust-lang/dependabot/github_actions/actio...</li>
<li><a
href="1f541c5b05"><code>1f541c5</code></a>
Bump actions/checkout from 3 to 4</li>
<li>See full diff in <a
href="https://github.com/actions-rust-lang/setup-rust-toolchain/compare/v1.5.0...v1.6.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions-rust-lang/setup-rust-toolchain&package-manager=github_actions&previous-version=1.5.0&new-version=1.6.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-05 10:42:40 +08:00
c1a30ac60f Reduce code duplication in eval.rs and eval_const.rs (#11192) 2023-12-04 21:13:47 +02:00
fc06afd051 feat: Add default docs for aliases, generated from the command they point to (#10825) 2023-12-04 20:56:46 +02:00
c9aa6ba0f3 Add special error for calling metadata on $env and $nu (#11228)
Trying to call `metadata $env` or `metadata $nu` will throw an error:

```Nushell
~> metadata $nu                                                                                                                            
Error:   × Built-in variables `$env` and `$nu` have no metadata
   ╭─[entry #1:1:1]
 1 │ metadata $nu
   ·          ─┬─
   ·           ╰── no metadata available
   ╰────
```
2023-12-04 12:49:36 -06:00
f8c82588b6 Explicitly indicate duplicate flags (#11226)
# Description
This PR adds an explicit indication for duplicate flags, which helps
with debugging.

# User-Facing Changes
N/A

# Tests + Formatting
- [x] `cargo fmt --all -- --check` to check standard code formatting
(`cargo fmt --all` applies these changes)
- [x] `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used`
to check that you're using the standard code style
- [x] `cargo test --workspace` to check that all tests pass (on Windows
make sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- [x] `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library

# After Submitting
N/A
2023-12-04 22:06:27 +08:00
67eec92e76 Convert more ShellError variants to named fields (#11222)
# Description

Convert errors to named fields:
* NeedsPositiveValue
* MissingConfigValue
* UnsupportedConfigValue
* DowncastNotPossible
* NonUtf8Custom
* NonUtf8
* DidYouMeanCustom
* DidYouMean
* ReadingFile
* RemoveNotPossible
* ChangedModifiedTimeNotPossible
* ChangedAccessTimeNotPossible

Part of #10700
2023-12-04 10:19:32 +01:00
b227eea668 Add checks for ports (#11214)
# Description
This PR adds checks for ports. This fixes unexpected output similar to
the one in the comment
https://github.com/nushell/nushell/pull/11210#issuecomment-1837152357.

* before

```console
/data/source/nushell> port 65536 99999                                                                                        
41233
```

* after

```console
/data/source/nushell> port 65536 99999                                                                                             
Error: nu:🐚:cant_convert

  × Can't convert to u16.
   ╭─[entry #1:1:1]
 1 │ port 65536 99999
   ·      ──┬──
   ·        ╰── can't convert usize to u16
   ╰────
  help: out of range integral type conversion attempted (min: 0, max: 65535)
```

# User-Facing Changes
N/A

# Tests + Formatting
* [x] add `port_out_of_range` test

# After Submitting
N/A
2023-12-03 08:07:15 -06:00
58d002d469 expose argv[0] as $env.PROCESS_PATH (#11203)
closes #11059 

# Description
I'm not sure what the consensus was after discussing this on Discord,
so I'm creating a PR as suggested.

# User-Facing Changes
TBD
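
Based on the title, a hypothetical usage sketch (the actual value
depends on how nu was invoked):

```nushell
# $env.PROCESS_PATH exposes argv[0], i.e. the path used to launch the
# currently running nu binary
$env.PROCESS_PATH
# => /usr/bin/nu   (for example)
```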
# Tests + Formatting
TBD

# After Submitting
TBD
2023-12-02 11:36:02 -06:00
5d283755e3 Fix Option<&str> == Option<&String> w/ rust_decimal/rkyv feat (#11205)
Without this change, projects which depend on both nu-command and
rust_decimal's "rkyv" feature cause nu-command to fail to compile.

```toml
[dependencies]
nu-command = { path = "../nushell/crates/nu-command" }
rust_decimal = { version = "1", features = ["rkyv"] }
```

```console
error[E0277]: can't compare `std::option::Option<&str>` with `std::option::Option<&std::string::String>`
   --> nushell/crates/nu-command/src/filters/join.rs:367:35
    |
367 |         let k_shared = shared_key == Some(k);
    |                                   ^^ no implementation for `std::option::Option<&str> == std::option::Option<&std::string::String>`
    |
    = help: the trait `PartialEq<std::option::Option<&std::string::String>>` is not implemented for `std::option::Option<&str>`
    = help: the following other types implement trait `PartialEq<Rhs>`:
              <std::option::Option<Box<U>> as PartialEq<rkyv::niche::option_box::ArchivedOptionBox<T>>>
              <std::option::Option<T> as PartialEq>
              <std::option::Option<U> as PartialEq<rkyv::option::ArchivedOption<T>>>

For more information about this error, try `rustc --explain E0277`.
warning: `nu-command` (lib) generated 1 warning
error: could not compile `nu-command` (lib) due to previous error; 1 warning emitted
```
2023-12-02 18:19:15 +01:00
35e8db160d Fix get -i ignoring errors for only the first cellpath (#11213)
# Description
Fixes issue #11212 where only the first cellpath supplied to `get -i` is
treated as optional, and the rest of the cell paths are treated as
non-optional.
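
A hypothetical illustration of the fix (the example record and the
output shape are assumptions, not taken from the PR):

```nushell
# every cell path after `-i` is now treated as optional, so a missing
# path yields null instead of raising an error
{ foo: 1 } | get -i foo bar
# => [1, null]   (previously the missing `bar` path raised an error)
```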

# Tests
Added one test.
2023-12-02 11:01:08 -06:00
7d8df4ba9e Fix capacity overflow caused by large range of ports (#11210)

# Description

Try to fix a capacity overflow caused by a large range of ports.

```
$ port 1024 999999999999999999                                                                                 12/02/23 20:03:14 PM
thread 'main' panicked at 'capacity overflow', library/alloc/src/raw_vec.rs:524:5
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```

2023-12-02 09:17:14 -06:00
f36c055c50 Fix span of invalid range (#11207)

# Description

Try to improve the error message for an invalid range.

* before 

![Screenshot from 2023-12-02
08-45-23](https://github.com/nushell/nushell/assets/15247421/4d4e3533-b6c6-42c4-9f59-d4d30e4ad5c2)


* after

![Screenshot from 2023-12-02
13-18-34](https://github.com/nushell/nushell/assets/15247421/d380dced-4b60-4b1a-9992-9e0727e22054)


2023-12-02 07:33:05 -06:00
76bdda1178 deprecate std testing (#11151)
# Description
This PR deprecates the `std testing` module in favor of Nupm.
The plan is to simply hide the module from the user but still use it
internally when running the tests, so that
- users don't start to use this module and rather focus on Nupm
- devs don't have to install anything to run the tests locally; they can
just use `toolkit test stdlib`, for instance

The deprecation message will be very similar to
https://github.com/nushell/nushell/pull/11097

> **Note**
> To demonstrate that the removal of such a command from the exposed
modules of `std` will be transparent and not require the user to install
anything, I have it [prepared in a branch based on this
PR](https://github.com/amtoine/nushell/compare/deprecate-std-testing...amtoine:nushell:hide-std-testing)
> Running `toolkit test stdlib` will run the standard library tests
without an issue and yet `use std testing` won't work 👌

# User-Facing Changes
`std testing run-tests` will be removed in `0.90`

2023-12-02 13:15:47 +01:00
5c07e82fc0 Remove list<any> -> list<any> (#11137)
This should
- close https://github.com/nushell/nushell/issues/11134

# Description
This is a band-aid...
but it should address the issue in
https://github.com/nushell/nushell/issues/11134 until we have a better
long-term fix for this I/O types bug 😇

# User-Facing Changes
The following will now parse and run fine:
```nushell
def get-initial-commit []: nothing -> string {
    ^git rev-list HEAD | lines | last
}
```

2023-12-02 15:16:17 +08:00
6ea5bdcf47 Allow more types for input list (#11195)
`input list` now allows all types by using `into_string`.

Custom formatting logic for records was removed.

Allow ranges as an input type.

Also made the prompt check depend on the option, so `input list ""` will
have an empty prompt, while `input list` does not.

Resolve #11181
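
A couple of hypothetical invocations reflecting the description
(interactive rendering may differ):

```nushell
# ranges are now accepted as input (items are converted via `into string`)
1..5 | input list

# `input list ""` shows an empty prompt, while plain `input list` does not
[a b c] | input list ""
```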
2023-12-01 11:01:15 -06:00
15c7e1b725 Add OutOfBounds error (#11201)

# Description

This is a continuation of #11190, adding an `OutOfBounds` error. It
seems that `OutOfBounds` is more accurate than `InvalidRange`.


2023-12-01 08:56:06 -06:00
e5d9f405db add some tests for stor create (#11194)
# Description

This is a PR to start adding a few tests to the `stor` commands. It
refactors the `stor create` command so it's easier to write tests.

2023-11-30 15:00:47 -06:00
80881c75f9 When using redirection, if a command generates non-zero exit code, the script should stop running (#11191)
# Description
Fixes: #11153

To make sure scripts stop running on a non-zero exit code, we need to
invoke `might_consume_external_result` on
`PipelineData::ExternalStream`, so it can tell nushell whether this
command exits with a non-zero exit code.

This PR also adjusts some test cases.

# User-Facing Changes
```nushell
^false out> /dev/null; print "ok"
```

After this PR, it shouldn't print `ok`.

# Tests + Formatting
Done
2023-11-30 18:52:02 +01:00
250ee62e16 Add boundary check for str index-of (#11190)

# Description
This PR tries to fix #10774

# User-Facing Changes
* before

![Screenshot from 2023-11-30
20-18-56](https://github.com/nushell/nushell/assets/15247421/4559e114-0757-4a73-91e7-c7536a8ce5f1)

![Screenshot from 2023-11-30
20-19-23](https://github.com/nushell/nushell/assets/15247421/30ab1b5a-3ec3-48a8-ae76-65721b650fcf)

* after

![Screenshot from 2023-11-30
20-21-04](https://github.com/nushell/nushell/assets/15247421/f34fd276-dccf-4328-b9d2-bf368b5b3ae5)

![Screenshot from 2023-11-30
20-20-39](https://github.com/nushell/nushell/assets/15247421/653e47d8-8d68-4ef3-adef-0865179c4f9b)

2023-11-30 08:13:10 -06:00
54d73748e4 Remove file I/O from tests that don't need it (#11182)
# Description

This PR modifies command tests that wrote unnecessary JSON and CSV to
disk and then loaded it with `open`, by using nuon literals instead.

- Fixes #7189



# User-Facing Changes
None

# Tests + Formatting
This only affects existing tests, which still pass.
2023-11-29 23:21:34 +01:00
d08e254d16 Fix spans passed to external_completer (#11008)
close #10973
2023-11-29 23:17:06 +01:00
7f771babb7 Bump ureq from 2.8.0 to 2.9.1 (#11163) 2023-11-29 17:48:18 +00:00
b9b9eaaca5 Bump openssl from 0.10.59 to 0.10.60 (#11176) 2023-11-29 17:47:15 +00:00
0303d709e6 Spread operator in record literals (#11144)
Goes towards implementing #10598, which asks for a spread operator in
lists, in records, and when calling commands (continuation of #11006,
which only implements it in lists)

# Description
This PR is for adding a spread operator that can be used when building
records. Additional functionality can be added later.

Changes:

- Previously, the `Expr::Record` variant held `(Expression, Expression)`
pairs. It now holds instances of an enum `RecordItem` (the name isn't
amazing) that allows either a key-value mapping or a spread operator.
- `...` will be treated as the spread operator when it appears before
`$`, `{`, or `(` inside records (no whitespace allowed in between) (not
implemented yet)
- The error message for duplicate columns now includes the column name
itself, because if two spread records are involved in such an error, you
can't tell which field was duplicated from the spans alone

`...` will still be treated as a normal string outside records, and even
in records, it is not treated as a spread operator when not followed
immediately by a `$`, `{`, or `(`.

# User-Facing Changes
Users will be able to use `...` when building records.

```
> let rec = { x: 1, ...{ a: 2 } }
> $rec
╭───┬───╮
│ x │ 1 │
│ a │ 2 │
╰───┴───╯
> { foo: bar, ...$rec, baz: blah }
╭─────┬──────╮
│ foo │ bar  │
│ x   │ 1    │
│ a   │ 2    │
│ baz │ blah │
╰─────┴──────╯
```
If you want to update a field of a record, you'll have to use `merge`
instead:
```
> { ...$rec, x: 5 }
Error: nu:🐚:column_defined_twice

  × Record field or table column used twice: x
   ╭─[entry #2:1:1]
 1 │  { ...$rec, x: 5 }
   ·       ──┬─  ┬
   ·         │   ╰── field redefined here
   ·         ╰── field first defined here
   ╰────
> $rec | merge { x: 5 }
╭───┬───╮
│ x │ 5 │
│ a │ 2 │
╰───┴───╯
```

# Tests + Formatting

# After Submitting
2023-11-29 18:31:31 +01:00
0e1322e6d6 Forbid reserved variable names for function arguments (#11169)
Works for all arguments and flags. Because the signature parsing
doesn't give the spans, it flags the entire signature.

Also added a constant with reserved variable names.

Fix #11158.
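
A hypothetical example of what is now rejected (the exact reserved names
and error message are defined in the PR, not reproduced here):

```nushell
# declaring a parameter that shadows a reserved variable name, such as
# `env`, is now a parse error; because signature parsing lacks precise
# spans, the whole signature is flagged
def my-cmd [env] { }
```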
2023-11-29 18:29:07 +01:00
e290fa0e68 Add stor family of commands (#11170)
# Description

This PR adds the `stor` family of commands. These commands are meant to
create, open, insert, update, delete, and reset data in an in-memory
sqlite database. This is really an experiment to see how creatively we
can use an in-memory database.

```
Usage:
  > stor

Subcommands:
  stor create - Create a table in the in-memory sqlite database
  stor delete - Delete a table or specified rows in the in-memory sqlite database
  stor export - Export the in-memory sqlite database to a sqlite database file
  stor import - Import a sqlite database file into the in-memory sqlite database
  stor insert - Insert information into a specified table in the in-memory sqlite database
  stor open - Opens the in-memory sqlite database
  stor reset - Reset the in-memory database by dropping all tables
  stor update - Update information in a specified table in the in-memory sqlite database

Flags:
  -h, --help - Display the help message for this command

Input/output types:
  ╭─#─┬──input──┬─output─╮
  │ 0 │ nothing │ string │
  ╰───┴─────────┴────────╯
```

### Examples

## stor create
```nushell
❯ stor create --table-name nudb --columns {bool1: bool, int1: int, float1: float, str1: str, datetime1: datetime}
╭──────┬────────────────╮
│ nudb │ [list 0 items] │
╰──────┴────────────────╯
```
## stor insert
```nushell
❯ stor insert --table-name nudb --data-record {bool1: true, int1: 2, float1: 1.1, str1: fdncred, datetime1: 2023-04-17} 
╭──────┬───────────────╮
│ nudb │ [table 1 row] │
╰──────┴───────────────╯
```
## stor open
```nushell
❯ stor open | table -e 
╭──────┬────────────────────────────────────────────────────────────────────╮
│      │ ╭─#─┬id─┬bool1┬int1┬float1┬──str1───┬─────────datetime1──────────╮ │
│ nudb │ │ 0 │ 1 │   1 │  2 │ 1.10 │ fdncred │ 2023-04-17 00:00:00 +00:00 │ │
│      │ ╰───┴───┴─────┴────┴──────┴─────────┴────────────────────────────╯ │
╰──────┴────────────────────────────────────────────────────────────────────╯
```
## stor update
```nushell
❯ stor update --table-name nudb --update-record {str1: toby datetime1: 2021-04-17} --where-clause "bool1 = 1"
╭──────┬───────────────╮
│ nudb │ [table 1 row] │
╰──────┴───────────────╯
❯ stor open | table -e
╭──────┬─────────────────────────────────────────────────────────────────╮
│      │ ╭─#─┬id─┬bool1┬int1┬float1┬─str1─┬─────────datetime1──────────╮ │
│ nudb │ │ 0 │ 1 │   1 │  2 │ 1.10 │ toby │ 2021-04-17 00:00:00 +00:00 │ │
│      │ ╰───┴───┴─────┴────┴──────┴──────┴────────────────────────────╯ │
╰──────┴─────────────────────────────────────────────────────────────────╯
```
## insert another row
```nushell
❯ stor insert --table-name nudb --data-record {bool1: true, int1: 5, float1: 1.1, str1: fdncred, datetime1: 2023-04-17} 
╭──────┬────────────────╮
│ nudb │ [table 2 rows] │
╰──────┴────────────────╯
❯ stor open | table -e
╭──────┬────────────────────────────────────────────────────────────────────╮
│      │ ╭─#─┬id─┬bool1┬int1┬float1┬──str1───┬─────────datetime1──────────╮ │
│ nudb │ │ 0 │ 1 │   1 │  2 │ 1.10 │ toby    │ 2021-04-17 00:00:00 +00:00 │ │
│      │ │ 1 │ 2 │   1 │  5 │ 1.10 │ fdncred │ 2023-04-17 00:00:00 +00:00 │ │
│      │ ╰───┴───┴─────┴────┴──────┴─────────┴────────────────────────────╯ │
╰──────┴────────────────────────────────────────────────────────────────────╯
```
## stor delete (specific row(s))
```nushell
❯ stor delete --table-name nudb --where-clause "int1 == 5"
╭──────┬───────────────╮
│ nudb │ [table 1 row] │
╰──────┴───────────────╯
```
## insert multiple tables
```nushell
❯ stor create --table-name nudb1 --columns {bool1: bool, int1: int, float1: float, str1: str, datetime1: datetime}
╭───────┬────────────────╮
│ nudb  │ [table 1 row]  │
│ nudb1 │ [list 0 items] │
╰───────┴────────────────╯
❯ stor insert --table-name nudb1 --data-record {bool1: true, int1: 2, float1: 1.1, str1: fdncred, datetime1: 2023-04-17}
╭───────┬───────────────╮
│ nudb  │ [table 1 row] │
│ nudb1 │ [table 1 row] │
╰───────┴───────────────╯
❯ stor create --table-name nudb2 --columns {bool1: bool, int1: int, float1: float, str1: str, datetime1: datetime}
╭───────┬────────────────╮
│ nudb  │ [table 1 row]  │
│ nudb1 │ [table 1 row]  │
│ nudb2 │ [list 0 items] │
╰───────┴────────────────╯
❯ stor insert --table-name nudb2 --data-record {bool1: true, int1: 2, float1: 1.1, str1: fdncred, datetime1: 2023-04-17}
╭───────┬───────────────╮
│ nudb  │ [table 1 row] │
│ nudb1 │ [table 1 row] │
│ nudb2 │ [table 1 row] │
╰───────┴───────────────╯
```
## stor delete (specific table)
```nushell
❯ stor delete --table-name nudb1
╭───────┬───────────────╮
│ nudb  │ [table 1 row] │
│ nudb2 │ [table 1 row] │
╰───────┴───────────────╯
```
## stor reset (all tables are deleted)
```nushell
❯ stor reset
```
## stor export
```nushell
❯ stor export --file-name nudb.sqlite3
╭──────┬───────────────╮
│ nudb │ [table 1 row] │
╰──────┴───────────────╯
❯ open nudb.sqlite3 | table -e
╭──────┬────────────────────────────────────────────────────────────────────╮
│      │ ╭─#─┬id─┬bool1┬int1┬float1┬──str1───┬─────────datetime1──────────╮ │
│ nudb │ │ 0 │ 1 │   1 │  5 │ 1.10 │ fdncred │ 2023-04-17 00:00:00 +00:00 │ │
│      │ ╰───┴───┴─────┴────┴──────┴─────────┴────────────────────────────╯ │
╰──────┴────────────────────────────────────────────────────────────────────╯
❯ open nudb.sqlite3 | schema | table -e
╭────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│        │ ╭──────┬──────────────────────────────────────────────────────────────────────────────────────────────────╮ │
│ tables │ │      │ ╭───────────────┬──────────────────────────────────────────────────────────────────────────────╮ │ │
│        │ │ nudb │ │               │ ╭─#─┬─cid─┬───name────┬─────type─────┬─notnull─┬───────default────────┬─pk─╮ │ │ │
│        │ │      │ │ columns       │ │ 0 │ 0   │ id        │ INTEGER      │ 1       │                      │ 1  │ │ │ │
│        │ │      │ │               │ │ 1 │ 1   │ bool1     │ BOOLEAN      │ 0       │                      │ 0  │ │ │ │
│        │ │      │ │               │ │ 2 │ 2   │ int1      │ INTEGER      │ 0       │                      │ 0  │ │ │ │
│        │ │      │ │               │ │ 3 │ 3   │ float1    │ REAL         │ 0       │                      │ 0  │ │ │ │
│        │ │      │ │               │ │ 4 │ 4   │ str1      │ VARCHAR(255) │ 0       │                      │ 0  │ │ │ │
│        │ │      │ │               │ │ 5 │ 5   │ datetime1 │ DATETIME     │ 0       │ STRFTIME('%Y-%m-%d   │ 0  │ │ │ │
│        │ │      │ │               │ │   │     │           │              │         │ %H:%M:%f', 'NOW')    │    │ │ │ │
│        │ │      │ │               │ ╰─#─┴─cid─┴───name────┴─────type─────┴─notnull─┴───────default────────┴─pk─╯ │ │ │
│        │ │      │ │ constraints   │ [list 0 items]                                                               │ │ │
│        │ │      │ │ foreign_keys  │ [list 0 items]                                                               │ │ │
│        │ │      │ │ indexes       │ [list 0 items]                                                               │ │ │
│        │ │      │ ╰───────────────┴──────────────────────────────────────────────────────────────────────────────╯ │ │
│        │ ╰──────┴──────────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
```
## Using with `query db`
```nushell
❯ stor open | query db "select * from nudb"
╭─#─┬id─┬bool1┬int1┬float1┬──str1───┬─────────datetime1──────────╮
│ 0 │ 1 │   1 │  5 │ 1.10 │ fdncred │ 2023-04-17 00:00:00 +00:00 │
╰───┴───┴─────┴────┴──────┴─────────┴────────────────────────────╯
```
## stor import
```nushell
❯ stor open
# note, nothing is returned. there is nothing in memory, atm.
❯ stor import --file-name nudb.sqlite3
╭──────┬───────────────╮
│ nudb │ [table 1 row] │
╰──────┴───────────────╯
❯ stor open | table -e 
╭──────┬────────────────────────────────────────────────────────────────────╮
│      │ ╭─#─┬id─┬bool1┬int1┬float1┬──str1───┬─────────datetime1──────────╮ │
│ nudb │ │ 0 │ 1 │   1 │  5 │ 1.10 │ fdncred │ 2023-04-17 00:00:00 +00:00 │ │
│      │ ╰───┴───┴─────┴────┴──────┴─────────┴────────────────────────────╯ │
╰──────┴────────────────────────────────────────────────────────────────────╯
```

TODO:
- [x] `stor export` - Export a fully formed sqlite db file. 
- [x] `stor import` - Imports a specified sqlite db file.
- [x] Perhaps feature-gate it with the sqlite feature
- [x] Update `query db` to work with the in-memory database
- [x] Remove `open --in-memory`

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-29 08:02:46 -08:00
112306aab5 Revert "Adding support for Polars structs" (#11171)
Reverts nushell/nushell#10943

The current implementation of `arr_to_value` is unsound, as it allows
casts of arbitrary data to arbitrary types without being marked
`unsafe`.
The full safety requirements for performing both the cast and the following
unchecked access are not clear enough that simply changing `fn
arr_to_value` to `unsafe fn arr_to_value` could be blessed without
further investigation.

cc @ayax79
2023-11-29 16:33:27 +01:00
98952082ae Build nu-protocol docs with all features enabled (#11180)
# Description
Currently, `PluginSignature` does not appear on
[`docs.rs`](https://docs.rs/nu-protocol/latest/nu_protocol/index.html),
since it is behind the `plugin` feature which is not enabled by default.
This PR adds [metadata](https://docs.rs/about/metadata) to the
Cargo.toml to get `docs.rs` to build docs with all features enabled.
2023-11-29 16:17:22 +01:00
8386bc0919 Convert more ShellError variants to named fields (#11173)
# Description

Convert these ShellError variants to named fields:
* CreateNotPossible
* MoveNotPossibleSingle
* DirectoryNotFoundCustom
* DirectoryNotFound
* NotADirectory
* OutOfMemoryError
* PermissionDeniedError
* IOErrorSpanned
* IOError
* IOInterrupted

Also place the `span` field of `DirectoryNotFound` last to match other
errors.

Part of #10700 (almost half done!)

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-28 06:43:51 -06:00
182b0ab4fb add echo_env_mixed testbin to reduce windows only tests (#11172)
# Description
We have seen some test cases which require output to both
stdout and stderr, especially in redirection scenarios.

This pr introduces a new echo_env_mixed testbin, so we can
have fewer tests which only run on Windows.

# User-Facing Changes
NaN

# Tests + Formatting
NaN

# After Submitting
NaN
2023-11-28 06:42:35 -06:00
fa83458a6d Add metadata to some filters (#11160)

# Description
This PR preserves metadata when running some filters. As discussed on
Discord, that helps when running, for example, `ls | reject modified`,
because it keeps colouring and links.
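A quick way to check that metadata survives such a filter (a sketch; `metadata` is only used here to inspect the stream's metadata, and the expectation is that the source information is still present after `reject`):
```nushell
# before this change the metadata was dropped by `reject`;
# afterwards the stream should still report `ls` as its source
ls | reject modified | metadata
```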

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-27 12:16:34 -06:00
54398f9546 Bump sysinfo to 0.29.11 (#11166)
I discovered [a condition in which `sysinfo` would
segfault](https://github.com/GuillaumeGomez/sysinfo/issues/1154),
thereby causing nushell to segfault when the `sys` command was run
(perhaps the cause of
[this](https://github.com/nushell/nushell/issues/10987)). The segfault
was fixed in `sysinfo`, I'm just bumping the version here.
2023-11-27 09:37:47 -06:00
077d1c8125 Support o>>, e>>, o+e>> to append output to an external file (#10764)
# Description
Close: #10278

This pr introduces `o>>`, `e>>`, `o+e>>` to allow redirection to append
to a file.
Examples:
```nushell
echo abc o>> a.txt
echo abc o>> a.txt
cat asdf e>> a.txt
cat asdf e>> a.txt
cat asdf o+e>> a.txt
```

~~TODO:~~
~~1. currently internal commands with `o+e>` redirect to a variable is
broken: `let x = "a.txt"; echo abc o+e> $x`, not sure when it was
introduced...~~
~~2. redirect stdout and stderr with append mode doesn't supported yet:
`cat asdf o>>a.txt e>>b.ext`~~

~~For these 2 items, I'd like to fix them in different prs.~~
Already done in this pr
2023-11-27 07:52:39 -06:00
1ff8c2d81d Cp target expansion (#11152)
# Description
This PR addresses an issue with `cp` brought up on
[discord](https://discord.com/channels/601130461678272522/614593951969574961/1177669443917189130)
where the target of `cp` is not correctly expanded.
If one has a directory `test` with a file `file.txt` in it, then the
following command (on one line or inside a `do` block):
```nu
cd test; let file = 'copy.txt'; cp file.txt $file
```
will create `copy.txt` in `.` instead of in `test`. This happens
because the target of `cp` is a variable, which, unlike a string
literal, is not expanded.

# User-Facing Changes
`cp` will now correctly expand relative target paths

# Tests + Formatting

# After Submitting
2023-11-25 09:42:20 -06:00
d77f1753c2 add shape ExternalResolved to show found externals via syntax highlighting in the repl (#11135)
# Description

This PR enables a new feature that shows which externals are found in
your path via the syntax highlighter as you type.

![external_resolved](https://github.com/nushell/nushell/assets/343840/e5fa91f0-6fac-485c-8afc-5711fc0ed9bc)

This idea could use some improvement: it could cache the items in your
path and, on some trigger, expire that cache and create a new one. Right
now, all it does is call the `which` crate on every character you type.
This could be problematic if you have hundreds of paths in your PATH or
if some of your paths point to extraordinarily slow file
systems. WSL pointing to Windows comes to mind. Either way, I've thrown
it up here for people to try and provide feedback. I think the novelty
of showing what is valid and what isn't is pretty cool. I believe
fish-shell also does this, IIRC.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-25 09:42:05 -06:00
85c6047b71 fix the link to the nu_scripts in std clip deprecation (#11150)
this is just a simple fix for the link to the `nu_scripts` introduced in
https://github.com/nushell/nushell/pull/11097 😌
2023-11-24 19:03:07 +01:00
d37893cca0 Add more descriptive error message when passing list to from_csv (#10962)

# Description
Added a check that catches a List passed to `from csv` early and prints a more
helpful error message. This fixes #10081. A similar message might be
useful for other `from_*` calls, but I'm not sure whether any other
converters accept a List as input.
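A sketch of the kind of input this targets (the file name is hypothetical, and the exact wording of the new message is not reproduced here):
```nushell
# `lines` produces a list; feeding that into `from csv` now yields a
# descriptive error instead of a generic type mismatch
open --raw data.csv | lines | from csv
```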

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-24 07:45:01 -06:00
95ac436d26 Fix release and nightly build workflow (#11146)
Fix release and nightly build workflow:
https://github.com/nushell/nightly/issues/15
I think the `mv` command in v0.87.1 still has some bugs; when downgrading
to v0.86 everything works as expected.
2023-11-24 09:10:39 +08:00
c2a46070aa Exit the release job if creating binary package failed (#11145)
Try to exit the job if creating binary package failed, related issue:
https://github.com/nushell/nightly/issues/15
2023-11-24 07:47:59 +08:00
57808ca7cc Redirect: support redirect stderr with piping stdout to next commands. (#10851)
# Description
Fixes: #10271

Given the following script:
```shell
# test.sh
echo aaaaa
echo bbbbb 1>&2
echo cc
```

This pr makes the following command possible:
```nushell
bash test.sh err> /dev/null | lines | each {|line| $line | str length}
```


## General idea behind the change:
When nushell redirects stderr messages to an external file:
1. it takes the stdout of the external stream and passes it to the next
command, so it won't block the next pipeline command from running.
2. the corresponding stderr stream is handled by the `save` command.

These two streams are handled separately, so we need to delegate a
thread to the `save` command, or else we risk hanging nushell;
we have met a similar issue before: #5625.

### One case to consider
What if we fail to save to the external file (e.g. we don't have
permission to save to it)?
In this case nushell will just print a warning message and won't stop
the following scripts from running.

# User-Facing Changes
## Before
```nushell
❯ bash test2.sh err> /dev/null | lines | each {|line| $line | str length}
aaaaa
cc
```

## After
```nushell
❯ bash test2.sh err> /dev/null | lines | each {|line| $line | str length}
╭───┬───╮
│ 0 │ 5 │
│ 1 │ 2 │
╰───┴───╯
```

BTW, after this pr the following commands remain impossible; it's also
important to make sure that the implementation doesn't introduce too
much cost:
```nushell
❯ echo a e> a.txt e> a.txt
Error:   × Can't make stderr redirection twice
   ╭─[entry #1:1:1]
 1 │ echo a e> a.txt e> a.txt
   ·                 ─┬
   ·                  ╰── try to remove one
   ╰────

❯ echo a o> a.txt o> a.txt
Error:   × Can't make stdout redirection twice
   ╭─[entry #2:1:1]
 1 │ echo a o> a.txt o> a.txt
   ·                 ─┬
   ·                  ╰── try to remove one
   ╰────
```
2023-11-23 10:11:00 +08:00
6cfe35eb7e enable to pass switch values dynamically (#11057)
# Description
Closes: #7260 

About the change:
When we make an internal call and meet a `switch` (Flag.arg is None),
nushell will check whether the switch is called like `--xyz=false`; if
so, `parse_long_flag` will return the corresponding value.

# User-Facing Changes
So after the pr, the following would be possible:
```nushell
def build-imp [--push, --release] {
    echo $"Doing a build with push: ($push) and release: ($release)"
}
def build [--push, --release] {
    build-imp --push=$push --release=$release
}

build --push --release=false
build --push=false --release=true
build --push=false --release=false
build --push --release
build
```

# Tests + Formatting
Done

# After Submitting
Needs to submit a doc update, mentioned about the difference between
`def a [--x] {}` and `def a [--x: bool] {}`
2023-11-23 06:57:37 +08:00
b2734db015 Move more commands to opaque Record type (#11122)
# Description

Further work towards the goal that we can make `Record`'s field private
and experiment with different internal representations

## Details
- Use inplace record iter in `nu-command/math/utils`
  - Guarantee that existing allocation can be reused
- Use proper record iterators in `path join`
- Remove unnecessary hashmap in `path join`
  - Should minimally reduce the overhead
- Unzip records in `nu-command`
- Refactor `query web` plugin to use record APIs
- Use `Record::into_values` for `values` command
- Use `Record::columns()` in `join` instead.
  - Potential minor pessimisation
  - Not the hot value path
- Use sane `Record` iters in example `Debug` impl
- Avoid layout assumption in `nu-cmd-extra/roll/mod`
  - Potential minor pessimisation
  - relegated to `extra`; changing the representation may otherwise break this op.
- Use record api in `rotate`
  - Minor risk that this surfaces some existing invalid behavior as panics, as we now validate column/value lengths
  - `extra` so things are unstable
- Remove unnecessary references in `rotate`
  - Bonus cleanup
# User-Facing Changes
None functional, minor potential differences in runtime. You win some,
you lose some.

# Tests + Formatting
Relying on existing tests
2023-11-22 23:48:48 +01:00
823e578c46 Spread operator for list literals (#11006) 2023-11-22 23:10:08 +02:00
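This entry has no body, so here is a brief illustrative sketch of the list spread syntax named in the title (based on #11006; it is not part of the original PR description):
```nushell
# `...` splices a list's elements into a surrounding list literal
let middle = [2 3]
[1 ...$middle 4]   # => [1, 2, 3, 4]
```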
95a745e622 deprecate std clip (#11097)
related to 
- https://github.com/nushell/nushell/issues/11041
- https://github.com/nushell/nu_scripts/pull/674

cc/ @FMotalleb 

# Description
reading the [frontpage of the standard
library](https://github.com/nushell/nushell/blob/main/crates/nu-std/README.md#--welcome-to-the-standard-library-of-nushell--)
and according to the last Nushell meeting, it has been agreed that `std
clip` does not belong to the standard library 😮
- it is not written in pure Nushell and requires external dependencies
which might not even work properly as in
https://github.com/nushell/nushell/issues/11041
- it is not a building block to build more complex applications

this PR deprecates the `std clip` command in favor of [`modules/system
clip`](https://github.com/nushell/nu_scripts/pull/674) for now.

the `std clip` command will be removed in Nushell 0.89.

# User-Facing Changes
the deprecation warning:

![std-clip-deprecation](https://github.com/nushell/nushell/assets/44101798/84bbdf3c-178c-4191-b0bf-9b1b25c229a2)
> **Note**
> the link has been changed to the `nu_scripts` in fa6c17da0 according
to the review comments

# Tests + Formatting

# After Submitting
this will have to be mentioned in the next release notes, namely the
slight differences between the two commands.
2023-11-22 18:26:12 +01:00
776df7cd93 Convert PluginFailedToDecode to named fields (#11126)
# Description

Part of #10700

# User-Facing Changes

None

# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-22 12:56:04 +01:00
d5677625a7 Add is-terminal to determine if stdin/out/err are a terminal (#10970)
# Description

I'm not sure if "is-terminal" is the best name for this command as there
is also "term size". Uses
[`is_terminal()`](https://doc.rust-lang.org/stable/std/io/trait.IsTerminal.html#tymethod.is_terminal)
which is cross-platform.

Possible alternative names:
* `term is-tty --stdout`
* `term is-tty stdout`
* `term is-terminal stdout`

If multiple streams are provided an error is returned. The error span
covers all arguments as the incompatible one is not known. This may be
new?

Fixes #10517

# User-Facing Changes

* Add `is-terminal` to check if stdin, stdout, or stderr are a terminal
(TTY)
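A minimal usage sketch (switch names taken from this PR's description; treat it as illustrative rather than the command's final documentation):
```nushell
# returns true when stdout is attached to a terminal rather than redirected
if (is-terminal --stdout) {
    "running interactively"
} else {
    "output is piped or redirected"
}
```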

# Tests + Formatting

The nu tests always redirect stdin, stdout, and stderr so a positive
test case is not possible without extra work

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

The new command will be added automatically

---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2023-11-21 20:48:39 -06:00
64288b4350 Convert PluginFailedToEncode to named fields (#11125)
# Description

Part of #10700

# User-Facing Changes

None

# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-21 17:38:58 -06:00
a42fd3611a Convert PluginFailedToLoad to named fields (#11124)
# Description

Part of #10700

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-21 17:30:52 -06:00
e36f69bf3c Convert FileNotFoundCustom to named fields (#11123)
# Description

Part of #10700

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-21 17:30:21 -06:00
5ad7b8f029 Fix toolkit to run workspace on 'check pr' (issue #10906) (#11112)
- Fixes #10906 

No rust changes, just toolkit.nu. When `check pr` is run, it should now
run with the `--workspace` argument so all tests are run.
2023-11-21 18:19:35 +01:00
ffb80b8873 Curate developer documentation in tree (#11052)
This is still work in progress. But let's start somewhere.

Goal for this being in the repo is to make sure we update it more
frequently than the
https://github.com/nushell/nushell.github.io/tree/main/contributor-book


- Add folder for in-repo developer documentation
- Move PLATFORM_SUPPORT into devdocs
- Move our rust style to devdocs
- Use nushell code formatting in CONTRIBUTING
- Add FAQ file for developers with example questions
- Describe error handling best practices
2023-11-21 18:12:00 +01:00
177e800a07 Use record API in describe --detailed (#11075)
For the new stuff from #10795

Added helper to contract `{"type": ...}` records
Use inplace operations if possible for records/lists

Referencing #11027
2023-11-21 17:49:23 +01:00
a7e8970383 Upgrade Nu and script for release workflow (#11121)
Upgrade Nu and script for release workflow
2023-11-21 06:40:01 -06:00
a324a50bb7 Convert FileNotFound to named fields (#11120)
# Description

Part of #10700

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-21 17:08:10 +08:00
0578cf85ac Convert ShellError::AliasNotFound to named fields (#11118)
# Description

Part of #10700

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-21 16:24:08 +08:00
1b54dd5418 Remove ShellError::FlagNotFound (#11119)
# Description

`ShellError::FlagNotFound` had a note that said it may be removable so
this PR removes it instead of updating it to named fields per #10700

I can't see this error being used since it was introduced with #4364. I
can't find why or where it was used before that date, though. There was
a large merge with that PR but I can't penetrate the secrets of git to
find out where its earlier history went.

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-21 16:23:09 +08:00
8ad5d8bb6a Bump procfs to 0.16.0 (#11115)
Fix the breaking changes.
Get's rid of some outdated transitive dependencies.
Sadly we need to expose more of `procfs` to `nu-command` based on how
the features of `nu-system` are exposed right now.

Conditional compilation/dependencies from hell included

Supersedes #11101
2023-11-20 21:22:35 +01:00
869b01205c Bump uuid from 1.5.0 to 1.6.0 (#11104) 2023-11-20 19:38:41 +00:00
f5b2f5a9ee Bump winreg from 0.51.0 to 0.52.0 (#11102) 2023-11-20 19:35:21 +00:00
adfa4d00c0 Bump version to 0.87.2 (#11114)
Based on the hotfix
https://github.com/nushell/nushell/releases/tag/0.87.1 use this to
disambiguate
2023-11-20 20:31:10 +01:00
12effd9b4e Refactor Value cell path functions to fix bugs (#11066)
# Description
Slightly refactors the cell path functions (`insert_data_at_cell_path`,
etc.) for `Value` to fix a few bugs and ensure consistent behavior.
Namely, case (in)sensitivity now applies to lazy records just like it
does for regular `Records`. Also, the insert behavior of `insert` and
`upsert` now match, alongside fixing a few related bugs described below.
Otherwise, a few places were changed to use the `Record` API.

# Tests
Added tests for two bugs:
- `{a: {}} | insert a.b.c 0`: before this PR, doesn't create the
innermost record `c`.
- `{table: [[col]; [{a: 1}], [{a: 1}]]} | insert table.col.b 2`: before
this PR, doesn't add the field `b: 2` to each row.
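For illustration, a sketch of the expected behaviour for the two cases above after this change (result shapes abbreviated):
```nushell
{a: {}} | insert a.b.c 0
# => {a: {b: {c: 0}}}   (the intermediate records are now created)

{table: [[col]; [{a: 1}], [{a: 1}]]} | insert table.col.b 2
# => each row's `col` record now contains both `a: 1` and `b: 2`
```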
2023-11-19 21:46:41 +01:00
c26fca7419 Add Argument::span() and Call::arguments_span() (#10983)
# Description

These make it easy to make a Span that covers an entire argument and the
span of all arguments in a Call.

Call::arguments_span() is useful for errors where a command may accept
arguments or the pipeline, but not both.

Argument::span() is useful for errors where an argument is incompatible
with one or more other arguments.

In particular, I wish to use this to create an error for an
implementation of #9563 that either allows arguments to set limits:

```nushell
limits set RLIMIT_NOFILE --soft 255 --hard 1024
```

Or pipeline:

```nushell
{name: RLIMIT_NOFILE, soft: 255} | limits set
```

But not both:

```
❯ [{name: RLIMIT_NOFILE, soft: 255, hard: 1024}] | limits set AS --soft 5 --hard 5
Error: nu::shell::incompatible_parameters

  × Incompatible parameters.
   ╭─[source:1:1]
 1 │ [{name: RLIMIT_NOFILE, soft: 255, hard: 1024}] | limits set AS --soft 5 --hard 5
   · ───────────────────────┬──────────────────────              ──────────┬─────────
   ·                        │                                              ╰── or arguments, not both
   ·                        ╰── Supply either pipeline
   ╰────
```

# User-Facing Changes

Only nushell Command API changes
2023-11-19 21:43:56 +01:00
da59dfe7d2 Convert ShellError::NetworkFailure to named fields (#11093)
# Description

Part of #10700
2023-11-19 21:32:11 +01:00
08715e6308 Convert ShellError::CommandNotFound to named fields (#11094)
# Description

Part of #10700
2023-11-19 21:31:28 +01:00
07d7899a97 remove def-env and export def-env (#10999)
follow-up to
- https://github.com/nushell/nushell/pull/10715

> **Important**
> wait for between 0.87 and 0.88 to land this

# Description
it's time for removal again 😋 
this PR removes `def-env` and `export def-env` in favor of `def --env`

# User-Facing Changes
`def-env` and `export def-env` will not be found anymore.
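A migration sketch (the definition itself is only a hypothetical example):
```nushell
# before (no longer parses):
# def-env cd-up [] { cd .. }

# after:
def --env cd-up [] { cd .. }
```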

# Tests + Formatting

# After Submitting
2023-11-19 23:25:09 +08:00
494a5a5286 Add mktemp command (#11005)
closes #10845 

I've opened this a little prematurely to get some questions answered
before I clean up the code.

As I started trying to better understand GNU's `mktemp`, I've realized it's
kind of peculiar, and we might want to change its behavior when introducing
it to nushell.

#### quiet and dry run

Does it make sense to keep the `quiet` and `dry_run` flags? I don't
think so. The GNU documentation says this about the dry run flag "Using
the output of this command to create a new file is inherently unsafe, as
there is a window of time between generating the name and using it where
another process can create an object by the same name." So yeah why keep
it? As far as quiet goes, does it make sense to silence the errors in
nushell?

#### other confusing flags

According to the [gnu
docs](https://www.gnu.org/software/coreutils/manual/html_node/mktemp-invocation.html),
the `-t` flag is deprecated and `-p`/`--tmpdir` are the same flag,
with the only difference being that `--tmpdir` takes an optional path. Given
that, I've broken `-p` away from `--tmpdir`. Now there is one
switch `--tmpdir`/`-t` and one named param `--tmpdir-path`/`-p`.

GNU mktemp
```
  -p DIR, --tmpdir[=DIR]  interpret TEMPLATE relative to DIR; if DIR is not
                        specified, use $TMPDIR if set, else /tmp.  With
                        this option, TEMPLATE must not be an absolute name;
                        unlike with -t, TEMPLATE may contain slashes, but
                        mktemp creates only the final component
  -t                  interpret TEMPLATE as a single file name component,
                        relative to a directory: $TMPDIR, if set; else the
                        directory specified via -p; else /tmp [deprecated]

```
to
nushell mktemp
```
  -p, --tmpdir-path <Filepath> # named param, must provide a path
  -t, --tmpdir                 # a switch
```
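A rough usage sketch based on the flags listed above (exact defaults, template handling, and output will vary, and may still change):
```nushell
mktemp --tmpdir                 # create a temp file under $TMPDIR (or the system temp dir)
mktemp --tmpdir-path /tmp       # create a temp file under an explicitly chosen directory
```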

Is this a terrible idea?

What should I do?

---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2023-11-17 19:30:53 -06:00
f41c93b2d3 Apply nightly clippy fixes (#11083)

# Description
Clippy fixes for rust 1.76.0-nightly

# User-Facing Changes

N/A

# Tests + Formatting

# After Submitting
2023-11-17 09:15:55 -06:00
5063e01c12 std: add cross platform null-device name (#11070)
# Description
We had a discord discussion about a cross-platform `null-device`:
https://discord.com/channels/601130461678272522/988303282931912704/1165966467095875624

I want the same feature too... and I think it's good enough to add it
to `nu-std`.

Unlike https://github.com/nushell/nu_scripts/pull/649/files, I
think naming it `null-device` (the name comes from
[wiki](https://en.wikipedia.org/wiki/Null_device)) is better than
`null-stream`.
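A rough sketch of how such a helper might be used; the exact command name and module path that land in `nu-std` may differ, and the log file is hypothetical:
```nushell
use std *
# send an external command's stderr to the platform's null device
# (/dev/null on Unix, NUL on Windows)
let null = (null-device)
cat noisy.log err> $null
```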
2023-11-17 14:49:07 +01:00
d1137cc700 Send only absolute paths to uu_cp (#11080)
# Description
Fixes https://github.com/nushell/nushell/issues/10832

Replaces: https://github.com/nushell/nushell/pull/10843
2023-11-17 07:30:57 +08:00
3966c0a9fd Fix rm path handling (#11064)
# Description
Fixes issue #11061 where `rm` fails to find a file after a `cd`. It
looks like the new glob functions do not return absolute file paths
which we forgot to account for.

# Tests
Added a test (fails on current main, but passes with this PR).

---------

Co-authored-by: Jakub Žádník <kubouch@gmail.com>
2023-11-17 07:30:15 +08:00
dbdb1f6600 remove the unfold command (#10773)
follow-up to:
- https://github.com/nushell/nushell/pull/10771

> **Important**
> wait for between 0.87 and 0.88 to land this

# Description
after deprecation comes the removal... this PR removes `unfold` in favor
of `generate` 🥳

# User-Facing Changes
users should use `generate` now, `unfold` will stop working.
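A sketch of the replacement, assuming `generate` keeps `unfold`'s `<initial> <closure>` signature from when it was introduced (the closure returns a record with optional `out` and `next` fields; returning nothing stops generation):
```nushell
# doubles the state until it exceeds 8, emitting each value along the way
generate 1 {|n| if $n <= 8 { {out: $n, next: ($n * 2)} } }
# => [1, 2, 4, 8]
```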

# Tests + Formatting

# After Submitting
2023-11-17 06:50:20 +08:00
84cdc0d521 remove size command in favor of str stats (#10784)
follow-up to
- https://github.com/nushell/nushell/pull/10798

> **Important**
> wait for between 0.87 and 0.88 to land this

# Description
once again, after deprecation comes removal 😌 

# User-Facing Changes
`size` is now removed and `str stats` should be used
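A quick illustration of the replacement command:
```nushell
"hello world" | str stats
# returns a record of counts (lines, words, bytes, chars, ...)
```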

# Tests + Formatting

# After Submitting
2023-11-17 06:49:19 +08:00
ab59dab129 remove --not from glob (#10839)
follow-up to
- https://github.com/nushell/nushell/pull/10827

> **Important**  
> wait for between 0.87 and 0.88 to land this

# Description
after deprecation comes removal... this PR removes `glob --not` in favor
of `glob --exclude`.

# User-Facing Changes
`glob --not` will stop working.
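A migration sketch (the patterns are only illustrative):
```nushell
# previously: glob '**/*.rs' --not ['**/target/**']
glob '**/*.rs' --exclude ['**/target/**']
```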

# Tests + Formatting

# After Submitting
i didn't find any use of `glob --not` in the `nu_scripts` so no update
required there 👍
2023-11-17 06:46:15 +08:00
e0c8a3d14c remove extern-wrapped and export extern-wrapped (#11000)
follow-up to
- https://github.com/nushell/nushell/pull/10716

> **Important**
> wait for between 0.87 and 0.88 to land this

# Description
it's time for removal again 😋 
this PR removes `extern-wrapped` and `export extern-wrapped` in favor of
`def --wrapped`

# User-Facing Changes
`extern-wrapped` and `export extern-wrapped` will not be found anymore.
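A migration sketch (the definition is only a hypothetical example):
```nushell
# before (no longer parses):
# extern-wrapped my-echo [...rest] { print $rest }

# after:
def --wrapped my-echo [...rest] { print $rest }
```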

# Tests + Formatting

# After Submitting
2023-11-17 06:44:28 +08:00
e93e51d672 bump rust-toolchain to 1.72.1 (#11079)
# Description

This PR follows our process of staying 2 releases behind rust. 1.74.0
was released today so we update to 1.72.1.

Reference https://releases.rs/

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-16 15:14:45 -06:00
4205edbc70 Fix the output type for 'view files' (#11077)

# Description

# User-Facing Changes

# Tests + Formatting

# After Submitting

Co-authored-by: JT <547158+jntrnr@users.noreply.github.com>
2023-11-16 11:53:51 -06:00
80bee40807 optimize/clean up a few of the table changes (#11076)
# Description

@sholderbach pointed out some places that I could help improve the code
in the table command changes. This PR tries to implement those.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-16 11:37:46 -06:00
461837773b correct table example syntax (#11074)
# Description

Correct an example that had old syntax.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-16 08:20:52 -06:00
52d4259f58 add "default" table theme (#11072)
# Description

This PR fixes a minor bug that prevented this command from running.
```nushell
table --list | each {|r| print ($r); print (ls | first 3 | table --theme $r)}
```
Here's the output now of the first few themes.

![image](https://github.com/nushell/nushell/assets/343840/21bc8942-5106-4b6a-8905-e90d6cb9a153)

The command failed because "default" wasn't a real table
theme. Now "default" is a synonym for rounded.

Also tweaked the error message when a bad theme name is provided.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-16 06:14:18 -06:00
274a8366c6 tweak table example/parameter text (#11071)
# Description

This PR just tweaks the `table` example text and some parameter text.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-16 05:35:55 -06:00
a1dfc35968 Fix #11047 (#11054)
close #11047
2023-11-16 05:28:54 -06:00
5886a74ccc into binary -c: return 0 as single byte (#11068)
# Description

The `into binary` command has a `-c` flag which strips any leading 0s in
the most significant digits to represent the minimal number of bytes,
rather than the system's complete in-memory representation of the input.

However, currently giving 0 as input results in eight 0 bytes even with
the `-c` flag, which is inconsistent with the purpose of the flag.

```nu
❯ : 345678 | into binary
Length: 8 (0x8) bytes | printable whitespace ascii_other non_ascii
00000000:   4e 46 05 00  00 00 00 00                             NF•00000

❯ : 345678 | into binary -c
Length: 3 (0x3) bytes | printable whitespace ascii_other non_ascii
00000000:   4e 46 05

❯ : 0 | into binary
Length: 8 (0x8) bytes | printable whitespace ascii_other non_ascii
00000000:   00 00 00 00  00 00 00 00                             00000000

❯ : 0 | into binary -c
Length: 8 (0x8) bytes | printable whitespace ascii_other non_ascii
00000000:   00 00 00 00  00 00 00 00                             00000000
```

This change fixes this behavior so that if the entire input results in
all 0 bytes, only a single 0 byte is returned.

```nu
❯ : ~/src/nushell/target/aarch64-linux-android/debug/nu -c '0 | into binary -c'
Length: 1 (0x1) bytes | printable whitespace ascii_other non_ascii
00000000:   00
```

# User-Facing Changes

Values which result in all null bytes will be truncated to a single byte
when `-c` is given. This could potentially be considered a breaking
change if this behavior was relied upon in some way.
2023-11-16 04:09:31 -06:00
4367aa9f58 allow parsing of human readable datetimes (#11051)
# Description

This PR adds the ability to parse human readable datetime strings as
part of the `into datetime` command. I added a new `-n`/`--list-human`
parameter that produces this list to give the user an idea of what is
supported.
```nushell
❯ into datetime --list-human 
╭#─┬parseable human datetime examples┬───result───╮
│0 │Today 18:30                      │in 8 hours  │
│1 │2022-11-07 13:25:30              │a year ago  │
│2 │15:20 Friday                     │in 3 days   │
│3 │This Friday 17:00                │in 3 days   │
│4 │13:25, Next Tuesday              │in a week   │
│5 │Last Friday at 19:45             │3 days ago  │
│6 │In 3 days                        │in 2 days   │
│7 │In 2 hours                       │in 2 hours  │
│8 │10 hours and 5 minutes ago       │10 hours ago│
│9 │1 years ago                      │a year ago  │
│10│A year ago                       │a year ago  │
│11│A month ago                      │a month ago │
│12│A week ago                       │a week ago  │
│13│A day ago                        │a day ago   │
│14│An hour ago                      │an hour ago │
│15│A minute ago                     │a minute ago│
│16│A second ago                     │now         │
│17│Now                              │now         │
╰#─┴parseable human datetime examples┴───result───╯
```

Or with `$env.config.datetime_format.table` set.
```nushell
❯ into datetime --list-human 
╭#─┬parseable human datetime examples┬──────result───────╮
│0 │Today 18:30                      │11/14/23 06:30:00PM│
│1 │2022-11-07 13:25:30              │11/07/22 01:25:30PM│
│2 │15:20 Friday                     │11/17/23 03:20:00PM│
│3 │This Friday 17:00                │11/17/23 05:00:00PM│
│4 │13:25, Next Tuesday              │11/21/23 01:25:00PM│
│5 │Last Friday at 19:45             │11/10/23 07:45:00PM│
│6 │In 3 days                        │11/17/23 10:12:54AM│
│7 │In 2 hours                       │11/14/23 12:12:54PM│
│8 │10 hours and 5 minutes ago       │11/14/23 12:07:54AM│
│9 │1 years ago                      │11/13/22 10:12:54AM│
│10│A year ago                       │11/13/22 10:12:54AM│
│11│A month ago                      │10/15/23 11:12:54AM│
│12│A week ago                       │11/07/23 10:12:54AM│
│13│A day ago                        │11/13/23 10:12:54AM│
│14│An hour ago                      │11/14/23 09:12:54AM│
│15│A minute ago                     │11/14/23 10:11:54AM│
│16│A second ago                     │11/14/23 10:12:53AM│
│17│Now                              │11/14/23 10:12:54AM│
╰#─┴parseable human datetime examples┴──────result───────╯
```
# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-11-15 17:43:37 -06:00
e9c298713e nu-table/ Add -t/theme argument && Replace -n/start-number with -i/index (#11058)
ref #11054

cc: @fdncred 

I've not figured out how to make `-i` work as an optional flag for `table`
:(

```nu
~/bin/nushell> [[a b, c]; [1 [2 3 3] 3] [4 5 [1 2 [1 2 3]]]] | table -e --width=80 --theme basic -i false

+---+-------+-----------+
| a |   b   |     c     |
+---+-------+-----------+
| 1 | +---+ |         3 |
|   | | 2 | |           |
|   | +---+ |           |
|   | | 3 | |           |
|   | +---+ |           |
|   | | 3 | |           |
|   | +---+ |           |
+---+-------+-----------+
| 4 |     5 | +-------+ |
|   |       | |     1 | |
|   |       | +-------+ |
|   |       | |     2 | |
|   |       | +-------+ |
|   |       | | +---+ | |
|   |       | | | 1 | | |
|   |       | | +---+ | |
|   |       | | | 2 | | |
|   |       | | +---+ | |
|   |       | | | 3 | | |
|   |       | | +---+ | |
|   |       | +-------+ |
+---+-------+-----------+
```

```nu
~/bin/nushell> [[a b, c]; [1 [2 3 3] 3] [4 5 [1 2 [1 2 3]]]] | table -e --width=80 --theme basic -i 100

+-----+---+-------------+-----------------------+
|   # | a |      b      |           c           |
+-----+---+-------------+-----------------------+
| 100 | 1 | +-----+---+ |                     3 |
|     |   | | 100 | 2 | |                       |
|     |   | +-----+---+ |                       |
|     |   | | 101 | 3 | |                       |
|     |   | +-----+---+ |                       |
|     |   | | 102 | 3 | |                       |
|     |   | +-----+---+ |                       |
+-----+---+-------------+-----------------------+
| 101 | 4 |           5 | +-----+-------------+ |
|     |   |             | | 100 |           1 | |
|     |   |             | +-----+-------------+ |
|     |   |             | | 101 |           2 | |
|     |   |             | +-----+-------------+ |
|     |   |             | | 102 | +-----+---+ | |
|     |   |             | |     | | 100 | 1 | | |
|     |   |             | |     | +-----+---+ | |
|     |   |             | |     | | 101 | 2 | | |
|     |   |             | |     | +-----+---+ | |
|     |   |             | |     | | 102 | 3 | | |
|     |   |             | |     | +-----+---+ | |
|     |   |             | +-----+-------------+ |
+-----+---+-------------+-----------------------+
```
2023-11-15 17:41:18 -06:00
c110ddff66 Implement LSP Text Document Synchronization (#10941) 2023-11-15 17:35:48 -06:00
a806717f35 Testing support tweaks: exit status in Outcome (#10692)
This PR makes a couple of tweaks to the testing support crate:

Add the `nu` invocation's exit status to the test output so that one
can assert that nu exited with a successful code.

This PR was split off of #10232.
2023-11-15 23:50:43 +01:00
2b5f1ee5b3 Bump version to 0.87.1 (#11056) 2023-11-15 23:50:11 +01:00
429 changed files with 12907 additions and 10041 deletions

View File

@ -41,7 +41,7 @@ jobs:
- uses: actions/checkout@v4
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
with:
rustflags: ""
@ -85,7 +85,7 @@ jobs:
- uses: actions/checkout@v4
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
with:
rustflags: ""
@ -106,7 +106,7 @@ jobs:
- uses: actions/checkout@v4
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
with:
rustflags: ""
@ -141,7 +141,7 @@ jobs:
- uses: actions/checkout@v4
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
with:
rustflags: ""

View File

@ -135,7 +135,7 @@ jobs:
echo "targets = ['${{matrix.target}}']" >> rust-toolchain.toml
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
# WARN: Keep the rustflags to prevent from the winget submission error: `CAQuietExec: Error 0xc0000135`
with:
rustflags: ''
@ -249,7 +249,7 @@ jobs:
echo "targets = ['${{matrix.target}}']" >> rust-toolchain.toml
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
# WARN: Keep the rustflags to prevent from the winget submission error: `CAQuietExec: Error 0xc0000135`
with:
rustflags: ''

View File

@ -138,8 +138,6 @@ print $'(char nl)Copying release files...'; hr-line
> register ./nu_plugin_query" | save $'($dist)/README.txt' -f
[LICENSE $executable] | each {|it| cp -rv $it $dist } | flatten
# Sleep a few seconds to make sure the cp process finished successfully
sleep 3sec
print $'(char nl)Check binary release version detail:'; hr-line
let ver = if $os == 'windows-latest' {
@ -148,7 +146,7 @@ let ver = if $os == 'windows-latest' {
(do -i { ./output/nu -c 'version' }) | str join
}
if ($ver | str trim | is-empty) {
print $'(ansi r)Incompatible nu binary...(ansi reset)'
print $'(ansi r)Incompatible Nu binary: The binary cross compiled is not runnable on current arch...(ansi reset)'
} else { print $ver }
# ----------------------------------------------------------------------------

View File

@ -79,7 +79,7 @@ jobs:
echo "targets = ['${{matrix.target}}']" >> rust-toolchain.toml
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
# WARN: Keep the rustflags to prevent from the winget submission error: `CAQuietExec: Error 0xc0000135`
with:
rustflags: ''
@ -170,7 +170,7 @@ jobs:
echo "targets = ['${{matrix.target}}']" >> rust-toolchain.toml
- name: Setup Rust toolchain and cache
uses: actions-rust-lang/setup-rust-toolchain@v1.5.0
uses: actions-rust-lang/setup-rust-toolchain@v1.6.0
# WARN: Keep the rustflags to prevent from the winget submission error: `CAQuietExec: Error 0xc0000135`
with:
rustflags: ''

View File

@ -10,11 +10,16 @@ Welcome to Nushell and thank you for considering contributing!
- [Useful commands](#useful-commands)
- [Debugging tips](#debugging-tips)
- [Git etiquette](#git-etiquette)
- [Our Rust style](#our-rust-style)
- [Generally discouraged](#generally-discouraged)
- [Things we want to get better at](#things-we-want-to-get-better-at)
- [License](#license)
## Other helpful resources
More resources can be found in the nascent [developer documentation](devdocs/README.md) in this repo.
- [Developer FAQ](FAQ.md)
- [Platform support policy](PLATFORM_SUPPORT.md)
- [Our Rust style](devdocs/rust_style.md)
## Proposing design changes
First of all, before diving into the code, if you want to create a new feature, change something significantly, and especially if the change is user-facing, it is a good practice to first get an approval from the core team before starting to work on it.
@ -63,74 +68,74 @@ Read cargo's documentation for more details: https://doc.rust-lang.org/cargo/ref
- Build and run Nushell:
```shell
```nushell
cargo run
```
- Build and run with dataframe support.
```shell
```nushell
cargo run --features=dataframe
```
- Run Clippy on Nushell:
```shell
```nushell
cargo clippy --workspace -- -D warnings -D clippy::unwrap_used
```
or via the `toolkit.nu` command:
```shell
```nushell
use toolkit.nu clippy
clippy
```
- Run all tests:
```shell
```nushell
cargo test --workspace
```
along with dataframe tests
```shell
```nushell
cargo test --workspace --features=dataframe
```
or via the `toolkit.nu` command:
```shell
```nushell
use toolkit.nu test
test
```
- Run all tests for a specific command
```shell
```nushell
cargo test --package nu-cli --test main -- commands::<command_name_here>
```
- Check to see if there are code formatting issues
```shell
```nushell
cargo fmt --all -- --check
```
or via the `toolkit.nu` command:
```shell
```nushell
use toolkit.nu fmt
fmt --check
```
- Format the code in the project
```shell
```nushell
cargo fmt --all
```
or via the `toolkit.nu` command:
```shell
```nushell
use toolkit.nu fmt
fmt
```
- Set up `git` hooks to check formatting and run `clippy` before committing and pushing:
```shell
```nushell
use toolkit.nu setup-git-hooks
setup-git-hooks
```
@ -140,12 +145,12 @@ Read cargo's documentation for more details: https://doc.rust-lang.org/cargo/ref
- To view verbose logs when developing, enable the `trace` log level.
```shell
```nushell
cargo run --release -- --log-level trace
```
- To redirect trace logs to a file, enable the `--log-target file` switch.
```shell
```nushell
cargo run --release -- --log-level trace --log-target file
open $"($nu.temp-path)/nu-($nu.pid).log"
```
@ -237,51 +242,6 @@ You can help us to make the review process a smooth experience:
- Feel free to notify your reviewers or affected PR authors if your change might cause larger conflicts with another change.
- During the rollup of multiple PRs, we may choose to resolve merge conflicts and CI failures ourselves. (Allow maintainers to push to your branch to enable us to do this quickly.)
## Our Rust style
To make collaboration on a project the scale of Nushell easy, we want to work towards a style of Rust code that can easily be understood by all of our contributors. We conservatively rely on most of [`clippy`'s suggestions](https://github.com/rust-lang/rust-clippy) to get to the holy grail of "idiomatic" code. Good code in our eyes is not the cleverest use of all available language features, nor the code with the most unique personal touch, but code that is readable and strikes a balance between being concise and being unsurprising and explicit in the places where it matters.
One example of this philosophy is that we generally avoid fighting the borrow-checker in our data model and rather try to get to a correct and simple solution first, then figure out where we should reuse data to achieve the necessary performance. As we are still pre-1.0, this has served us well, allowing us to quickly refactor or change larger parts of the code base.
### Generally discouraged
#### `+nightly` language features or things only available in the most recent `+stable`
To make life easier for the people who maintain the Nushell packages in various distributions, each with their own release cycle of `rustc`, we typically rely on slightly older Rust versions. We do not make explicit guarantees about how far back in the past we live, but you can find out in our [`rust-toolchain.toml`](https://github.com/nushell/nushell/blob/main/rust-toolchain.toml)
(As a rule of thumb, this has typically been approximately 2 releases behind the newest stable compiler.)
The use of nightly features is prohibited.
#### Panicking
As Nushell aims to provide a reliable foundational way for folks to interact with their computer, we cannot carelessly crash the execution of their work by letting Nushell panic.
Thus panicking is not an allowed error handling strategy for anything that could be triggered by user input OR behavior of the outside system. If Nushell panics, this is either a bug or we are, against all odds, already in an unrecoverable state (the system stopped cooperating, we ran out of memory). The use of `.unwrap()` is thus outright banned, and any uses of `.expect()` or related panicking macros like `unreachable!` should include a helpful description of which assumptions have been violated.
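As a minimal, hypothetical sketch of this shape (illustrative code, not taken from the Nushell tree): failures triggered by the outside system travel as `Result`, while `.expect()` is reserved for internal invariants and spells out the assumption that would have to be violated.
```rust
// Hypothetical sketch of the error-handling shape described above.
fn read_config(path: &str) -> Result<String, std::io::Error> {
    // A failure caused by the outside system is propagated, never unwrapped.
    std::fs::read_to_string(path)
}

fn prompt_prefix() -> char {
    // `.expect()` only for internal invariants, naming the violated assumption.
    "nu>".chars().next().expect("the prompt literal is never empty")
}

fn main() {
    match read_config("config.nu") {
        Ok(text) => println!("{} read {} bytes of config", prompt_prefix(), text.len()),
        Err(err) => eprintln!("could not read config: {err}"),
    }
}
```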
#### `unsafe` code
For any use of `unsafe` code we need to require even higher standards and additional review. If you add or alter `unsafe` blocks you have to be familiar with the promises you need to uphold as found in the [Rustonomicon](https://doc.rust-lang.org/nomicon/intro.html). All `unsafe` uses should include `// SAFETY:` comments explaining how the invariants are upheld, thus alerting you to what to watch out for when making a change.
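A small hypothetical sketch of the commenting convention (illustrative only, not Nushell code): the `// SAFETY:` note states exactly which check makes the `unsafe` call sound.
```rust
// Hypothetical sketch of the `// SAFETY:` comment convention described above.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        return None;
    }
    // SAFETY: we checked above that `bytes` is non-empty, so index 0 is in
    // bounds and `get_unchecked(0)` cannot read past the end of the slice.
    Some(unsafe { *bytes.get_unchecked(0) })
}

fn main() {
    assert_eq!(first_byte(b"nu"), Some(b'n'));
    assert_eq!(first_byte(b""), None);
}
```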
##### FFI with system calls and the outside world
As a shell, Nushell needs to interact with system APIs in several places, for which FFI code with `unsafe` blocks may be necessary. In some cases this can be handled by safe API wrapper crates, but in others we may choose to make those calls directly.
If you do so, you need to document the system behavior on top of the Rust memory model guarantees that you uphold. This means documenting whether a particular system call is safe to use in a particular context and that all failure cases are properly recovered.
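A hypothetical sketch of what such documentation can look like, assuming the `libc` crate as the FFI surface on a Unix target; the wrapper function name is made up for illustration.
```rust
// Hypothetical sketch (assumes the `libc` crate on a Unix target) of documenting
// system behavior on top of the Rust-level safety argument.
fn current_pid() -> u32 {
    // SAFETY: `getpid()` takes no arguments, has no preconditions, cannot fail,
    // and only reads process state from the kernel, so no invariants can be violated.
    let pid = unsafe { libc::getpid() };
    pid as u32
}

fn main() {
    println!("running as pid {}", current_pid());
}
```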
##### Implementing self-contained data structures
Another motivation for reaching for `unsafe` code might be to implement a particular data structure that is not expressible with safe `std` library APIs. Doing so in the Nushell code base would have to clear a high bar of need based on profiling results. You should also first survey the [crate ecosystem](https://crates.io) to confirm that there isn't already a usable, well-vetted crate providing safe APIs to the desired data structure.
##### Make things go faster by removing checks
If you feel tempted to do this, it is probably a bad idea. Don't.
#### Macros
Custom macros are another advanced feature that people feel tempted to use to work around perceived limitations of Rust's syntax, and we are not particularly fond of them.
They have clear downsides, not only in terms of readability when they locally introduce a different syntax. Most tooling apart from the compiler will struggle more with them. This limits, for example, consistent automatic formatting or automated refactors with `rust-analyzer`.
You are less likely to be able to read `macro_rules!` fluently than regular code. This can lead people to introduce funky behavior when using a macro, be it because the macro does not follow proper hygiene rules or because it introduces excessive work at compile time.
So we generally discourage the addition of macros. In a lot of cases your macro may do something that can be expressed with functions or generics in a much more reusable fashion.
The only exceptions we may allow need to demonstrate that the macro can fix something that is otherwise extremely unreadable, error-prone, or consistently worse at compile time.
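As a hypothetical sketch of that trade-off (names are illustrative only), a plain generic function frequently covers what a small helper macro would, while staying transparent to `rustfmt` and `rust-analyzer`.
```rust
// Hypothetical sketch: a generic function instead of a small helper macro.
fn quote_all<S: AsRef<str>>(items: &[S]) -> Vec<String> {
    items
        .iter()
        .map(|s| format!("\"{}\"", s.as_ref()))
        .collect()
}

fn main() {
    let quoted = quote_all(&["ls", "open", "where"]);
    assert_eq!(quoted, vec!["\"ls\"", "\"open\"", "\"where\""]);
}
```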
### Things we want to get better at
These are things we did pretty liberally to get Nushell off the ground but that make it harder to ship a high-quality, stable product. You may run across them, but you shouldn't take them as an endorsed example.
#### Liberal use of third-party dependencies
The amazing variety of crates on [crates.io](https://crates.io) allowed us to get Nushell into a feature-rich state quickly, but it left us with a bunch of baggage to clean up.
Each dependency introduces a compile-time cost, and duplicated code can add to the overall binary size. Vetting dependencies for correct and secure implementations also takes an unreasonable amount of time, as it is a continuous process of reacting to updates or potential vulnerabilities.
Thus we only want to accept dependencies that are essential, well-tested implementations of a particular requirement of Nushell's codebase.
Also, as part of the move to 1.0, we will try to unify sets of dependencies that implement similar things in an area. We don't need three different crates, each a potentially perfect fit for one of three problems, but rather one reliable crate with maximal overlap between what it provides and what we need.
We will favor crates that are well tested, widely used, promise to be stable, and are still frequently maintained.
#### Deeply nested code
As Nushell uses a lot of enums in its internal data representation, there are a lot of `match` expressions. Combined with the need to handle many edge cases and be defensive about any errors, this has led to some deeply nested, genuinely hard-to-read code (e.g. in the parser but also in the implementation of several commands).
This can be observed either as a "rightward drift", where the main part of the code is found after many levels of indentation, or as long function bodies with several layers of branching and seemingly repeated branches inside the higher branch levels.
This can also be exacerbated by "quick" bugfixes/enhancements that just add a special case to catch a previously unexpected condition. The likelihood of introducing a bug in a sea of code duplication is high.
To combat this, consider using the early-`return` pattern to reject invalid data early in one place, instead of building a tree through Rust's expression constructs with a lot of duplicated paths. Unpacking data into a type that expresses that the necessary checks have already been done, and using functions to properly separate common and distinct behavior, can also help.
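A hypothetical sketch of the early-`return` pattern (illustrative only, not from the code base): each invalid case is rejected up front, so the happy path stays flat.
```rust
// Hypothetical sketch of the early-`return` pattern: reject bad input up front
// instead of nesting the happy path inside several layers of `match`/`if`.
fn parse_port(input: &str) -> Result<u16, String> {
    let trimmed = input.trim();
    if trimmed.is_empty() {
        return Err("no port given".into());
    }
    let port: u16 = trimmed
        .parse()
        .map_err(|err| format!("'{trimmed}' is not a valid port: {err}"))?;
    if port == 0 {
        return Err("port 0 is reserved".into());
    }
    // Only the validated value reaches this point.
    Ok(port)
}

fn main() {
    assert_eq!(parse_port(" 8080 "), Ok(8080));
    assert!(parse_port("0").is_err());
    assert!(parse_port("not-a-port").is_err());
}
```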
## License
We use the [MIT License](https://github.com/nushell/nushell/blob/main/LICENSE) in all of our Nushell projects. If you are including or referencing a crate that uses the [GPL License](https://www.gnu.org/licenses/gpl-3.0.en.html#license-text) unfortunately we will not be able to accept your PR.

Cargo.lock (generated, 1449 changed lines): file diff suppressed because it is too large.


@ -10,8 +10,8 @@ homepage = "https://www.nushell.sh"
license = "MIT"
name = "nu"
repository = "https://github.com/nushell/nushell"
rust-version = "1.60"
version = "0.87.0"
rust-version = "1.72.1"
version = "0.88.1"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -47,29 +47,29 @@ members = [
]
[dependencies]
nu-cli = { path = "./crates/nu-cli", version = "0.87.0" }
nu-color-config = { path = "./crates/nu-color-config", version = "0.87.0" }
nu-cmd-base = { path = "./crates/nu-cmd-base", version = "0.87.0" }
nu-cmd-lang = { path = "./crates/nu-cmd-lang", version = "0.87.0" }
nu-cmd-dataframe = { path = "./crates/nu-cmd-dataframe", version = "0.87.0", features = ["dataframe"], optional = true }
nu-cmd-extra = { path = "./crates/nu-cmd-extra", version = "0.87.0", optional = true }
nu-command = { path = "./crates/nu-command", version = "0.87.0" }
nu-engine = { path = "./crates/nu-engine", version = "0.87.0" }
nu-explore = { path = "./crates/nu-explore", version = "0.87.0" }
nu-json = { path = "./crates/nu-json", version = "0.87.0" }
nu-lsp = { path = "./crates/nu-lsp/", version = "0.87.0" }
nu-parser = { path = "./crates/nu-parser", version = "0.87.0" }
nu-path = { path = "./crates/nu-path", version = "0.87.0" }
nu-plugin = { path = "./crates/nu-plugin", optional = true, version = "0.87.0" }
nu-pretty-hex = { path = "./crates/nu-pretty-hex", version = "0.87.0" }
nu-protocol = { path = "./crates/nu-protocol", version = "0.87.0" }
nu-system = { path = "./crates/nu-system", version = "0.87.0" }
nu-table = { path = "./crates/nu-table", version = "0.87.0" }
nu-term-grid = { path = "./crates/nu-term-grid", version = "0.87.0" }
nu-std = { path = "./crates/nu-std", version = "0.87.0" }
nu-utils = { path = "./crates/nu-utils", version = "0.87.0" }
nu-cli = { path = "./crates/nu-cli", version = "0.88.1" }
nu-color-config = { path = "./crates/nu-color-config", version = "0.88.1" }
nu-cmd-base = { path = "./crates/nu-cmd-base", version = "0.88.1" }
nu-cmd-lang = { path = "./crates/nu-cmd-lang", version = "0.88.1" }
nu-cmd-dataframe = { path = "./crates/nu-cmd-dataframe", version = "0.88.1", features = ["dataframe"], optional = true }
nu-cmd-extra = { path = "./crates/nu-cmd-extra", version = "0.88.1", optional = true }
nu-command = { path = "./crates/nu-command", version = "0.88.1" }
nu-engine = { path = "./crates/nu-engine", version = "0.88.1" }
nu-explore = { path = "./crates/nu-explore", version = "0.88.1" }
nu-json = { path = "./crates/nu-json", version = "0.88.1" }
nu-lsp = { path = "./crates/nu-lsp/", version = "0.88.1" }
nu-parser = { path = "./crates/nu-parser", version = "0.88.1" }
nu-path = { path = "./crates/nu-path", version = "0.88.1" }
nu-plugin = { path = "./crates/nu-plugin", optional = true, version = "0.88.1" }
nu-pretty-hex = { path = "./crates/nu-pretty-hex", version = "0.88.1" }
nu-protocol = { path = "./crates/nu-protocol", version = "0.88.1" }
nu-system = { path = "./crates/nu-system", version = "0.88.1" }
nu-table = { path = "./crates/nu-table", version = "0.88.1" }
nu-term-grid = { path = "./crates/nu-term-grid", version = "0.88.1" }
nu-std = { path = "./crates/nu-std", version = "0.88.1" }
nu-utils = { path = "./crates/nu-utils", version = "0.88.1" }
nu-ansi-term = "0.49.0"
reedline = { version = "0.26.0", features = ["bashisms", "sqlite"] }
reedline = { version = "0.27.0", features = ["bashisms", "sqlite"] }
crossterm = "0.27"
ctrlc = "3.4"
@ -97,7 +97,7 @@ nix = { version = "0.27", default-features = false, features = [
] }
[dev-dependencies]
nu-test-support = { path = "./crates/nu-test-support", version = "0.87.0" }
nu-test-support = { path = "./crates/nu-test-support", version = "0.88.1" }
assert_cmd = "2.0"
criterion = "0.5"
pretty_assertions = "1.4"


@ -54,7 +54,7 @@ Detailed installation instructions can be found in the [installation chapter of
[![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions)
For details about which platforms the Nushell team actively supports, see [our platform support policy](PLATFORM_SUPPORT.md).
For details about which platforms the Nushell team actively supports, see [our platform support policy](devdocs/PLATFORM_SUPPORT.md).
## Configuration
@ -199,7 +199,7 @@ topics that have been presented.
Nu adheres closely to a set of goals that make up its design philosophy. As features are added, they are checked against these goals.
- First and foremost, Nu is cross-platform. Commands and techniques should work across platforms and Nu has [first-class support for Windows, macOS, and Linux](PLATFORM_SUPPORT.md).
- First and foremost, Nu is cross-platform. Commands and techniques should work across platforms and Nu has [first-class support for Windows, macOS, and Linux](devdocs/PLATFORM_SUPPORT.md).
- Nu ensures compatibility with existing platform-specific executables.


@ -4,10 +4,35 @@ use nu_parser::parse;
use nu_plugin::{EncodingType, PluginResponse};
use nu_protocol::{engine::EngineState, PipelineData, Span, Value};
use nu_utils::{get_default_config, get_default_env};
use std::path::{Path, PathBuf};
fn load_bench_commands() -> EngineState {
nu_command::add_shell_command_context(nu_cmd_lang::create_default_context())
}
fn canonicalize_path(engine_state: &EngineState, path: &Path) -> PathBuf {
let cwd = engine_state.current_work_dir();
if path.exists() {
match nu_path::canonicalize_with(path, cwd) {
Ok(canon_path) => canon_path,
Err(_) => path.to_owned(),
}
} else {
path.to_owned()
}
}
fn get_home_path(engine_state: &EngineState) -> PathBuf {
let home_path = if let Some(path) = nu_path::home_dir() {
let canon_home_path = canonicalize_path(engine_state, &path);
canon_home_path
} else {
std::path::PathBuf::new()
};
home_path
}
// FIXME: All benchmarks live in this 1 file to speed up build times when benchmarking.
// When the *_benchmarks functions were in different files, `cargo bench` would build
// an executable for every single one - incredibly slowly. Would be nice to figure out
@ -15,10 +40,12 @@ fn load_bench_commands() -> EngineState {
fn parser_benchmarks(c: &mut Criterion) {
let mut engine_state = load_bench_commands();
// parsing config.nu breaks without PWD set
let home_path = get_home_path(&engine_state);
// parsing config.nu breaks without PWD set, so set a valid path
engine_state.add_env_var(
"PWD".into(),
Value::string("/some/dir".to_string(), Span::test_data()),
Value::string(home_path.to_string_lossy(), Span::test_data()),
);
let default_env = get_default_env().as_bytes();
@ -41,7 +68,6 @@ fn parser_benchmarks(c: &mut Criterion) {
c.bench_function("eval default_env.nu", |b| {
b.iter(|| {
let mut engine_state = load_bench_commands();
let mut stack = nu_protocol::engine::Stack::new();
eval_source(
&mut engine_state,
@ -56,12 +82,6 @@ fn parser_benchmarks(c: &mut Criterion) {
c.bench_function("eval default_config.nu", |b| {
b.iter(|| {
let mut engine_state = load_bench_commands();
// parsing config.nu breaks without PWD set
engine_state.add_env_var(
"PWD".into(),
Value::string("/some/dir".to_string(), Span::test_data()),
);
let mut stack = nu_protocol::engine::Stack::new();
eval_source(
&mut engine_state,
@ -76,9 +96,17 @@ fn parser_benchmarks(c: &mut Criterion) {
}
fn eval_benchmarks(c: &mut Criterion) {
let mut engine_state = load_bench_commands();
let home_path = get_home_path(&engine_state);
// parsing config.nu breaks without PWD set, so set a valid path
engine_state.add_env_var(
"PWD".into(),
Value::string(home_path.to_string_lossy(), Span::test_data()),
);
c.bench_function("eval default_env.nu", |b| {
b.iter(|| {
let mut engine_state = load_bench_commands();
let mut stack = nu_protocol::engine::Stack::new();
eval_source(
&mut engine_state,
@ -93,12 +121,6 @@ fn eval_benchmarks(c: &mut Criterion) {
c.bench_function("eval default_config.nu", |b| {
b.iter(|| {
let mut engine_state = load_bench_commands();
// parsing config.nu breaks without PWD set
engine_state.add_env_var(
"PWD".into(),
Value::string("/some/dir".to_string(), Span::test_data()),
);
let mut stack = nu_protocol::engine::Stack::new();
eval_source(
&mut engine_state,


@ -5,27 +5,27 @@ repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cli"
edition = "2021"
license = "MIT"
name = "nu-cli"
version = "0.87.0"
version = "0.88.1"
[lib]
bench = false
[dev-dependencies]
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.87.0" }
nu-command = { path = "../nu-command", version = "0.87.0" }
nu-test-support = { path = "../nu-test-support", version = "0.87.0" }
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.88.1" }
nu-command = { path = "../nu-command", version = "0.88.1" }
nu-test-support = { path = "../nu-test-support", version = "0.88.1" }
rstest = { version = "0.18.1", default-features = false }
[dependencies]
nu-cmd-base = { path = "../nu-cmd-base", version = "0.87.0" }
nu-engine = { path = "../nu-engine", version = "0.87.0" }
nu-path = { path = "../nu-path", version = "0.87.0" }
nu-parser = { path = "../nu-parser", version = "0.87.0" }
nu-protocol = { path = "../nu-protocol", version = "0.87.0" }
nu-utils = { path = "../nu-utils", version = "0.87.0" }
nu-color-config = { path = "../nu-color-config", version = "0.87.0" }
nu-cmd-base = { path = "../nu-cmd-base", version = "0.88.1" }
nu-engine = { path = "../nu-engine", version = "0.88.1" }
nu-path = { path = "../nu-path", version = "0.88.1" }
nu-parser = { path = "../nu-parser", version = "0.88.1" }
nu-protocol = { path = "../nu-protocol", version = "0.88.1" }
nu-utils = { path = "../nu-utils", version = "0.88.1" }
nu-color-config = { path = "../nu-color-config", version = "0.88.1" }
nu-ansi-term = "0.49.0"
reedline = { version = "0.26.0", features = ["bashisms", "sqlite"] }
reedline = { version = "0.27.0", features = ["bashisms", "sqlite"] }
chrono = { default-features = false, features = ["std"], version = "0.4" }
crossterm = "0.27"
@ -39,7 +39,8 @@ percent-encoding = "2"
pathdiff = "0.2"
sysinfo = "0.29"
unicode-segmentation = "1.10"
uuid = { version = "1.5.0", features = ["v4"] }
uuid = { version = "1.6.0", features = ["v4"] }
which = "5.0.0"
[features]
plugin = []


@ -107,7 +107,7 @@ impl Command for History {
)
})
})
.ok_or(ShellError::FileNotFound(head))?
.ok_or(ShellError::FileNotFound { span: head })?
.into_pipeline_data(ctrlc)),
HistoryFileFormat::Sqlite => Ok(history_reader
.and_then(|h| {
@ -119,12 +119,12 @@ impl Command for History {
create_history_record(idx, entry, long, head)
})
})
.ok_or(ShellError::FileNotFound(head))?
.ok_or(ShellError::FileNotFound { span: head })?
.into_pipeline_data(ctrlc)),
}
}
} else {
Err(ShellError::FileNotFound(head))
Err(ShellError::FileNotFound { span: head })
}
}


@ -45,13 +45,13 @@ impl Command for KeybindingsListen {
Ok(v) => Ok(v.into_pipeline_data()),
Err(e) => {
terminal::disable_raw_mode()?;
Err(ShellError::GenericError(
"Error with input".to_string(),
"".to_string(),
None,
Some(e.to_string()),
Vec::new(),
))
Err(ShellError::GenericError {
error: "Error with input".into(),
msg: "".into(),
span: None,
help: Some(e.to_string()),
inner: vec![],
})
}
}
}


@ -67,7 +67,7 @@ impl NuCompleter {
let mut callee_stack = stack.gather_captures(&self.engine_state, &block.captures);
// Line
if let Some(pos_arg) = block.signature.required_positional.get(0) {
if let Some(pos_arg) = block.signature.required_positional.first() {
if let Some(var_id) = pos_arg.var_id {
callee_stack.add_var(
var_id,
@ -122,11 +122,13 @@ impl NuCompleter {
for pipeline_element in pipeline.elements {
match pipeline_element {
PipelineElement::Expression(_, expr)
| PipelineElement::Redirection(_, _, expr)
| PipelineElement::Redirection(_, _, expr, _)
| PipelineElement::And(_, expr)
| PipelineElement::Or(_, expr)
| PipelineElement::SameTargetRedirection { cmd: (_, expr), .. }
| PipelineElement::SeparateRedirection { out: (_, expr), .. } => {
| PipelineElement::SeparateRedirection {
out: (_, expr, _), ..
} => {
let flattened: Vec<_> = flatten_expression(&working_set, &expr);
let mut spans: Vec<String> = vec![];
@ -141,18 +143,24 @@ impl NuCompleter {
let current_span = working_set.get_span_contents(flat.0).to_vec();
let current_span_str = String::from_utf8_lossy(&current_span);
let is_last_span = pos >= flat.0.start && pos < flat.0.end;
// Skip the last 'a' as span item
if flat_idx == flattened.len() - 1 {
let mut chars = current_span_str.chars();
chars.next_back();
let current_span_str = chars.as_str().to_owned();
spans.push(current_span_str.to_string());
if is_last_span {
let offset = pos - flat.0.start;
if offset == 0 {
spans.push(String::new())
} else {
let mut current_span_str = current_span_str.to_string();
current_span_str.remove(offset);
spans.push(current_span_str);
}
} else {
spans.push(current_span_str.to_string());
}
// Complete based on the last span
if pos >= flat.0.start && pos < flat.0.end {
if is_last_span {
// Context variables
let most_left_var =
most_left_variable(flat_idx, &working_set, flattened.clone());


@ -35,10 +35,10 @@ pub fn evaluate_file(
let working_set = StateWorkingSet::new(engine_state);
report_error(
&working_set,
&ShellError::FileNotFoundCustom(
format!("Could not access file '{}': {:?}", path, e.to_string()),
Span::unknown(),
),
&ShellError::FileNotFoundCustom {
msg: format!("Could not access file '{}': {:?}", path, e.to_string()),
span: Span::unknown(),
},
);
std::process::exit(1);
});
@ -47,13 +47,13 @@ pub fn evaluate_file(
let working_set = StateWorkingSet::new(engine_state);
report_error(
&working_set,
&ShellError::NonUtf8Custom(
format!(
&ShellError::NonUtf8Custom {
msg: format!(
"Input file name '{}' is not valid UTF8",
file_path.to_string_lossy()
),
Span::unknown(),
),
span: Span::unknown(),
},
);
std::process::exit(1);
});
@ -64,14 +64,14 @@ pub fn evaluate_file(
let working_set = StateWorkingSet::new(engine_state);
report_error(
&working_set,
&ShellError::FileNotFoundCustom(
format!(
&ShellError::FileNotFoundCustom {
msg: format!(
"Could not read file '{}': {:?}",
file_path_str,
e.to_string()
),
Span::unknown(),
),
span: Span::unknown(),
},
);
std::process::exit(1);
});
@ -82,10 +82,10 @@ pub fn evaluate_file(
let working_set = StateWorkingSet::new(engine_state);
report_error(
&working_set,
&ShellError::FileNotFoundCustom(
format!("The file path '{file_path_str}' does not have a parent"),
Span::unknown(),
),
&ShellError::FileNotFoundCustom {
msg: format!("The file path '{file_path_str}' does not have a parent"),
span: Span::unknown(),
},
);
std::process::exit(1);
});
@ -98,6 +98,10 @@ pub fn evaluate_file(
"CURRENT_FILE".to_string(),
Value::string(file_path.to_string_lossy(), Span::unknown()),
);
stack.add_env_var(
"PROCESS_PATH".to_string(),
Value::string(path, Span::unknown()),
);
let source_filename = file_path
.file_name()
@ -135,7 +139,7 @@ pub fn evaluate_file(
false,
);
let pipeline_data = match pipeline_data {
Err(ShellError::Return(_, _)) => {
Err(ShellError::Return { .. }) => {
// allows early exits before `main` is run.
return Ok(());
}


@ -139,18 +139,18 @@ fn add_menu(
"columnar" => add_columnar_menu(line_editor, menu, engine_state, stack, config),
"list" => add_list_menu(line_editor, menu, engine_state, stack, config),
"description" => add_description_menu(line_editor, menu, engine_state, stack, config),
_ => Err(ShellError::UnsupportedConfigValue(
"columnar, list or description".to_string(),
menu.menu_type.into_abbreviated_string(config),
menu.menu_type.span(),
)),
_ => Err(ShellError::UnsupportedConfigValue {
expected: "columnar, list or description".to_string(),
value: menu.menu_type.into_abbreviated_string(config),
span: menu.menu_type.span(),
}),
}
} else {
Err(ShellError::UnsupportedConfigValue(
"only record type".to_string(),
menu.menu_type.into_abbreviated_string(config),
menu.menu_type.span(),
))
Err(ShellError::UnsupportedConfigValue {
expected: "only record type".to_string(),
value: menu.menu_type.into_abbreviated_string(config),
span: menu.menu_type.span(),
})
}
}
@ -261,11 +261,11 @@ pub(crate) fn add_columnar_menu(
completer: Box::new(menu_completer),
}))
}
_ => Err(ShellError::UnsupportedConfigValue(
"block or omitted value".to_string(),
menu.source.into_abbreviated_string(config),
_ => Err(ShellError::UnsupportedConfigValue {
expected: "block or omitted value".to_string(),
value: menu.source.into_abbreviated_string(config),
span,
)),
}),
}
}
@ -343,11 +343,11 @@ pub(crate) fn add_list_menu(
completer: Box::new(menu_completer),
}))
}
_ => Err(ShellError::UnsupportedConfigValue(
"block or omitted value".to_string(),
menu.source.into_abbreviated_string(config),
menu.source.span(),
)),
_ => Err(ShellError::UnsupportedConfigValue {
expected: "block or omitted value".to_string(),
value: menu.source.into_abbreviated_string(config),
span: menu.source.span(),
}),
}
}
@ -461,11 +461,11 @@ pub(crate) fn add_description_menu(
completer: Box::new(menu_completer),
}))
}
_ => Err(ShellError::UnsupportedConfigValue(
"closure or omitted value".to_string(),
menu.source.into_abbreviated_string(config),
menu.source.span(),
)),
_ => Err(ShellError::UnsupportedConfigValue {
expected: "closure or omitted value".to_string(),
value: menu.source.into_abbreviated_string(config),
span: menu.source.span(),
}),
}
}
@ -580,11 +580,11 @@ fn add_keybinding(
"emacs" => add_parsed_keybinding(emacs_keybindings, keybinding, config),
"vi_insert" => add_parsed_keybinding(insert_keybindings, keybinding, config),
"vi_normal" => add_parsed_keybinding(normal_keybindings, keybinding, config),
m => Err(ShellError::UnsupportedConfigValue(
"emacs, vi_insert or vi_normal".to_string(),
m.to_string(),
m => Err(ShellError::UnsupportedConfigValue {
expected: "emacs, vi_insert or vi_normal".to_string(),
value: m.to_string(),
span,
)),
}),
},
Value::List { vals, .. } => {
for inner_mode in vals {
@ -600,11 +600,11 @@ fn add_keybinding(
Ok(())
}
v => Err(ShellError::UnsupportedConfigValue(
"string or list of strings".to_string(),
v.into_abbreviated_string(config),
v.span(),
)),
v => Err(ShellError::UnsupportedConfigValue {
expected: "string or list of strings".to_string(),
value: v.into_abbreviated_string(config),
span: v.span(),
}),
}
}
@ -630,11 +630,11 @@ fn add_parsed_keybinding(
KeyModifiers::CONTROL | KeyModifiers::ALT | KeyModifiers::SHIFT
}
_ => {
return Err(ShellError::UnsupportedConfigValue(
"CONTROL, SHIFT, ALT or NONE".to_string(),
keybinding.modifier.into_abbreviated_string(config),
keybinding.modifier.span(),
))
return Err(ShellError::UnsupportedConfigValue {
expected: "CONTROL, SHIFT, ALT or NONE".to_string(),
value: keybinding.modifier.into_abbreviated_string(config),
span: keybinding.modifier.span(),
})
}
};
@ -654,11 +654,11 @@ fn add_parsed_keybinding(
let char = if let (Some(char), None) = (pos1, pos2) {
char
} else {
return Err(ShellError::UnsupportedConfigValue(
"char_<CHAR: unicode codepoint>".to_string(),
c.to_string(),
keybinding.keycode.span(),
));
return Err(ShellError::UnsupportedConfigValue {
expected: "char_<CHAR: unicode codepoint>".to_string(),
value: c.to_string(),
span: keybinding.keycode.span(),
});
};
KeyCode::Char(char)
@ -681,21 +681,21 @@ fn add_parsed_keybinding(
.parse()
.ok()
.filter(|num| matches!(num, 1..=20))
.ok_or(ShellError::UnsupportedConfigValue(
"(f1|f2|...|f20)".to_string(),
format!("unknown function key: {c}"),
keybinding.keycode.span(),
))?;
.ok_or(ShellError::UnsupportedConfigValue {
expected: "(f1|f2|...|f20)".to_string(),
value: format!("unknown function key: {c}"),
span: keybinding.keycode.span(),
})?;
KeyCode::F(fn_num)
}
"null" => KeyCode::Null,
"esc" | "escape" => KeyCode::Esc,
_ => {
return Err(ShellError::UnsupportedConfigValue(
"crossterm KeyCode".to_string(),
keybinding.keycode.into_abbreviated_string(config),
keybinding.keycode.span(),
))
return Err(ShellError::UnsupportedConfigValue {
expected: "crossterm KeyCode".to_string(),
value: keybinding.keycode.into_abbreviated_string(config),
span: keybinding.keycode.span(),
})
}
};
if let Some(event) = parse_event(&keybinding.event, config)? {
@ -719,7 +719,10 @@ impl<'config> EventType<'config> {
.map(Self::Send)
.or_else(|_| extract_value("edit", record, span).map(Self::Edit))
.or_else(|_| extract_value("until", record, span).map(Self::Until))
.map_err(|_| ShellError::MissingConfigValue("send, edit or until".to_string(), span))
.map_err(|_| ShellError::MissingConfigValue {
missing_value: "send, edit or until".to_string(),
span,
})
}
}
@ -749,11 +752,11 @@ fn parse_event(value: &Value, config: &Config) -> Result<Option<ReedlineEvent>,
.iter()
.map(|value| match parse_event(value, config) {
Ok(inner) => match inner {
None => Err(ShellError::UnsupportedConfigValue(
"List containing valid events".to_string(),
"Nothing value (null)".to_string(),
value.span(),
)),
None => Err(ShellError::UnsupportedConfigValue {
expected: "List containing valid events".to_string(),
value: "Nothing value (null)".to_string(),
span: value.span(),
}),
Some(event) => Ok(event),
},
Err(e) => Err(e),
@ -762,11 +765,11 @@ fn parse_event(value: &Value, config: &Config) -> Result<Option<ReedlineEvent>,
Ok(Some(ReedlineEvent::UntilFound(events)))
}
v => Err(ShellError::UnsupportedConfigValue(
"list of events".to_string(),
v.into_abbreviated_string(config),
v.span(),
)),
v => Err(ShellError::UnsupportedConfigValue {
expected: "list of events".to_string(),
value: v.into_abbreviated_string(config),
span: v.span(),
}),
},
},
Value::List { vals, .. } => {
@ -774,11 +777,11 @@ fn parse_event(value: &Value, config: &Config) -> Result<Option<ReedlineEvent>,
.iter()
.map(|value| match parse_event(value, config) {
Ok(inner) => match inner {
None => Err(ShellError::UnsupportedConfigValue(
"List containing valid events".to_string(),
"Nothing value (null)".to_string(),
value.span(),
)),
None => Err(ShellError::UnsupportedConfigValue {
expected: "List containing valid events".to_string(),
value: "Nothing value (null)".to_string(),
span: value.span(),
}),
Some(event) => Ok(event),
},
Err(e) => Err(e),
@ -788,11 +791,11 @@ fn parse_event(value: &Value, config: &Config) -> Result<Option<ReedlineEvent>,
Ok(Some(ReedlineEvent::Multiple(events)))
}
Value::Nothing { .. } => Ok(None),
v => Err(ShellError::UnsupportedConfigValue(
"record or list of records, null to unbind key".to_string(),
v.into_abbreviated_string(config),
v.span(),
)),
v => Err(ShellError::UnsupportedConfigValue {
expected: "record or list of records, null to unbind key".to_string(),
value: v.into_abbreviated_string(config),
span: v.span(),
}),
}
}
@ -840,11 +843,11 @@ fn event_from_record(
ReedlineEvent::ExecuteHostCommand(cmd.into_string("", config))
}
v => {
return Err(ShellError::UnsupportedConfigValue(
"Reedline event".to_string(),
v.to_string(),
return Err(ShellError::UnsupportedConfigValue {
expected: "Reedline event".to_string(),
value: v.to_string(),
span,
))
})
}
};
@ -954,11 +957,11 @@ fn edit_from_record(
}
"complete" => EditCommand::Complete,
e => {
return Err(ShellError::UnsupportedConfigValue(
"reedline EditCommand".to_string(),
e.to_string(),
return Err(ShellError::UnsupportedConfigValue {
expected: "reedline EditCommand".to_string(),
value: e.to_string(),
span,
))
})
}
};
@ -971,7 +974,10 @@ fn extract_char(value: &Value, config: &Config) -> Result<char, ShellError> {
.into_string("", config)
.chars()
.next()
.ok_or_else(|| ShellError::MissingConfigValue("char to insert".to_string(), span))
.ok_or_else(|| ShellError::MissingConfigValue {
missing_value: "char to insert".to_string(),
span,
})
}
#[cfg(test)]
@ -1101,6 +1107,6 @@ mod test {
let span = Span::test_data();
let b = EventType::try_from_record(&event, span);
assert!(matches!(b, Err(ShellError::MissingConfigValue(_, _))));
assert!(matches!(b, Err(ShellError::MissingConfigValue { .. })));
}
}


@ -502,10 +502,10 @@ pub fn evaluate_repl(
report_error(
&working_set,
&ShellError::DirectoryNotFound(
tokens.0[0].span,
path.to_string_lossy().to_string(),
),
&ShellError::DirectoryNotFound {
dir: path.to_string_lossy().to_string(),
span: tokens.0[0].span,
},
);
}
let path = nu_path::canonicalize_with(path, &cwd)
@ -773,23 +773,21 @@ pub fn get_command_finished_marker(stack: &Stack, engine_state: &EngineState) ->
}
fn run_ansi_sequence(seq: &str) -> Result<(), ShellError> {
io::stdout().write_all(seq.as_bytes()).map_err(|e| {
ShellError::GenericError(
"Error writing ansi sequence".into(),
e.to_string(),
Some(Span::unknown()),
None,
Vec::new(),
)
io::stdout()
.write_all(seq.as_bytes())
.map_err(|e| ShellError::GenericError {
error: "Error writing ansi sequence".into(),
msg: e.to_string(),
span: Some(Span::unknown()),
help: None,
inner: vec![],
})?;
io::stdout().flush().map_err(|e| {
ShellError::GenericError(
"Error flushing stdio".into(),
e.to_string(),
Some(Span::unknown()),
None,
Vec::new(),
)
io::stdout().flush().map_err(|e| ShellError::GenericError {
error: "Error flushing stdio".into(),
msg: e.to_string(),
span: Some(Span::unknown()),
help: None,
inner: vec![],
})
}


@ -2,7 +2,7 @@ use log::trace;
use nu_ansi_term::Style;
use nu_color_config::{get_matching_brackets_style, get_shape_color};
use nu_parser::{flatten_block, parse, FlatShape};
use nu_protocol::ast::{Argument, Block, Expr, Expression, PipelineElement};
use nu_protocol::ast::{Argument, Block, Expr, Expression, PipelineElement, RecordItem};
use nu_protocol::engine::{EngineState, StateWorkingSet};
use nu_protocol::{Config, Span};
use reedline::{Highlighter, StyledText};
@ -17,10 +17,27 @@ impl Highlighter for NuHighlighter {
fn highlight(&self, line: &str, _cursor: usize) -> StyledText {
trace!("highlighting: {}", line);
let highlight_resolved_externals =
self.engine_state.get_config().highlight_resolved_externals;
let mut working_set = StateWorkingSet::new(&self.engine_state);
let block = parse(&mut working_set, None, line.as_bytes(), false);
let (shapes, global_span_offset) = {
let shapes = flatten_block(&working_set, &block);
let mut shapes = flatten_block(&working_set, &block);
// Highlighting externals has a config point because of concerns that using which to resolve
// externals may slow down things too much.
if highlight_resolved_externals {
for (span, shape) in shapes.iter_mut() {
if *shape == FlatShape::External {
let str_contents =
working_set.get_span_contents(Span::new(span.start, span.end));
let str_word = String::from_utf8_lossy(str_contents).to_string();
if which::which(str_word).ok().is_some() {
*shape = FlatShape::ExternalResolved;
}
}
}
}
(shapes, self.engine_state.next_span_start())
};
@ -91,6 +108,7 @@ impl Highlighter for NuHighlighter {
FlatShape::InternalCall(_) => add_colored_token(&shape.1, next_token),
FlatShape::External => add_colored_token(&shape.1, next_token),
FlatShape::ExternalArg => add_colored_token(&shape.1, next_token),
FlatShape::ExternalResolved => add_colored_token(&shape.1, next_token),
FlatShape::Keyword => add_colored_token(&shape.1, next_token),
FlatShape::Literal => add_colored_token(&shape.1, next_token),
FlatShape::Operator => add_colored_token(&shape.1, next_token),
@ -234,11 +252,11 @@ fn find_matching_block_end_in_block(
for e in &p.elements {
match e {
PipelineElement::Expression(_, e)
| PipelineElement::Redirection(_, _, e)
| PipelineElement::Redirection(_, _, e, _)
| PipelineElement::And(_, e)
| PipelineElement::Or(_, e)
| PipelineElement::SameTargetRedirection { cmd: (_, e), .. }
| PipelineElement::SeparateRedirection { out: (_, e), .. } => {
| PipelineElement::SeparateRedirection { out: (_, e, _), .. } => {
if e.span.contains(global_cursor_offset) {
if let Some(pos) = find_matching_block_end_in_expr(
line,
@ -315,6 +333,7 @@ fn find_matching_block_end_in_expr(
Expr::MatchBlock(_) => None,
Expr::Nothing => None,
Expr::Garbage => None,
Expr::Spread(_) => None,
Expr::Table(hdr, rows) => {
if expr_last == global_cursor_offset {
@ -346,10 +365,17 @@ fn find_matching_block_end_in_expr(
Some(expr_last)
} else {
// cursor is inside record
for (k, v) in exprs {
for expr in exprs {
match expr {
RecordItem::Pair(k, v) => {
find_in_expr_or_continue!(k);
find_in_expr_or_continue!(v);
}
RecordItem::Spread(_, record) => {
find_in_expr_or_continue!(record);
}
}
}
None
}
}


@ -43,13 +43,13 @@ fn gather_env_vars(
let working_set = StateWorkingSet::new(engine_state);
report_error(
&working_set,
&ShellError::GenericError(
format!("Environment variable was not captured: {env_str}"),
"".to_string(),
None,
Some(msg.into()),
Vec::new(),
),
&ShellError::GenericError {
error: format!("Environment variable was not captured: {env_str}"),
msg: "".into(),
span: None,
help: Some(msg.into()),
inner: vec![],
},
);
}
@ -75,15 +75,15 @@ fn gather_env_vars(
let working_set = StateWorkingSet::new(engine_state);
report_error(
&working_set,
&ShellError::GenericError(
"Current directory is not a valid utf-8 path".to_string(),
"".to_string(),
None,
Some(format!(
&ShellError::GenericError {
error: "Current directory is not a valid utf-8 path".into(),
msg: "".into(),
span: None,
help: Some(format!(
"Retrieving current directory failed: {init_cwd:?} not a valid utf-8 path"
)),
Vec::new(),
),
inner: vec![],
},
);
}
}
@ -111,7 +111,7 @@ fn gather_env_vars(
let name = if let Some(Token {
contents: TokenContents::Item,
span,
}) = parts.get(0)
}) = parts.first()
{
let mut working_set = StateWorkingSet::new(engine_state);
let bytes = working_set.get_span_contents(*span);


@ -59,6 +59,29 @@ fn extern_completer() -> NuCompleter {
NuCompleter::new(std::sync::Arc::new(engine), stack)
}
#[fixture]
fn custom_completer() -> NuCompleter {
// Create a new engine
let (dir, _, mut engine, mut stack) = new_engine();
// Add record value as example
let record = r#"
let external_completer = {|spans|
$spans
}
$env.config.completions.external = {
enable: true
max_results: 100
completer: $external_completer
}
"#;
assert!(support::merge_input(record.as_bytes(), &mut engine, &mut stack, dir).is_ok());
// Instantiate a new completer
NuCompleter::new(std::sync::Arc::new(engine), stack)
}
#[test]
fn variables_dollar_sign_with_varialblecompletion() {
let (_, _, engine, stack) = new_engine();
@ -122,14 +145,14 @@ fn dotnu_completions() {
let suggestions = completer.complete(&completion_str, completion_str.len());
assert_eq!(1, suggestions.len());
assert_eq!("custom_completion.nu", suggestions.get(0).unwrap().value);
assert_eq!("custom_completion.nu", suggestions.first().unwrap().value);
// Test use completion
let completion_str = "use ".to_string();
let suggestions = completer.complete(&completion_str, completion_str.len());
assert_eq!(1, suggestions.len());
assert_eq!("custom_completion.nu", suggestions.get(0).unwrap().value);
assert_eq!("custom_completion.nu", suggestions.first().unwrap().value);
}
#[test]
@ -141,7 +164,7 @@ fn external_completer_trailing_space() {
let suggestions = run_external_completion(block, &input);
assert_eq!(3, suggestions.len());
assert_eq!("gh", suggestions.get(0).unwrap().value);
assert_eq!("gh", suggestions.first().unwrap().value);
assert_eq!("alias", suggestions.get(1).unwrap().value);
assert_eq!("", suggestions.get(2).unwrap().value);
}
@ -153,7 +176,7 @@ fn external_completer_no_trailing_space() {
let suggestions = run_external_completion(block, &input);
assert_eq!(2, suggestions.len());
assert_eq!("gh", suggestions.get(0).unwrap().value);
assert_eq!("gh", suggestions.first().unwrap().value);
assert_eq!("alias", suggestions.get(1).unwrap().value);
}
@ -164,7 +187,7 @@ fn external_completer_pass_flags() {
let suggestions = run_external_completion(block, &input);
assert_eq!(3, suggestions.len());
assert_eq!("gh", suggestions.get(0).unwrap().value);
assert_eq!("gh", suggestions.first().unwrap().value);
assert_eq!("api", suggestions.get(1).unwrap().value);
assert_eq!("--", suggestions.get(2).unwrap().value);
}
@ -938,6 +961,34 @@ fn extern_complete_flags(mut extern_completer: NuCompleter) {
match_suggestions(expected, suggestions);
}
#[rstest]
fn custom_completer_triggers_cursor_before_word(mut custom_completer: NuCompleter) {
let suggestions = custom_completer.complete("cmd foo bar", 8);
let expected: Vec<String> = vec!["cmd".into(), "foo".into(), "".into()];
match_suggestions(expected, suggestions);
}
#[rstest]
fn custom_completer_triggers_cursor_on_word_left_boundary(mut custom_completer: NuCompleter) {
let suggestions = custom_completer.complete("cmd foo bar", 8);
let expected: Vec<String> = vec!["cmd".into(), "foo".into(), "".into()];
match_suggestions(expected, suggestions);
}
#[rstest]
fn custom_completer_triggers_cursor_next_to_word(mut custom_completer: NuCompleter) {
let suggestions = custom_completer.complete("cmd foo bar", 11);
let expected: Vec<String> = vec!["cmd".into(), "foo".into(), "bar".into()];
match_suggestions(expected, suggestions);
}
#[rstest]
fn custom_completer_triggers_cursor_after_word(mut custom_completer: NuCompleter) {
let suggestions = custom_completer.complete("cmd foo bar ", 12);
let expected: Vec<String> = vec!["cmd".into(), "foo".into(), "bar".into(), "".into()];
match_suggestions(expected, suggestions);
}
#[ignore = "was reverted, still needs fixing"]
#[rstest]
fn alias_offset_bug_7648() {


@ -5,21 +5,21 @@ edition = "2021"
license = "MIT"
name = "nu-cmd-base"
repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-base"
version = "0.87.0"
version = "0.88.1"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
nu-engine = { path = "../nu-engine", version = "0.87.0" }
nu-glob = { path = "../nu-glob", version = "0.87.0" }
nu-parser = { path = "../nu-parser", version = "0.87.0" }
nu-path = { path = "../nu-path", version = "0.87.0" }
nu-protocol = { path = "../nu-protocol", version = "0.87.0" }
nu-utils = { path = "../nu-utils", version = "0.87.0" }
nu-engine = { path = "../nu-engine", version = "0.88.1" }
nu-glob = { path = "../nu-glob", version = "0.88.1" }
nu-parser = { path = "../nu-parser", version = "0.88.1" }
nu-path = { path = "../nu-path", version = "0.88.1" }
nu-protocol = { path = "../nu-protocol", version = "0.88.1" }
nu-utils = { path = "../nu-utils", version = "0.88.1" }
indexmap = "2.1"
miette = "5.10.0"
[dev-dependencies]
nu-test-support = { path = "../nu-test-support", version = "0.87.0" }
nu-test-support = { path = "../nu-test-support", version = "0.88.1" }
rstest = "0.18.2"


@ -64,12 +64,10 @@ fn arg_glob_opt(
// user wasn't referring to a specific thing in filesystem, try to glob it.
match glob_with_parent(&pattern.item, options, cwd) {
Ok(p) => Ok(p),
Err(pat_err) => {
Err(ShellError::InvalidGlobPattern(
pat_err.msg.into(),
pattern.span, // improve specificity
))
}
Err(pat_err) => Err(ShellError::InvalidGlobPattern {
msg: pat_err.msg.into(),
span: pattern.span,
}),
}
}


@ -90,11 +90,11 @@ pub fn eval_hook(
if let Some(err) = working_set.parse_errors.first() {
report_error(&working_set, err);
return Err(ShellError::UnsupportedConfigValue(
"valid source code".into(),
"source code with syntax errors".into(),
return Err(ShellError::UnsupportedConfigValue {
expected: "valid source code".into(),
value: "source code with syntax errors".into(),
span,
));
});
}
(output, working_set.render(), vars)
@ -164,11 +164,11 @@ pub fn eval_hook(
{
val
} else {
return Err(ShellError::UnsupportedConfigValue(
"boolean output".to_string(),
"other PipelineData variant".to_string(),
other_span,
));
return Err(ShellError::UnsupportedConfigValue {
expected: "boolean output".to_string(),
value: "other PipelineData variant".to_string(),
span: other_span,
});
}
}
Err(err) => {
@ -176,11 +176,11 @@ pub fn eval_hook(
}
}
} else {
return Err(ShellError::UnsupportedConfigValue(
"block".to_string(),
format!("{}", condition.get_type()),
other_span,
));
return Err(ShellError::UnsupportedConfigValue {
expected: "block".to_string(),
value: format!("{}", condition.get_type()),
span: other_span,
});
}
} else {
// always run the hook
@ -222,11 +222,11 @@ pub fn eval_hook(
if let Some(err) = working_set.parse_errors.first() {
report_error(&working_set, err);
return Err(ShellError::UnsupportedConfigValue(
"valid source code".into(),
"source code with syntax errors".into(),
source_span,
));
return Err(ShellError::UnsupportedConfigValue {
expected: "valid source code".into(),
value: "source code with syntax errors".into(),
span: source_span,
});
}
(output, working_set.render(), vars)
@ -277,11 +277,11 @@ pub fn eval_hook(
)?;
}
other => {
return Err(ShellError::UnsupportedConfigValue(
"block or string".to_string(),
format!("{}", other.get_type()),
source_span,
));
return Err(ShellError::UnsupportedConfigValue {
expected: "block or string".to_string(),
value: format!("{}", other.get_type()),
span: source_span,
});
}
}
}
@ -293,11 +293,11 @@ pub fn eval_hook(
output = run_hook_block(engine_state, stack, val.block_id, input, arguments, span)?;
}
other => {
return Err(ShellError::UnsupportedConfigValue(
"string, block, record, or list of commands".into(),
format!("{}", other.get_type()),
other.span(),
));
return Err(ShellError::UnsupportedConfigValue {
expected: "string, block, record, or list of commands".into(),
value: format!("{}", other.get_type()),
span: other.span(),
});
}
}


@ -72,22 +72,22 @@ fn get_editor_commandline(
let params = editor_cmd.collect::<Result<_, ShellError>>()?;
Ok((editor, params))
}
_ => Err(ShellError::GenericError(
"Editor executable is missing".into(),
"Set the first element to an executable".into(),
Some(value.span()),
Some(HELP_MSG.into()),
vec![],
)),
_ => Err(ShellError::GenericError {
error: "Editor executable is missing".into(),
msg: "Set the first element to an executable".into(),
span: Some(value.span()),
help: Some(HELP_MSG.into()),
inner: vec![],
}),
}
}
Value::String { .. } | Value::List { .. } => Err(ShellError::GenericError(
format!("{var_name} should be a non-empty string or list<String>"),
"Specify an executable here".into(),
Some(value.span()),
Some(HELP_MSG.into()),
vec![],
)),
Value::String { .. } | Value::List { .. } => Err(ShellError::GenericError {
error: format!("{var_name} should be a non-empty string or list<String>"),
msg: "Specify an executable here".into(),
span: Some(value.span()),
help: Some(HELP_MSG.into()),
inner: vec![],
}),
x => Err(ShellError::CantConvert {
to_type: "string or list<string>".into(),
from_type: x.get_type().to_string(),
@ -114,13 +114,14 @@ pub fn get_editor(
} else if let Some(value) = env_vars.get("VISUAL") {
get_editor_commandline(value, "$env.VISUAL")
} else {
Err(ShellError::GenericError(
"No editor configured".into(),
Err(ShellError::GenericError {
error: "No editor configured".into(),
msg:
"Please specify one via `$env.config.buffer_editor` or `$env.EDITOR`/`$env.VISUAL`"
.into(),
Some(span),
Some(HELP_MSG.into()),
vec![],
))
span: Some(span),
help: Some(HELP_MSG.into()),
inner: vec![],
})
}
}


@ -5,7 +5,7 @@ edition = "2021"
license = "MIT"
name = "nu-cmd-dataframe"
repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-dataframe"
version = "0.87.0"
version = "0.88.1"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -13,19 +13,22 @@ version = "0.87.0"
bench = false
[dependencies]
nu-engine = { path = "../nu-engine", version = "0.87.0" }
nu-parser = { path = "../nu-parser", version = "0.87.0" }
nu-protocol = { path = "../nu-protocol", version = "0.87.0" }
nu-engine = { path = "../nu-engine", version = "0.88.1" }
nu-parser = { path = "../nu-parser", version = "0.88.1" }
nu-protocol = { path = "../nu-protocol", version = "0.88.1" }
# Potential dependencies for extras
chrono = { version = "0.4", features = ["std", "unstable-locales"], default-features = false }
chrono-tz = "0.8"
fancy-regex = "0.11"
fancy-regex = "0.12"
indexmap = { version = "2.1" }
num = { version = "0.4", optional = true }
serde = { version = "1.0", features = ["derive"] }
sqlparser = { version = "0.36.1", optional = true }
polars-io = { version = "0.33", features = ["avro"], optional = true }
sqlparser = { version = "0.39", optional = true }
polars-io = { version = "0.35", features = ["avro"], optional = true }
polars-arrow = "0.35"
polars-ops = "0.35"
polars-plan = "0.35"
[dependencies.polars]
features = [
@ -59,12 +62,12 @@ features = [
"to_dummies",
]
optional = true
version = "0.33"
version = "0.35"
[features]
dataframe = ["num", "polars", "polars-io", "sqlparser"]
default = []
[dev-dependencies]
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.87.0" }
nu-test-support = { path = "../nu-test-support", version = "0.87.0" }
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.88.1" }
nu-test-support = { path = "../nu-test-support", version = "0.88.1" }


@ -68,25 +68,23 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let new_df = col_string
.get(0)
.ok_or_else(|| {
ShellError::GenericError(
"Empty names list".into(),
"No column names were found".into(),
Some(col_span),
None,
Vec::new(),
)
.first()
.ok_or_else(|| ShellError::GenericError {
error: "Empty names list".into(),
msg: "No column names were found".into(),
span: Some(col_span),
help: None,
inner: vec![],
})
.and_then(|col| {
df.as_ref().drop(&col.item).map_err(|e| {
ShellError::GenericError(
"Error dropping column".into(),
e.to_string(),
Some(col.span),
None,
Vec::new(),
)
df.as_ref()
.drop(&col.item)
.map_err(|e| ShellError::GenericError {
error: "Error dropping column".into(),
msg: e.to_string(),
span: Some(col.span),
help: None,
inner: vec![],
})
})?;
@ -96,14 +94,14 @@ fn command(
.iter()
.skip(1)
.try_fold(new_df, |new_df, col| {
new_df.drop(&col.item).map_err(|e| {
ShellError::GenericError(
"Error dropping column".into(),
e.to_string(),
Some(col.span),
None,
Vec::new(),
)
new_df
.drop(&col.item)
.map_err(|e| ShellError::GenericError {
error: "Error dropping column".into(),
msg: e.to_string(),
span: Some(col.span),
help: None,
inner: vec![],
})
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))

View File

@ -100,14 +100,12 @@ fn command(
df.as_ref()
.unique(subset_slice, keep_strategy, None)
.map_err(|e| {
ShellError::GenericError(
"Error dropping duplicates".into(),
e.to_string(),
Some(col_span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error dropping duplicates".into(),
msg: e.to_string(),
span: Some(col_span),
help: None,
inner: vec![],
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
}


@ -115,14 +115,12 @@ fn command(
df.as_ref()
.drop_nulls(subset_slice)
.map_err(|e| {
ShellError::GenericError(
"Error dropping nulls".into(),
e.to_string(),
Some(col_span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error dropping nulls".into(),
msg: e.to_string(),
span: Some(col_span),
help: None,
inner: vec![],
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
}


@ -88,14 +88,12 @@ fn command(
df.as_ref()
.to_dummies(None, drop_first)
.map_err(|e| {
ShellError::GenericError(
"Error calculating dummies".into(),
e.to_string(),
Some(call.head),
Some("The only allowed column types for dummies are String or Int".into()),
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error calculating dummies".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The only allowed column types for dummies are String or Int".into()),
inner: vec![],
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
}


@ -105,26 +105,22 @@ fn command_eager(
))
} else {
let mask = NuDataFrame::try_from_value(mask_value)?.as_series(mask_span)?;
let mask = mask.bool().map_err(|e| {
ShellError::GenericError(
"Error casting to bool".into(),
e.to_string(),
Some(mask_span),
Some("Perhaps you want to use a series with booleans as mask".into()),
Vec::new(),
)
let mask = mask.bool().map_err(|e| ShellError::GenericError {
error: "Error casting to bool".into(),
msg: e.to_string(),
span: Some(mask_span),
help: Some("Perhaps you want to use a series with booleans as mask".into()),
inner: vec![],
})?;
df.as_ref()
.filter(mask)
.map_err(|e| {
ShellError::GenericError(
"Error filtering dataframe".into(),
e.to_string(),
Some(call.head),
Some("The only allowed column types for dummies are String or Int".into()),
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error filtering dataframe".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The only allowed column types for dummies are String or Int".into()),
inner: vec![],
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
}


@ -70,14 +70,12 @@ fn command(
df.as_ref()
.select(col_string)
.map_err(|e| {
ShellError::GenericError(
"Error selecting columns".into(),
e.to_string(),
Some(col_span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error selecting columns".into(),
msg: e.to_string(),
span: Some(col_span),
help: None,
inner: vec![],
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
}


@ -152,37 +152,33 @@ fn command(
let mut res = df
.as_ref()
.melt(&id_col_string, &val_col_string)
.map_err(|e| {
ShellError::GenericError(
"Error calculating melt".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error calculating melt".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
if let Some(name) = &variable_name {
res.rename("variable", &name.item).map_err(|e| {
ShellError::GenericError(
"Error renaming column".into(),
e.to_string(),
Some(name.span),
None,
Vec::new(),
)
res.rename("variable", &name.item)
.map_err(|e| ShellError::GenericError {
error: "Error renaming column".into(),
msg: e.to_string(),
span: Some(name.span),
help: None,
inner: vec![],
})?;
}
if let Some(name) = &value_name {
res.rename("value", &name.item).map_err(|e| {
ShellError::GenericError(
"Error renaming column".into(),
e.to_string(),
Some(name.span),
None,
Vec::new(),
)
res.rename("value", &name.item)
.map_err(|e| ShellError::GenericError {
error: "Error renaming column".into(),
msg: e.to_string(),
span: Some(name.span),
help: None,
inner: vec![],
})?;
}
@ -198,50 +194,50 @@ fn check_column_datatypes<T: AsRef<str>>(
col_span: Span,
) -> Result<(), ShellError> {
if cols.is_empty() {
return Err(ShellError::GenericError(
"Merge error".into(),
"empty column list".into(),
Some(col_span),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Merge error".into(),
msg: "empty column list".into(),
span: Some(col_span),
help: None,
inner: vec![],
});
}
// Checking if they are same type
if cols.len() > 1 {
for w in cols.windows(2) {
let l_series = df.column(w[0].as_ref()).map_err(|e| {
ShellError::GenericError(
"Error selecting columns".into(),
e.to_string(),
Some(col_span),
None,
Vec::new(),
)
let l_series = df
.column(w[0].as_ref())
.map_err(|e| ShellError::GenericError {
error: "Error selecting columns".into(),
msg: e.to_string(),
span: Some(col_span),
help: None,
inner: vec![],
})?;
let r_series = df.column(w[1].as_ref()).map_err(|e| {
ShellError::GenericError(
"Error selecting columns".into(),
e.to_string(),
Some(col_span),
None,
Vec::new(),
)
let r_series = df
.column(w[1].as_ref())
.map_err(|e| ShellError::GenericError {
error: "Error selecting columns".into(),
msg: e.to_string(),
span: Some(col_span),
help: None,
inner: vec![],
})?;
if l_series.dtype() != r_series.dtype() {
return Err(ShellError::GenericError(
"Merge error".into(),
"found different column types in list".into(),
Some(col_span),
Some(format!(
return Err(ShellError::GenericError {
error: "Merge error".into(),
msg: "found different column types in list".into(),
span: Some(col_span),
help: Some(format!(
"datatypes {} and {} are incompatible",
l_series.dtype(),
r_series.dtype()
)),
Vec::new(),
));
inner: vec![],
});
}
}
}

View File
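The hunks above all apply one mechanical refactor: ShellError::GenericError changed from a positional tuple variant to a struct variant, so each call site moves to named fields (error, msg, span, help, inner). A minimal sketch of the before/after shape, using a simplified stand-in enum and Span type rather than the real nu-protocol definitions:

// Simplified stand-in types; the field names match the diff, everything else is illustrative.
#[derive(Debug, Clone, Copy)]
struct Span { start: usize, end: usize }

#[derive(Debug)]
enum ShellError {
    GenericError {
        error: String,
        msg: String,
        span: Option<Span>,
        help: Option<String>,
        inner: Vec<ShellError>,
    },
}

// Old call style (positional):
//   ShellError::GenericError("Merge error".into(), "empty column list".into(),
//                            Some(span), None, Vec::new())
// New call style (named fields), as used throughout these hunks:
fn merge_error(span: Span) -> ShellError {
    ShellError::GenericError {
        error: "Merge error".into(),
        msg: "empty column list".into(),
        span: Some(span),
        help: None,
        inner: vec![],
    }
}

fn main() {
    println!("{:?}", merge_error(Span { start: 0, end: 1 }));
}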

@ -121,15 +121,15 @@ fn command(
"json" => from_json(engine_state, stack, call),
"jsonl" => from_jsonl(engine_state, stack, call),
"avro" => from_avro(engine_state, stack, call),
_ => Err(ShellError::FileNotFoundCustom(
format!("{msg}. Supported values: csv, tsv, parquet, ipc, arrow, json"),
blamed,
)),
_ => Err(ShellError::FileNotFoundCustom {
msg: format!("{msg}. Supported values: csv, tsv, parquet, ipc, arrow, json"),
span: blamed,
}),
},
None => Err(ShellError::FileNotFoundCustom(
"File without extension".into(),
file.span,
)),
None => Err(ShellError::FileNotFoundCustom {
msg: "File without extension".into(),
span: file.span,
}),
}
.map(|value| PipelineData::Value(value, None))
}
@ -150,17 +150,16 @@ fn from_parquet(
low_memory: false,
cloud_options: None,
use_statistics: false,
hive_partitioning: false,
};
let df: NuLazyFrame = LazyFrame::scan_parquet(file, args)
.map_err(|e| {
ShellError::GenericError(
"Parquet reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Parquet reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -169,14 +168,12 @@ fn from_parquet(
let file: Spanned<PathBuf> = call.req(engine_state, stack, 0)?;
let columns: Option<Vec<String>> = call.get_flag(engine_state, stack, "columns")?;
let r = File::open(&file.item).map_err(|e| {
ShellError::GenericError(
"Error opening file".into(),
e.to_string(),
Some(file.span),
None,
Vec::new(),
)
let r = File::open(&file.item).map_err(|e| ShellError::GenericError {
error: "Error opening file".into(),
msg: e.to_string(),
span: Some(file.span),
help: None,
inner: vec![],
})?;
let reader = ParquetReader::new(r);
@ -187,14 +184,12 @@ fn from_parquet(
let df: NuDataFrame = reader
.finish()
.map_err(|e| {
ShellError::GenericError(
"Parquet reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Parquet reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -210,14 +205,12 @@ fn from_avro(
let file: Spanned<PathBuf> = call.req(engine_state, stack, 0)?;
let columns: Option<Vec<String>> = call.get_flag(engine_state, stack, "columns")?;
let r = File::open(&file.item).map_err(|e| {
ShellError::GenericError(
"Error opening file".into(),
e.to_string(),
Some(file.span),
None,
Vec::new(),
)
let r = File::open(&file.item).map_err(|e| ShellError::GenericError {
error: "Error opening file".into(),
msg: e.to_string(),
span: Some(file.span),
help: None,
inner: vec![],
})?;
let reader = AvroReader::new(r);
@ -228,14 +221,12 @@ fn from_avro(
let df: NuDataFrame = reader
.finish()
.map_err(|e| {
ShellError::GenericError(
"Avro reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Avro reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -258,14 +249,12 @@ fn from_ipc(
};
let df: NuLazyFrame = LazyFrame::scan_ipc(file, args)
.map_err(|e| {
ShellError::GenericError(
"IPC reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "IPC reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -274,14 +263,12 @@ fn from_ipc(
let file: Spanned<PathBuf> = call.req(engine_state, stack, 0)?;
let columns: Option<Vec<String>> = call.get_flag(engine_state, stack, "columns")?;
let r = File::open(&file.item).map_err(|e| {
ShellError::GenericError(
"Error opening file".into(),
e.to_string(),
Some(file.span),
None,
Vec::new(),
)
let r = File::open(&file.item).map_err(|e| ShellError::GenericError {
error: "Error opening file".into(),
msg: e.to_string(),
span: Some(file.span),
help: None,
inner: vec![],
})?;
let reader = IpcReader::new(r);
@ -292,14 +279,12 @@ fn from_ipc(
let df: NuDataFrame = reader
.finish()
.map_err(|e| {
ShellError::GenericError(
"IPC reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "IPC reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -313,14 +298,12 @@ fn from_json(
call: &Call,
) -> Result<Value, ShellError> {
let file: Spanned<PathBuf> = call.req(engine_state, stack, 0)?;
let file = File::open(&file.item).map_err(|e| {
ShellError::GenericError(
"Error opening file".into(),
e.to_string(),
Some(file.span),
None,
Vec::new(),
)
let file = File::open(&file.item).map_err(|e| ShellError::GenericError {
error: "Error opening file".into(),
msg: e.to_string(),
span: Some(file.span),
help: None,
inner: vec![],
})?;
let buf_reader = BufReader::new(file);
@ -328,14 +311,12 @@ fn from_json(
let df: NuDataFrame = reader
.finish()
.map_err(|e| {
ShellError::GenericError(
"Json reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Json reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -349,14 +330,12 @@ fn from_jsonl(
) -> Result<Value, ShellError> {
let infer_schema: Option<usize> = call.get_flag(engine_state, stack, "infer-schema")?;
let file: Spanned<PathBuf> = call.req(engine_state, stack, 0)?;
let file = File::open(&file.item).map_err(|e| {
ShellError::GenericError(
"Error opening file".into(),
e.to_string(),
Some(file.span),
None,
Vec::new(),
)
let file = File::open(&file.item).map_err(|e| ShellError::GenericError {
error: "Error opening file".into(),
msg: e.to_string(),
span: Some(file.span),
help: None,
inner: vec![],
})?;
let buf_reader = BufReader::new(file);
@ -366,14 +345,12 @@ fn from_jsonl(
let df: NuDataFrame = reader
.finish()
.map_err(|e| {
ShellError::GenericError(
"Json lines reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Json lines reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -399,19 +376,19 @@ fn from_csv(
None => csv_reader,
Some(d) => {
if d.item.len() != 1 {
return Err(ShellError::GenericError(
"Incorrect delimiter".into(),
"Delimiter has to be one character".into(),
Some(d.span),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Incorrect delimiter".into(),
msg: "Delimiter has to be one character".into(),
span: Some(d.span),
help: None,
inner: vec![],
});
} else {
let delimiter = match d.item.chars().next() {
Some(d) => d as u8,
None => unreachable!(),
};
csv_reader.with_delimiter(delimiter)
csv_reader.with_separator(delimiter)
}
}
};
@ -430,14 +407,12 @@ fn from_csv(
let df: NuLazyFrame = csv_reader
.finish()
.map_err(|e| {
ShellError::GenericError(
"Parquet reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Parquet reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();
@ -445,14 +420,12 @@ fn from_csv(
} else {
let file: Spanned<PathBuf> = call.req(engine_state, stack, 0)?;
let csv_reader = CsvReader::from_path(&file.item)
.map_err(|e| {
ShellError::GenericError(
"Error creating CSV reader".into(),
e.to_string(),
Some(file.span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error creating CSV reader".into(),
msg: e.to_string(),
span: Some(file.span),
help: None,
inner: vec![],
})?
.with_encoding(CsvEncoding::LossyUtf8);
@ -460,19 +433,19 @@ fn from_csv(
None => csv_reader,
Some(d) => {
if d.item.len() != 1 {
return Err(ShellError::GenericError(
"Incorrect delimiter".into(),
"Delimiter has to be one character".into(),
Some(d.span),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Incorrect delimiter".into(),
msg: "Delimiter has to be one character".into(),
span: Some(d.span),
help: None,
inner: vec![],
});
} else {
let delimiter = match d.item.chars().next() {
Some(d) => d as u8,
None => unreachable!(),
};
csv_reader.with_delimiter(delimiter)
csv_reader.with_separator(delimiter)
}
}
};
@ -496,14 +469,12 @@ fn from_csv(
let df: NuDataFrame = csv_reader
.finish()
.map_err(|e| {
ShellError::GenericError(
"Parquet reader error".into(),
format!("{e:?}"),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Parquet reader error".into(),
msg: format!("{e:?}"),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();

View File
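Beyond the error refactor, these CSV hunks track a polars rename: the builder method with_delimiter becomes with_separator (and, in the writer hunks further down, has_header becomes include_header). A rough sketch of the new spelling, assuming a polars version in line with the one this release pins; builder details may differ:

// Sketch only, not the nushell code. Assumes polars with the csv feature enabled.
use polars::prelude::*;

fn read_semicolon_csv(path: &str) -> PolarsResult<DataFrame> {
    CsvReader::from_path(path)?
        .with_separator(b';')
        .with_encoding(CsvEncoding::LossyUtf8)
        .finish()
}

fn write_headerless_csv(df: &mut DataFrame, path: &str) -> Result<(), Box<dyn std::error::Error>> {
    let mut file = std::fs::File::create(path)?;
    CsvWriter::new(&mut file)
        .include_header(false)
        .with_separator(b';')
        .finish(df)?;
    Ok(())
}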

@ -76,14 +76,14 @@ fn command(
let mut ctx = SQLContext::new();
ctx.register("df", &df.df);
let df_sql = ctx.execute(&sql_query).map_err(|e| {
ShellError::GenericError(
"Dataframe Error".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let df_sql = ctx
.execute(&sql_query)
.map_err(|e| ShellError::GenericError {
error: "Dataframe Error".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let lazy = NuLazyFrame::new(false, df_sql);

View File

@ -130,14 +130,14 @@ fn command_eager(
let new_names = extract_strings(new_names)?;
for (from, to) in columns.iter().zip(new_names.iter()) {
df.as_mut().rename(from, to).map_err(|e| {
ShellError::GenericError(
"Error renaming".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
df.as_mut()
.rename(from, to)
.map_err(|e| ShellError::GenericError {
error: "Error renaming".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
}

View File

@ -4,6 +4,8 @@ use nu_protocol::{
engine::{Command, EngineState, Stack},
Category, Example, PipelineData, ShellError, Signature, Spanned, SyntaxShape, Type,
};
use polars::prelude::NamedFrom;
use polars::series::Series;
use super::super::values::NuDataFrame;
@ -81,7 +83,7 @@ fn command(
call: &Call,
input: PipelineData,
) -> Result<PipelineData, ShellError> {
let rows: Option<Spanned<usize>> = call.get_flag(engine_state, stack, "n-rows")?;
let rows: Option<Spanned<i64>> = call.get_flag(engine_state, stack, "n-rows")?;
let fraction: Option<Spanned<f64>> = call.get_flag(engine_state, stack, "fraction")?;
let seed: Option<u64> = call
.get_flag::<i64>(engine_state, stack, "seed")?
@ -94,42 +96,38 @@ fn command(
match (rows, fraction) {
(Some(rows), None) => df
.as_ref()
.sample_n(rows.item, replace, shuffle, seed)
.map_err(|e| {
ShellError::GenericError(
"Error creating sample".into(),
e.to_string(),
Some(rows.span),
None,
Vec::new(),
)
.sample_n(&Series::new("s", &[rows.item]), replace, shuffle, seed)
.map_err(|e| ShellError::GenericError {
error: "Error creating sample".into(),
msg: e.to_string(),
span: Some(rows.span),
help: None,
inner: vec![],
}),
(None, Some(frac)) => df
.as_ref()
.sample_frac(frac.item, replace, shuffle, seed)
.map_err(|e| {
ShellError::GenericError(
"Error creating sample".into(),
e.to_string(),
Some(frac.span),
None,
Vec::new(),
)
.sample_frac(&Series::new("frac", &[frac.item]), replace, shuffle, seed)
.map_err(|e| ShellError::GenericError {
error: "Error creating sample".into(),
msg: e.to_string(),
span: Some(frac.span),
help: None,
inner: vec![],
}),
(Some(_), Some(_)) => Err(ShellError::GenericError {
error: "Incompatible flags".into(),
msg: "Only one selection criterion allowed".into(),
span: Some(call.head),
help: None,
inner: vec![],
}),
(None, None) => Err(ShellError::GenericError {
error: "No selection".into(),
msg: "No selection criterion was found".into(),
span: Some(call.head),
help: Some("Perhaps you want to use the flag -n or -f".into()),
inner: vec![],
}),
(Some(_), Some(_)) => Err(ShellError::GenericError(
"Incompatible flags".into(),
"Only one selection criterion allowed".into(),
Some(call.head),
None,
Vec::new(),
)),
(None, None) => Err(ShellError::GenericError(
"No selection".into(),
"No selection criterion was found".into(),
Some(call.head),
Some("Perhaps you want to use the flag -n or -f".into()),
Vec::new(),
)),
}
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
}

View File
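The sample hunk above also follows a polars signature change: DataFrame::sample_n and sample_frac now take their count or fraction wrapped in a Series instead of a plain number. A hedged sketch of the new call shape:

// Sketch only; signatures assumed from the hunk above and may vary by polars version.
use polars::prelude::*;

fn sample_five(df: &DataFrame) -> PolarsResult<DataFrame> {
    // five rows, no replacement, no shuffle, no fixed seed
    df.sample_n(&Series::new("n", &[5i64]), false, false, None)
}

fn sample_tenth(df: &DataFrame) -> PolarsResult<DataFrame> {
    df.sample_frac(&Series::new("frac", &[0.1f64]), false, false, None)
}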

@ -2,7 +2,8 @@ use crate::dataframe::eager::sql_expr::parse_sql_expr;
use polars::error::{ErrString, PolarsError};
use polars::prelude::{col, DataFrame, DataType, IntoLazy, LazyFrame};
use sqlparser::ast::{
Expr as SqlExpr, Select, SelectItem, SetExpr, Statement, TableFactor, Value as SQLValue,
Expr as SqlExpr, GroupByExpr, Select, SelectItem, SetExpr, Statement, TableFactor,
Value as SQLValue,
};
use sqlparser::dialect::GenericDialect;
use sqlparser::parser::Parser;
@ -29,7 +30,7 @@ impl SQLContext {
fn execute_select(&self, select_stmt: &Select) -> Result<LazyFrame, PolarsError> {
// Determine involved dataframe
// Implicit join require some more work in query parsers, Explicit join are preferred for now.
let tbl = select_stmt.from.get(0).ok_or_else(|| {
let tbl = select_stmt.from.first().ok_or_else(|| {
PolarsError::ComputeError(ErrString::from("No table found in select statement"))
})?;
let mut alias_map = HashMap::new();
@ -37,7 +38,7 @@ impl SQLContext {
TableFactor::Table { name, alias, .. } => {
let tbl_name = name
.0
.get(0)
.first()
.ok_or_else(|| {
PolarsError::ComputeError(ErrString::from(
"No table found in select statement",
@ -96,8 +97,13 @@ impl SQLContext {
.collect::<Result<Vec<_>, PolarsError>>()?;
// Check for group by
// After projection since there might be number.
let group_by = select_stmt
.group_by
let group_by = match &select_stmt.group_by {
GroupByExpr::All =>
Err(
PolarsError::ComputeError("Group-By Error: Only positive number or expression are supported, not all".into())
)?,
GroupByExpr::Expressions(expressions) => expressions
}
.iter()
.map(
|e|match e {
@ -182,7 +188,7 @@ impl SQLContext {
))
} else {
let ast = ast
.get(0)
.first()
.ok_or_else(|| PolarsError::ComputeError(ErrString::from("No statement found")))?;
Ok(match ast {
Statement::Query(query) => {

View File
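The sql_context hunk reflects a sqlparser change: Select::group_by is now a GroupByExpr enum rather than a bare expression list, so the code matches on it and rejects GROUP BY ALL. A small sketch of that match (error text paraphrased, not the exact message used above):

// Sketch only; assumes a sqlparser version with the two-variant GroupByExpr shown above.
use sqlparser::ast::{Expr, GroupByExpr};

fn group_by_exprs(group_by: &GroupByExpr) -> Result<&[Expr], String> {
    match group_by {
        // GROUP BY ALL carries no explicit expression list to translate
        GroupByExpr::All => Err("GROUP BY ALL is not supported".to_string()),
        GroupByExpr::Expressions(exprs) => Ok(exprs.as_slice()),
    }
}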

@ -2,8 +2,8 @@ use polars::error::PolarsError;
use polars::prelude::{col, lit, DataType, Expr, LiteralValue, PolarsResult as Result, TimeUnit};
use sqlparser::ast::{
BinaryOperator as SQLBinaryOperator, DataType as SQLDataType, Expr as SqlExpr,
Function as SQLFunction, Value as SqlValue, WindowType,
ArrayElemTypeDef, BinaryOperator as SQLBinaryOperator, DataType as SQLDataType,
Expr as SqlExpr, Function as SQLFunction, Value as SqlValue, WindowType,
};
fn map_sql_polars_datatype(data_type: &SQLDataType) -> Result<DataType> {
@ -13,7 +13,7 @@ fn map_sql_polars_datatype(data_type: &SQLDataType) -> Result<DataType> {
| SQLDataType::Uuid
| SQLDataType::Clob(_)
| SQLDataType::Text
| SQLDataType::String => DataType::Utf8,
| SQLDataType::String(_) => DataType::Utf8,
SQLDataType::Float(_) => DataType::Float32,
SQLDataType::Real => DataType::Float32,
SQLDataType::Double => DataType::Float64,
@ -31,9 +31,12 @@ fn map_sql_polars_datatype(data_type: &SQLDataType) -> Result<DataType> {
SQLDataType::Time(_, _) => DataType::Time,
SQLDataType::Timestamp(_, _) => DataType::Datetime(TimeUnit::Microseconds, None),
SQLDataType::Interval => DataType::Duration(TimeUnit::Microseconds),
SQLDataType::Array(inner_type) => match inner_type {
Some(inner_type) => DataType::List(Box::new(map_sql_polars_datatype(inner_type)?)),
None => {
SQLDataType::Array(array_type_def) => match array_type_def {
ArrayElemTypeDef::AngleBracket(inner_type)
| ArrayElemTypeDef::SquareBracket(inner_type) => {
DataType::List(Box::new(map_sql_polars_datatype(inner_type)?))
}
_ => {
return Err(PolarsError::ComputeError(
"SQL Datatype Array(None) was not supported in polars-sql yet!".into(),
))
@ -114,7 +117,11 @@ pub fn parse_sql_expr(expr: &SqlExpr) -> Result<Expr> {
binary_op_(left, right, op)?
}
SqlExpr::Function(sql_function) => parse_sql_function(sql_function)?,
SqlExpr::Cast { expr, data_type } => cast_(parse_sql_expr(expr)?, data_type)?,
SqlExpr::Cast {
expr,
data_type,
format: _,
} => cast_(parse_sql_expr(expr)?, data_type)?,
SqlExpr::Nested(expr) => parse_sql_expr(expr)?,
SqlExpr::Value(value) => literal_expr(value)?,
_ => {

View File

@ -127,23 +127,23 @@ fn command(
if (&0.0..=&1.0).contains(&val) {
Ok(*val)
} else {
Err(ShellError::GenericError(
"Incorrect value for quantile".to_string(),
"value should be between 0 and 1".to_string(),
Some(span),
None,
Vec::new(),
))
Err(ShellError::GenericError {
error: "Incorrect value for quantile".into(),
msg: "value should be between 0 and 1".into(),
span: Some(span),
help: None,
inner: vec![],
})
}
}
Value::Error { error, .. } => Err(*error.clone()),
_ => Err(ShellError::GenericError(
"Incorrect value for quantile".to_string(),
"value should be a float".to_string(),
Some(span),
None,
Vec::new(),
)),
_ => Err(ShellError::GenericError {
error: "Incorrect value for quantile".into(),
msg: "value should be a float".into(),
span: Some(span),
help: None,
inner: vec![],
}),
}
})
.collect::<Result<Vec<f64>, ShellError>>()
@ -250,14 +250,12 @@ fn command(
let res = head.chain(tail).collect::<Vec<Series>>();
DataFrame::new(res)
.map_err(|e| {
ShellError::GenericError(
"Dataframe Error".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Dataframe Error".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
}

View File

@ -97,47 +97,41 @@ fn command(
let index = NuDataFrame::try_from_value(index_value)?.as_series(index_span)?;
let casted = match index.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => {
index.cast(&DataType::UInt32).map_err(|e| {
ShellError::GenericError(
"Error casting index list".into(),
e.to_string(),
Some(index_span),
None,
Vec::new(),
)
})
}
_ => Err(ShellError::GenericError(
"Incorrect type".into(),
"Series with incorrect type".into(),
Some(call.head),
Some("Consider using a Series with type int type".into()),
Vec::new(),
)),
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => index
.cast(&DataType::UInt32)
.map_err(|e| ShellError::GenericError {
error: "Error casting index list".into(),
msg: e.to_string(),
span: Some(index_span),
help: None,
inner: vec![],
}),
_ => Err(ShellError::GenericError {
error: "Incorrect type".into(),
msg: "Series with incorrect type".into(),
span: Some(call.head),
help: Some("Consider using a Series with type int type".into()),
inner: vec![],
}),
}?;
let indices = casted.u32().map_err(|e| {
ShellError::GenericError(
"Error casting index list".into(),
e.to_string(),
Some(index_span),
None,
Vec::new(),
)
let indices = casted.u32().map_err(|e| ShellError::GenericError {
error: "Error casting index list".into(),
msg: e.to_string(),
span: Some(index_span),
help: None,
inner: vec![],
})?;
NuDataFrame::try_from_pipeline(input, call.head).and_then(|df| {
df.as_ref()
.take(indices)
.map_err(|e| {
ShellError::GenericError(
"Error taking values".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error taking values".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})
.map(|df| PipelineData::Value(NuDataFrame::dataframe_into_value(df, call.head), None))
})

View File

@ -58,24 +58,22 @@ fn command(
let mut df = NuDataFrame::try_from_pipeline(input, call.head)?;
let mut file = File::create(&file_name.item).map_err(|e| {
ShellError::GenericError(
"Error with file name".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
let mut file = File::create(&file_name.item).map_err(|e| ShellError::GenericError {
error: "Error with file name".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
IpcWriter::new(&mut file).finish(df.as_mut()).map_err(|e| {
ShellError::GenericError(
"Error saving file".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
IpcWriter::new(&mut file)
.finish(df.as_mut())
.map_err(|e| ShellError::GenericError {
error: "Error saving file".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
let file_value = Value::string(format!("saved {:?}", &file_name.item), file_name.span);

View File

@ -85,27 +85,23 @@ fn command(
let mut df = NuDataFrame::try_from_pipeline(input, call.head)?;
let file = File::create(&file_name.item).map_err(|e| {
ShellError::GenericError(
"Error with file name".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
let file = File::create(&file_name.item).map_err(|e| ShellError::GenericError {
error: "Error with file name".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
AvroWriter::new(file)
.with_compression(compression)
.finish(df.as_mut())
.map_err(|e| {
ShellError::GenericError(
"Error saving file".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error saving file".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
let file_value = Value::string(format!("saved {:?}", &file_name.item), file_name.span);

View File

@ -74,54 +74,52 @@ fn command(
let mut df = NuDataFrame::try_from_pipeline(input, call.head)?;
let mut file = File::create(&file_name.item).map_err(|e| {
ShellError::GenericError(
"Error with file name".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
let mut file = File::create(&file_name.item).map_err(|e| ShellError::GenericError {
error: "Error with file name".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
let writer = CsvWriter::new(&mut file);
let writer = if no_header {
writer.has_header(false)
writer.include_header(false)
} else {
writer.has_header(true)
writer.include_header(true)
};
let mut writer = match delimiter {
None => writer,
Some(d) => {
if d.item.len() != 1 {
return Err(ShellError::GenericError(
"Incorrect delimiter".into(),
"Delimiter has to be one char".into(),
Some(d.span),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Incorrect delimiter".into(),
msg: "Delimiter has to be one char".into(),
span: Some(d.span),
help: None,
inner: vec![],
});
} else {
let delimiter = match d.item.chars().next() {
Some(d) => d as u8,
None => unreachable!(),
};
writer.with_delimiter(delimiter)
writer.with_separator(delimiter)
}
}
};
writer.finish(df.as_mut()).map_err(|e| {
ShellError::GenericError(
"Error writing to file".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
writer
.finish(df.as_mut())
.map_err(|e| ShellError::GenericError {
error: "Error writing to file".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
let file_value = Value::string(format!("saved {:?}", &file_name.item), file_name.span);

View File

@ -58,27 +58,23 @@ fn command(
let mut df = NuDataFrame::try_from_pipeline(input, call.head)?;
let file = File::create(&file_name.item).map_err(|e| {
ShellError::GenericError(
"Error with file name".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
let file = File::create(&file_name.item).map_err(|e| ShellError::GenericError {
error: "Error with file name".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
let buf_writer = BufWriter::new(file);
JsonWriter::new(buf_writer)
.finish(df.as_mut())
.map_err(|e| {
ShellError::GenericError(
"Error saving file".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error saving file".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
let file_value = Value::string(format!("saved {:?}", &file_name.item), file_name.span);

View File

@ -58,24 +58,22 @@ fn command(
let mut df = NuDataFrame::try_from_pipeline(input, call.head)?;
let file = File::create(&file_name.item).map_err(|e| {
ShellError::GenericError(
"Error with file name".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
let file = File::create(&file_name.item).map_err(|e| ShellError::GenericError {
error: "Error with file name".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
ParquetWriter::new(file).finish(df.as_mut()).map_err(|e| {
ShellError::GenericError(
"Error saving file".into(),
e.to_string(),
Some(file_name.span),
None,
Vec::new(),
)
ParquetWriter::new(file)
.finish(df.as_mut())
.map_err(|e| ShellError::GenericError {
error: "Error saving file".into(),
msg: e.to_string(),
span: Some(file_name.span),
help: None,
inner: vec![],
})?;
let file_value = Value::string(format!("saved {:?}", &file_name.item), file_name.span);

View File

@ -151,14 +151,12 @@ fn command_eager(
df.as_mut()
.with_column(series)
.map_err(|e| {
ShellError::GenericError(
"Error adding column to dataframe".into(),
e.to_string(),
Some(column_span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error adding column to dataframe".into(),
msg: e.to_string(),
span: Some(column_span),
help: None,
inner: vec![],
})
.map(|df| {
PipelineData::Value(

View File

@ -319,7 +319,7 @@ macro_rules! lazy_expr_command {
expr_command!(
ExprList,
"dfr implode",
"Aggregates a group to a Series",
"Aggregates a group to a Series.",
vec![Example {
description: "",
example: "",
@ -334,7 +334,7 @@ expr_command!(
expr_command!(
ExprAggGroups,
"dfr agg-groups",
"creates an agg_groups expression",
"Creates an agg_groups expression.",
vec![Example {
description: "",
example: "",
@ -349,7 +349,7 @@ expr_command!(
expr_command!(
ExprCount,
"dfr count",
"creates a count expression",
"Creates a count expression.",
vec![Example {
description: "",
example: "",
@ -364,7 +364,7 @@ expr_command!(
expr_command!(
ExprNot,
"dfr expr-not",
"creates a not expression",
"Creates a not expression.",
vec![Example {
description: "Creates a not expression",
example: "(dfr col a) > 2) | dfr expr-not",
@ -379,7 +379,7 @@ expr_command!(
lazy_expr_command!(
ExprMax,
"dfr max",
"Creates a max expression or aggregates columns to their max value",
"Creates a max expression or aggregates columns to their max value.",
vec![
Example {
description: "Max value from columns in a dataframe",
@ -424,7 +424,7 @@ lazy_expr_command!(
lazy_expr_command!(
ExprMin,
"dfr min",
"Creates a min expression or aggregates columns to their min value",
"Creates a min expression or aggregates columns to their min value.",
vec![
Example {
description: "Min value from columns in a dataframe",
@ -469,7 +469,7 @@ lazy_expr_command!(
lazy_expr_command!(
ExprSum,
"dfr sum",
"Creates a sum expression for an aggregation or aggregates columns to their sum value",
"Creates a sum expression for an aggregation or aggregates columns to their sum value.",
vec![
Example {
description: "Sums all columns in a dataframe",
@ -514,7 +514,7 @@ lazy_expr_command!(
lazy_expr_command!(
ExprMean,
"dfr mean",
"Creates a mean expression for an aggregation or aggregates columns to their mean value",
"Creates a mean expression for an aggregation or aggregates columns to their mean value.",
vec![
Example {
description: "Mean value from columns in a dataframe",
@ -559,7 +559,7 @@ lazy_expr_command!(
expr_command!(
ExprMedian,
"dfr median",
"Creates a median expression for an aggregation",
"Creates a median expression for an aggregation.",
vec![Example {
description: "Median aggregation for a group-by",
example: r#"[[a b]; [one 2] [one 4] [two 1]]
@ -590,7 +590,7 @@ expr_command!(
lazy_expr_command!(
ExprStd,
"dfr std",
"Creates a std expression for an aggregation of std value from columns in a dataframe",
"Creates a std expression for an aggregation of std value from columns in a dataframe.",
vec![
Example {
description: "Std value from columns in a dataframe",
@ -636,7 +636,7 @@ lazy_expr_command!(
lazy_expr_command!(
ExprVar,
"dfr var",
"Create a var expression for an aggregation",
"Create a var expression for an aggregation.",
vec![
Example {
description:

View File

@ -15,7 +15,7 @@ impl Command for ExprOtherwise {
}
fn usage(&self) -> &str {
"completes a when expression."
"Completes a when expression."
}
fn signature(&self) -> Signature {

View File

@ -125,13 +125,13 @@ impl Command for LazyAggregate {
let dtype = schema.get(name.as_str());
if matches!(dtype, Some(DataType::Object(..))) {
return Err(ShellError::GenericError(
"Object type column not supported for aggregation".into(),
format!("Column '{name}' is type Object"),
Some(call.head),
Some("Aggregations cannot be performed on Object type columns. Use dtype command to check column types".into()),
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Object type column not supported for aggregation".into(),
msg: format!("Column '{name}' is type Object"),
span: Some(call.head),
help: Some("Aggregations cannot be performed on Object type columns. Use dtype command to check column types".into()),
inner: vec![],
});
}
}
}
@ -171,7 +171,7 @@ fn get_col_name(expr: &Expr) -> Option<String> {
| Expr::Slice { input: expr, .. }
| Expr::Cast { expr, .. }
| Expr::Sort { expr, .. }
| Expr::Take { expr, .. }
| Expr::Gather { expr, .. }
| Expr::SortBy { expr, .. }
| Expr::Exclude(expr, _)
| Expr::Alias(expr, _)
@ -189,6 +189,7 @@ fn get_col_name(expr: &Expr) -> Option<String> {
| Expr::RenameAlias { .. }
| Expr::Count
| Expr::Nth(_)
| Expr::SubPlan(_, _)
| Expr::Selector(_) => None,
}
}

View File

@ -16,7 +16,7 @@ impl Command for LazyFetch {
}
fn usage(&self) -> &str {
"collects the lazyframe to the selected rows."
"Collects the lazyframe to the selected rows."
}
fn signature(&self) -> Signature {
@ -67,14 +67,12 @@ impl Command for LazyFetch {
let eager: NuDataFrame = lazy
.into_polars()
.fetch(rows as usize)
.map_err(|e| {
ShellError::GenericError(
"Error fetching rows".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error fetching rows".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into();

View File

@ -17,7 +17,7 @@ impl Command for LazyFlatten {
}
fn usage(&self) -> &str {
"An alias for dfr explode"
"An alias for dfr explode."
}
fn signature(&self) -> Signature {

View File

@ -121,7 +121,7 @@ lazy_command!(
"dfr reverse",
"Reverses the LazyFrame",
vec![Example {
description: "Reverses the dataframe",
description: "Reverses the dataframe.",
example: "[[a b]; [6 2] [4 2] [2 2]] | dfr into-df | dfr reverse",
result: Some(
NuDataFrame::try_from_columns(vec![
@ -147,7 +147,7 @@ lazy_command!(
lazy_command!(
LazyCache,
"dfr cache",
"Caches operations in a new LazyFrame",
"Caches operations in a new LazyFrame.",
vec![Example {
description: "Caches the result into a new LazyFrame",
example: "[[a b]; [6 2] [4 2] [2 2]] | dfr into-df | dfr reverse | dfr cache",

View File

@ -16,7 +16,7 @@ impl Command for LazySortBy {
}
fn usage(&self) -> &str {
"sorts a lazy dataframe based on expression(s)."
"Sorts a lazy dataframe based on expression(s)."
}
fn signature(&self) -> Signature {
@ -118,13 +118,13 @@ impl Command for LazySortBy {
.get_flag::<Value>(engine_state, stack, "reverse")?
.expect("already checked and it exists")
.span();
return Err(ShellError::GenericError(
"Incorrect list size".into(),
"Size doesn't match expression list".into(),
Some(span),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Incorrect list size".into(),
msg: "Size doesn't match expression list".into(),
span: Some(span),
help: None,
inner: vec![],
});
} else {
list
}

View File

@ -78,14 +78,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let bool = series.bool().map_err(|_| {
ShellError::GenericError(
"Error converting to bool".into(),
"all-false only works with series of type bool".into(),
Some(call.head),
None,
Vec::new(),
)
let bool = series.bool().map_err(|_| ShellError::GenericError {
error: "Error converting to bool".into(),
msg: "all-false only works with series of type bool".into(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let value = Value::bool(!bool.any(), call.head);

View File

@ -78,14 +78,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let bool = series.bool().map_err(|_| {
ShellError::GenericError(
"Error converting to bool".into(),
"all-false only works with series of type bool".into(),
Some(call.head),
None,
Vec::new(),
)
let bool = series.bool().map_err(|_| ShellError::GenericError {
error: "Error converting to bool".into(),
msg: "all-false only works with series of type bool".into(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let value = Value::bool(bool.all(), call.head);

View File

@ -8,6 +8,7 @@ use nu_protocol::{
Value,
};
use polars::prelude::{DataType, IntoSeries};
use polars_ops::prelude::{cum_max, cum_min, cum_sum};
enum CumType {
Min,
@ -21,13 +22,13 @@ impl CumType {
"min" => Ok(Self::Min),
"max" => Ok(Self::Max),
"sum" => Ok(Self::Sum),
_ => Err(ShellError::GenericError(
"Wrong operation".into(),
"Operation not valid for cumulative".into(),
Some(span),
Some("Allowed values: max, min, sum".into()),
Vec::new(),
)),
_ => Err(ShellError::GenericError {
error: "Wrong operation".into(),
msg: "Operation not valid for cumulative".into(),
span: Some(span),
help: Some("Allowed values: max, min, sum".into()),
inner: vec![],
}),
}
}
@ -108,21 +109,28 @@ fn command(
let series = df.as_series(call.head)?;
if let DataType::Object(_) = series.dtype() {
return Err(ShellError::GenericError(
"Found object series".into(),
"Series of type object cannot be used for cumulative operation".into(),
Some(call.head),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Found object series".into(),
msg: "Series of type object cannot be used for cumulative operation".into(),
span: Some(call.head),
help: None,
inner: vec![],
});
}
let cum_type = CumType::from_str(&cum_type.item, cum_type.span)?;
let mut res = match cum_type {
CumType::Max => series.cummax(reverse),
CumType::Min => series.cummin(reverse),
CumType::Sum => series.cumsum(reverse),
};
CumType::Max => cum_max(&series, reverse),
CumType::Min => cum_min(&series, reverse),
CumType::Sum => cum_sum(&series, reverse),
}
.map_err(|e| ShellError::GenericError {
error: "Error creating cumulative".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let name = format!("{}_{}", series.name(), cum_type.to_str());
res.rename(&name);

View File
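The cumulative hunk swaps the old Series::cummax/cummin/cumsum methods for the free functions cum_max, cum_min and cum_sum from polars-ops, which return a Result instead of a Series. A hedged sketch of the new call shape:

// Sketch only; assumes polars-ops exposes cum_sum as in the import added above.
use polars::prelude::*;
use polars_ops::prelude::cum_sum;

fn running_total(s: &Series) -> PolarsResult<Series> {
    // reverse = false keeps the usual left-to-right accumulation
    cum_sum(s, false)
}

fn main() -> PolarsResult<()> {
    let s = Series::new("a", &[1i64, 2, 3]);
    println!("{}", running_total(&s)?);
    Ok(())
}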

@ -68,14 +68,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = if not_exact {
@ -85,14 +83,12 @@ fn command(
};
let mut res = res
.map_err(|e| {
ShellError::GenericError(
"Error creating datetime".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error creating datetime".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into_series();

View File

@ -132,14 +132,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = if not_exact {
@ -162,14 +160,12 @@ fn command(
};
let mut res = res
.map_err(|e| {
ShellError::GenericError(
"Error creating datetime".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error creating datetime".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.day().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.hour().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.minute().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.month().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.nanosecond().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.ordinal().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.second().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.week().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.weekday().into_series();

View File

@ -65,14 +65,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to datetime type".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to datetime type".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = casted.year().into_series();

View File

@ -67,13 +67,13 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let columns = df.as_ref().get_column_names();
if columns.len() > 1 {
return Err(ShellError::GenericError(
"Error using as series".into(),
"dataframe has more than one column".into(),
Some(call.head),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Error using as series".into(),
msg: "dataframe has more than one column".into(),
span: Some(call.head),
help: None,
inner: vec![],
});
}
match columns.first() {
@ -85,14 +85,12 @@ fn command(
.lazy()
.select(&[expression])
.collect()
.map_err(|err| {
ShellError::GenericError(
"Error creating index column".into(),
err.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|err| ShellError::GenericError {
error: "Error creating index column".into(),
msg: err.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let value = NuDataFrame::dataframe_into_value(res, call.head);

View File

@ -69,14 +69,12 @@ fn command(
let mut res = df
.as_series(call.head)?
.arg_unique()
.map_err(|e| {
ShellError::GenericError(
"Error extracting unique values".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error extracting unique values".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into_series();
res.rename("arg_unique");

View File

@ -86,36 +86,33 @@ fn command(
let indices = NuDataFrame::try_from_value(indices_value)?.as_series(indices_span)?;
let casted = match indices.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => {
indices.as_ref().cast(&DataType::UInt32).map_err(|e| {
ShellError::GenericError(
"Error casting indices".into(),
e.to_string(),
Some(indices_span),
None,
Vec::new(),
)
})
}
_ => Err(ShellError::GenericError(
"Incorrect type".into(),
"Series with incorrect type".into(),
Some(indices_span),
Some("Consider using a Series with type int type".into()),
Vec::new(),
)),
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => indices
.as_ref()
.cast(&DataType::UInt32)
.map_err(|e| ShellError::GenericError {
error: "Error casting indices".into(),
msg: e.to_string(),
span: Some(indices_span),
help: None,
inner: vec![],
}),
_ => Err(ShellError::GenericError {
error: "Incorrect type".into(),
msg: "Series with incorrect type".into(),
span: Some(indices_span),
help: Some("Consider using a Series with type int type".into()),
inner: vec![],
}),
}?;
let indices = casted
.u32()
.map_err(|e| {
ShellError::GenericError(
"Error casting indices".into(),
e.to_string(),
Some(indices_span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error casting indices".into(),
msg: e.to_string(),
span: Some(indices_span),
help: None,
inner: vec![],
})?
.into_iter()
.flatten();
@ -124,74 +121,67 @@ fn command(
let series = df.as_series(call.head)?;
let span = value.span();
let res = match value {
let res =
match value {
Value::Int { val, .. } => {
let chunked = series.i64().map_err(|e| {
ShellError::GenericError(
"Error casting to i64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let chunked = series.i64().map_err(|e| ShellError::GenericError {
error: "Error casting to i64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
let res = chunked.set_at_idx(indices, Some(val)).map_err(|e| {
ShellError::GenericError(
"Error setting value".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
ShellError::GenericError {
error: "Error setting value".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}
})?;
NuDataFrame::try_from_series(vec![res.into_series()], call.head)
}
Value::Float { val, .. } => {
let chunked = series.f64().map_err(|e| {
ShellError::GenericError(
"Error casting to f64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let chunked = series.f64().map_err(|e| ShellError::GenericError {
error: "Error casting to f64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
let res = chunked.set_at_idx(indices, Some(val)).map_err(|e| {
ShellError::GenericError(
"Error setting value".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
ShellError::GenericError {
error: "Error setting value".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}
})?;
NuDataFrame::try_from_series(vec![res.into_series()], call.head)
}
Value::String { val, .. } => {
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
let res = chunked
.set_at_idx(indices, Some(val.as_ref()))
.map_err(|e| {
ShellError::GenericError(
"Error setting value".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error setting value".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
let mut res = res.into_series();
@ -199,16 +189,16 @@ fn command(
NuDataFrame::try_from_series(vec![res.into_series()], call.head)
}
_ => Err(ShellError::GenericError(
"Incorrect value type".into(),
format!(
_ => Err(ShellError::GenericError {
error: "Incorrect value type".into(),
msg: format!(
"this value cannot be set in a series of type '{}'",
series.dtype()
),
Some(span),
None,
Vec::new(),
)),
span: Some(span),
help: None,
inner: vec![],
}),
};
res.map(|df| PipelineData::Value(NuDataFrame::into_value(df, call.head), None))

View File

@ -94,14 +94,12 @@ fn command(
let mut res = df
.as_ref()
.is_duplicated()
.map_err(|e| {
ShellError::GenericError(
"Error finding duplicates".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error finding duplicates".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into_series();

View File

@ -79,14 +79,12 @@ fn command(
let other = other_df.as_series(other_span)?;
let mut res = is_in(&df, &other)
.map_err(|e| {
ShellError::GenericError(
"Error finding in other".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error finding in other".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into_series();

View File

@ -93,14 +93,12 @@ fn command(
let mut res = df
.as_ref()
.is_unique()
.map_err(|e| {
ShellError::GenericError(
"Error finding unique values".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error finding unique values".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into_series();

View File

@ -68,14 +68,12 @@ fn command(
) -> Result<PipelineData, ShellError> {
let series = df.as_series(call.head)?;
let bool = series.bool().map_err(|e| {
ShellError::GenericError(
"Error inverting mask".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let bool = series.bool().map_err(|e| ShellError::GenericError {
error: "Error inverting mask".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = bool.not();

View File

@ -85,22 +85,20 @@ fn command(
let mask = NuDataFrame::try_from_value(mask_value)?.as_series(mask_span)?;
let bool_mask = match mask.dtype() {
DataType::Boolean => mask.bool().map_err(|e| {
ShellError::GenericError(
"Error casting to bool".into(),
e.to_string(),
Some(mask_span),
None,
Vec::new(),
)
DataType::Boolean => mask.bool().map_err(|e| ShellError::GenericError {
error: "Error casting to bool".into(),
msg: e.to_string(),
span: Some(mask_span),
help: None,
inner: vec![],
}),
_ => Err(ShellError::GenericError {
error: "Incorrect type".into(),
msg: "can only use bool series as mask".into(),
span: Some(mask_span),
help: None,
inner: vec![],
}),
_ => Err(ShellError::GenericError(
"Incorrect type".into(),
"can only use bool series as mask".into(),
Some(mask_span),
None,
Vec::new(),
)),
}?;
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
@ -108,70 +106,64 @@ fn command(
let span = value.span();
let res = match value {
Value::Int { val, .. } => {
let chunked = series.i64().map_err(|e| {
ShellError::GenericError(
"Error casting to i64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let chunked = series.i64().map_err(|e| ShellError::GenericError {
error: "Error casting to i64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
let res = chunked.set(bool_mask, Some(val)).map_err(|e| {
ShellError::GenericError(
"Error setting value".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let res = chunked
.set(bool_mask, Some(val))
.map_err(|e| ShellError::GenericError {
error: "Error setting value".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
NuDataFrame::try_from_series(vec![res.into_series()], call.head)
}
Value::Float { val, .. } => {
let chunked = series.f64().map_err(|e| {
ShellError::GenericError(
"Error casting to f64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let chunked = series.f64().map_err(|e| ShellError::GenericError {
error: "Error casting to f64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
let res = chunked.set(bool_mask, Some(val)).map_err(|e| {
ShellError::GenericError(
"Error setting value".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let res = chunked
.set(bool_mask, Some(val))
.map_err(|e| ShellError::GenericError {
error: "Error setting value".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
NuDataFrame::try_from_series(vec![res.into_series()], call.head)
}
Value::String { val, .. } => {
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
let res = chunked.set(bool_mask, Some(val.as_ref())).map_err(|e| {
ShellError::GenericError(
"Error setting value".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
ShellError::GenericError {
error: "Error setting value".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}
})?;
let mut res = res.into_series();
@ -179,16 +171,16 @@ fn command(
NuDataFrame::try_from_series(vec![res.into_series()], call.head)
}
_ => Err(ShellError::GenericError(
"Incorrect value type".into(),
format!(
_ => Err(ShellError::GenericError {
error: "Incorrect value type".into(),
msg: format!(
"this value cannot be set in a series of type '{}'",
series.dtype()
),
Some(span),
None,
Vec::new(),
)),
span: Some(span),
help: None,
inner: vec![],
}),
};
res.map(|df| PipelineData::Value(NuDataFrame::into_value(df, call.head), None))

View File

@ -83,14 +83,15 @@ fn command(
call: &Call,
df: NuDataFrame,
) -> Result<PipelineData, ShellError> {
let res = df.as_series(call.head)?.n_unique().map_err(|e| {
ShellError::GenericError(
"Error counting unique values".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let res = df
.as_series(call.head)?
.n_unique()
.map_err(|e| ShellError::GenericError {
error: "Error counting unique values".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let value = Value::int(res as i64, call.head);

View File

@ -23,13 +23,13 @@ impl RollType {
"max" => Ok(Self::Max),
"sum" => Ok(Self::Sum),
"mean" => Ok(Self::Mean),
_ => Err(ShellError::GenericError(
"Wrong operation".into(),
"Operation not valid for cumulative".into(),
Some(span),
Some("Allowed values: min, max, sum, mean".into()),
Vec::new(),
)),
_ => Err(ShellError::GenericError {
error: "Wrong operation".into(),
msg: "Operation not valid for cumulative".into(),
span: Some(span),
help: Some("Allowed values: min, max, sum, mean".into()),
inner: vec![],
}),
}
}
@ -129,13 +129,13 @@ fn command(
let series = df.as_series(call.head)?;
if let DataType::Object(_) = series.dtype() {
return Err(ShellError::GenericError(
"Found object series".into(),
"Series of type object cannot be used for rolling operation".into(),
Some(call.head),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Found object series".into(),
msg: "Series of type object cannot be used for rolling operation".into(),
span: Some(call.head),
help: None,
inner: vec![],
});
}
let roll_type = RollType::from_str(&roll_type.item, roll_type.span)?;
@ -158,14 +158,12 @@ fn command(
RollType::Mean => series.rolling_mean(rolling_opts),
};
let mut res = res.map_err(|e| {
ShellError::GenericError(
"Error calculating rolling values".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let mut res = res.map_err(|e| ShellError::GenericError {
error: "Error calculating rolling values".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let name = format!("{}_{}", series.name(), roll_type.to_str());

View File

@ -9,6 +9,8 @@ use nu_protocol::{
Category, Example, PipelineData, ShellError, Signature, Span, SyntaxShape, Type, Value,
};
use polars_plan::prelude::lit;
#[derive(Clone)]
pub struct Shift;
@ -98,7 +100,7 @@ fn command_lazy(
let lazy: NuLazyFrame = match fill {
Some(fill) => {
let expr = NuExpression::try_from_value(fill)?.into_polars();
lazy.shift_and_fill(shift, expr).into()
lazy.shift_and_fill(lit(shift), expr).into()
}
None => lazy.shift(shift).into(),
};

View File
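The shift hunk above tracks a polars lazy API change: LazyFrame::shift_and_fill now expects the shift amount as an expression, hence the lit() wrapper added around shift. A rough sketch under that assumption:

// Sketch only; assumes polars with the lazy feature, mirroring the lit(shift) call above.
use polars::prelude::*;

fn shift_by_two(lf: LazyFrame) -> LazyFrame {
    // shift every column down by two rows, filling the vacated rows with 0
    lf.shift_and_fill(lit(2), lit(0))
}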

@ -78,25 +78,21 @@ fn command(
let other_df = NuDataFrame::try_from_value(other)?;
let other_series = other_df.as_series(other_span)?;
let other_chunked = other_series.utf8().map_err(|e| {
ShellError::GenericError(
"The concatenate only with string columns".into(),
e.to_string(),
Some(other_span),
None,
Vec::new(),
)
let other_chunked = other_series.utf8().map_err(|e| ShellError::GenericError {
error: "The concatenate only with string columns".into(),
msg: e.to_string(),
span: Some(other_span),
help: None,
inner: vec![],
})?;
let series = df.as_series(call.head)?;
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"The concatenate only with string columns".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "The concatenate only with string columns".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let mut res = chunked.concat(other_chunked);

View File

@ -74,24 +74,22 @@ fn command(
let pattern: String = call.req(engine_state, stack, 0)?;
let series = df.as_series(call.head)?;
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"The contains command only with string columns".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "The contains command only with string columns".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let res = chunked.contains(&pattern, false).map_err(|e| {
ShellError::GenericError(
"Error searching in series".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let res = chunked
.contains(&pattern, false)
.map_err(|e| ShellError::GenericError {
error: "Error searching in series".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
NuDataFrame::try_from_series(vec![res.into_series()], call.head)

View File

@ -86,24 +86,22 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error conversion to string".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error conversion to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let mut res = chunked.replace(&pattern, &replace).map_err(|e| {
ShellError::GenericError(
"Error finding pattern other".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let mut res = chunked
.replace(&pattern, &replace)
.map_err(|e| ShellError::GenericError {
error: "Error finding pattern other".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
res.rename(series.name());

View File

@ -86,24 +86,23 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error conversion to string".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error conversion to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
let mut res = chunked.replace_all(&pattern, &replace).map_err(|e| {
ShellError::GenericError(
"Error finding pattern other".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
let mut res =
chunked
.replace_all(&pattern, &replace)
.map_err(|e| ShellError::GenericError {
error: "Error finding pattern other".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?;
res.rename(series.name());

View File

@ -63,17 +63,15 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(call.head),
Some("The str-lengths command can only be used with string columns".into()),
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The str-lengths command can only be used with string columns".into()),
inner: vec![],
})?;
let res = chunked.as_ref().str_lengths().into_series();
let res = chunked.as_ref().str_len_bytes().into_series();
NuDataFrame::try_from_series(vec![res], call.head)
.map(|df| PipelineData::Value(NuDataFrame::into_value(df, call.head), None))

View File

@ -75,25 +75,15 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let chunked = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(call.head),
Some("The str-slice command can only be used with string columns".into()),
Vec::new(),
)
let chunked = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The str-slice command can only be used with string columns".into()),
inner: vec![],
})?;
let mut res = chunked.str_slice(start, length).map_err(|e| {
ShellError::GenericError(
"Error slicing series".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
})?;
let mut res = chunked.str_slice(start, length);
res.rename(series.name());
NuDataFrame::try_from_series(vec![res.into_series()], call.head)

View File

@ -72,26 +72,22 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.datetime().map_err(|e| {
ShellError::GenericError(
"Error casting to date".into(),
e.to_string(),
Some(call.head),
Some("The str-slice command can only be used with string columns".into()),
Vec::new(),
)
let casted = series.datetime().map_err(|e| ShellError::GenericError {
error: "Error casting to date".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The str-slice command can only be used with string columns".into()),
inner: vec![],
})?;
let res = casted
.strftime(&fmt)
.map_err(|e| {
ShellError::GenericError(
"Error formatting datetime".into(),
e.to_string(),
Some(call.head),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error formatting datetime".into(),
msg: e.to_string(),
span: Some(call.head),
help: None,
inner: vec![],
})?
.into_series();

View File

@ -67,14 +67,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(call.head),
Some("The str-slice command can only be used with string columns".into()),
Vec::new(),
)
let casted = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The str-slice command can only be used with string columns".into()),
inner: vec![],
})?;
let mut res = casted.to_lowercase();

View File

@ -71,14 +71,12 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let casted = series.utf8().map_err(|e| {
ShellError::GenericError(
"Error casting to string".into(),
e.to_string(),
Some(call.head),
Some("The str-slice command can only be used with string columns".into()),
Vec::new(),
)
let casted = series.utf8().map_err(|e| ShellError::GenericError {
error: "Error casting to string".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The str-slice command can only be used with string columns".into()),
inner: vec![],
})?;
let mut res = casted.to_uppercase();

View File

@ -96,14 +96,12 @@ fn command_eager(
) -> Result<PipelineData, ShellError> {
let series = df.as_series(call.head)?;
let res = series.unique().map_err(|e| {
ShellError::GenericError(
"Error calculating unique values".into(),
e.to_string(),
Some(call.head),
Some("The str-slice command can only be used with string columns".into()),
Vec::new(),
)
let res = series.unique().map_err(|e| ShellError::GenericError {
error: "Error calculating unique values".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The str-slice command can only be used with string columns".into()),
inner: vec![],
})?;
NuDataFrame::try_from_series(vec![res.into_series()], call.head)

View File

@ -70,14 +70,14 @@ fn command(
let df = NuDataFrame::try_from_pipeline(input, call.head)?;
let series = df.as_series(call.head)?;
let res = series.value_counts(false, false).map_err(|e| {
ShellError::GenericError(
"Error calculating value counts values".into(),
e.to_string(),
Some(call.head),
Some("The str-slice command can only be used with string columns".into()),
Vec::new(),
)
let res = series
.value_counts(false, false)
.map_err(|e| ShellError::GenericError {
error: "Error calculating value counts values".into(),
msg: e.to_string(),
span: Some(call.head),
help: Some("The str-slice command can only be used with string columns".into()),
inner: vec![],
})?;
Ok(PipelineData::Value(

View File

@ -68,13 +68,13 @@ pub(super) fn compute_between_series(
res.rename(&name);
NuDataFrame::series_to_value(res, operation_span)
}
Err(e) => Err(ShellError::GenericError(
"Division error".into(),
e.to_string(),
Some(right.span()),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Division error".into(),
msg: e.to_string(),
span: Some(right.span()),
help: None,
inner: vec![],
}),
}
}
Operator::Comparison(Comparison::Equal) => {
@ -119,13 +119,13 @@ pub(super) fn compute_between_series(
res.rename(&name);
NuDataFrame::series_to_value(res, operation_span)
}
_ => Err(ShellError::GenericError(
"Incompatible types".into(),
"unable to cast to boolean".into(),
Some(right.span()),
None,
Vec::new(),
)),
_ => Err(ShellError::GenericError {
error: "Incompatible types".into(),
msg: "unable to cast to boolean".into(),
span: Some(right.span()),
help: None,
inner: vec![],
}),
}
}
_ => Err(ShellError::IncompatibleParametersSingle {
@ -148,13 +148,13 @@ pub(super) fn compute_between_series(
res.rename(&name);
NuDataFrame::series_to_value(res, operation_span)
}
_ => Err(ShellError::GenericError(
"Incompatible types".into(),
"unable to cast to boolean".into(),
Some(right.span()),
None,
Vec::new(),
)),
_ => Err(ShellError::GenericError {
error: "Incompatible types".into(),
msg: "unable to cast to boolean".into(),
span: Some(right.span()),
help: None,
inner: vec![],
}),
}
}
_ => Err(ShellError::IncompatibleParametersSingle {
@ -186,14 +186,12 @@ where
F: Fn(&'s Series, &'s Series) -> Result<ChunkedArray<BooleanType>, PolarsError>,
{
let mut res = f(lhs, rhs)
.map_err(|e| {
ShellError::GenericError(
"Equality error".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Equality error".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?
.into_series();
@ -458,29 +456,29 @@ where
let casted = series.i64();
compute_casted_i64(casted, val, f, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to i64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to i64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
DataType::Int64 => {
let casted = series.i64();
compute_casted_i64(casted, val, f, span)
}
_ => Err(ShellError::GenericError(
"Incorrect type".into(),
format!(
_ => Err(ShellError::GenericError {
error: "Incorrect type".into(),
msg: format!(
"Series of type {} can not be used for operations with an i64 value",
series.dtype()
),
Some(span),
None,
Vec::new(),
)),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -499,13 +497,13 @@ where
let res = res.into_series();
NuDataFrame::series_to_value(res, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to i64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to i64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -522,29 +520,29 @@ where
let casted = series.f64();
compute_casted_f64(casted, val, f, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to f64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to f64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
DataType::Float64 => {
let casted = series.f64();
compute_casted_f64(casted, val, f, span)
}
_ => Err(ShellError::GenericError(
"Incorrect type".into(),
format!(
_ => Err(ShellError::GenericError {
error: "Incorrect type".into(),
msg: format!(
"Series of type {} can not be used for operations with a float value",
series.dtype()
),
Some(span),
None,
Vec::new(),
)),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -563,13 +561,13 @@ where
let res = res.into_series();
NuDataFrame::series_to_value(res, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to f64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to f64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -586,13 +584,13 @@ where
let casted = series.i64();
compare_casted_i64(casted, val, f, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to f64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to f64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
DataType::Date => {
@ -607,29 +605,29 @@ where
.expect("already checked for casting");
compare_casted_i64(Ok(&casted), val, f, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to f64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to f64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
DataType::Int64 => {
let casted = series.i64();
compare_casted_i64(casted, val, f, span)
}
_ => Err(ShellError::GenericError(
"Incorrect type".into(),
format!(
_ => Err(ShellError::GenericError {
error: "Incorrect type".into(),
msg: format!(
"Series of type {} can not be used for operations with an i64 value",
series.dtype()
),
Some(span),
None,
Vec::new(),
)),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -648,13 +646,13 @@ where
let res = res.into_series();
NuDataFrame::series_to_value(res, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to i64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to i64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -671,29 +669,29 @@ where
let casted = series.f64();
compare_casted_f64(casted, val, f, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to i64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to i64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
DataType::Float64 => {
let casted = series.f64();
compare_casted_f64(casted, val, f, span)
}
_ => Err(ShellError::GenericError(
"Incorrect type".into(),
format!(
_ => Err(ShellError::GenericError {
error: "Incorrect type".into(),
msg: format!(
"Series of type {} can not be used for operations with a float value",
series.dtype()
),
Some(span),
None,
Vec::new(),
)),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -712,13 +710,13 @@ where
let res = res.into_series();
NuDataFrame::series_to_value(res, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to f64".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to f64".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -733,22 +731,22 @@ fn contains_series_pat(series: &Series, pat: &str, span: Span) -> Result<Value,
let res = res.into_series();
NuDataFrame::series_to_value(res, span)
}
Err(e) => Err(ShellError::GenericError(
"Error using contains".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Error using contains".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to string".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to string".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -761,12 +759,12 @@ fn add_string_to_series(series: &Series, pat: &str, span: Span) -> Result<Value,
NuDataFrame::series_to_value(res, span)
}
Err(e) => Err(ShellError::GenericError(
"Unable to cast to string".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Unable to cast to string".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}

View File

@ -9,6 +9,7 @@ pub use operations::Axis;
use indexmap::map::IndexMap;
use nu_protocol::{did_you_mean, PipelineData, Record, ShellError, Span, Value};
use polars::prelude::{DataFrame, DataType, IntoLazy, LazyFrame, PolarsObject, Series};
use polars_arrow::util::total_ord::TotalEq;
use serde::{Deserialize, Serialize};
use std::{cmp::Ordering, fmt::Display, hash::Hasher};
@ -61,6 +62,12 @@ impl std::hash::Hash for DataFrameValue {
}
}
impl TotalEq for DataFrameValue {
fn tot_eq(&self, other: &Self) -> bool {
self == other
}
}
impl PolarsObject for DataFrameValue {
fn type_name() -> &'static str {
"object"
@ -124,13 +131,13 @@ impl NuDataFrame {
pub fn series_to_value(series: Series, span: Span) -> Result<Value, ShellError> {
match DataFrame::new(vec![series]) {
Ok(dataframe) => Ok(NuDataFrame::dataframe_into_value(dataframe, span)),
Err(e) => Err(ShellError::GenericError(
"Error creating dataframe".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)),
Err(e) => Err(ShellError::GenericError {
error: "Error creating dataframe".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
}),
}
}
@ -167,14 +174,12 @@ impl NuDataFrame {
}
pub fn try_from_series(columns: Vec<Series>, span: Span) -> Result<Self, ShellError> {
let dataframe = DataFrame::new(columns).map_err(|e| {
ShellError::GenericError(
"Error creating dataframe".into(),
format!("Unable to create DataFrame: {e}"),
Some(span),
None,
Vec::new(),
)
let dataframe = DataFrame::new(columns).map_err(|e| ShellError::GenericError {
error: "Error creating dataframe".into(),
msg: format!("Unable to create DataFrame: {e}"),
span: Some(span),
help: None,
inner: vec![],
})?;
Ok(Self::new(false, dataframe))
@ -288,17 +293,18 @@ impl NuDataFrame {
.collect::<Vec<String>>();
let option = did_you_mean(&possibilities, column).unwrap_or_else(|| column.to_string());
ShellError::DidYouMean(option, span)
ShellError::DidYouMean {
suggestion: option,
span,
}
})?;
let df = DataFrame::new(vec![s.clone()]).map_err(|e| {
ShellError::GenericError(
"Error creating dataframe".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let df = DataFrame::new(vec![s.clone()]).map_err(|e| ShellError::GenericError {
error: "Error creating dataframe".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
Ok(Self {
@ -313,19 +319,19 @@ impl NuDataFrame {
pub fn as_series(&self, span: Span) -> Result<Series, ShellError> {
if !self.is_series() {
return Err(ShellError::GenericError(
"Error using as series".into(),
"dataframe has more than one column".into(),
Some(span),
None,
Vec::new(),
));
return Err(ShellError::GenericError {
error: "Error using as series".into(),
msg: "dataframe has more than one column".into(),
span: Some(span),
help: None,
inner: vec![],
});
}
let series = self
.df
.get_columns()
.get(0)
.first()
.expect("We have already checked that the width is 1");
Ok(series.clone())

View File
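Alongside the error-variant migration, the file above picks up a change from the polars upgrade: the object type stored in a `Series` now also gets a total-equality impl (`polars_arrow::util::total_ord::TotalEq`) next to `PolarsObject`, presumably because the upgraded trait requires it, and the diff satisfies it by delegating to the existing `PartialEq`. A minimal runnable sketch of that delegation follows; `TotalEq` is re-declared locally as a stand-in so the snippet builds without polars.

```rust
// Sketch of the equality delegation added for DataFrameValue. `TotalEq` here
// is a local stand-in for polars_arrow::util::total_ord::TotalEq, assumed to
// have this single-method shape; the wrapper struct is likewise hypothetical.

trait TotalEq {
    fn tot_eq(&self, other: &Self) -> bool;
}

#[derive(Debug, Clone, PartialEq)]
struct DataFrameValue(String);

impl TotalEq for DataFrameValue {
    fn tot_eq(&self, other: &Self) -> bool {
        // Reusing `==` is fine here: the wrapped value has no NaN-like cases
        // where partial and total equality would disagree.
        self == other
    }
}

fn main() {
    let a = DataFrameValue("x".into());
    let b = DataFrameValue("x".into());
    println!("{}", a.tot_eq(&b)); // true
}
```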

@ -24,10 +24,10 @@ impl NuDataFrame {
match right {
Value::CustomValue { val: rhs, .. } => {
let rhs = rhs.as_any().downcast_ref::<NuDataFrame>().ok_or_else(|| {
ShellError::DowncastNotPossible(
"Unable to create dataframe".to_string(),
rhs_span,
)
ShellError::DowncastNotPossible {
msg: "Unable to create dataframe".to_string(),
span: rhs_span,
}
})?;
match (self.is_series(), rhs.is_series()) {
@ -135,14 +135,12 @@ impl NuDataFrame {
})
.collect::<Vec<Series>>();
let df_new = DataFrame::new(new_cols).map_err(|e| {
ShellError::GenericError(
"Error creating dataframe".into(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
let df_new = DataFrame::new(new_cols).map_err(|e| ShellError::GenericError {
error: "Error creating dataframe".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})?;
Ok(NuDataFrame::new(false, df_new))
@ -183,26 +181,24 @@ impl NuDataFrame {
match res {
Ok(s) => Ok(s.clone()),
Err(e) => Err({
ShellError::GenericError(
"Error appending dataframe".into(),
format!("Unable to append: {e}"),
Some(span),
None,
Vec::new(),
)
ShellError::GenericError {
error: "Error appending dataframe".into(),
msg: format!("Unable to append: {e}"),
span: Some(span),
help: None,
inner: vec![],
}
}),
}
})
.collect::<Result<Vec<Series>, ShellError>>()?;
let df_new = DataFrame::new(new_cols).map_err(|e| {
ShellError::GenericError(
"Error appending dataframe".into(),
format!("Unable to append dataframes: {e}"),
Some(span),
None,
Vec::new(),
)
let df_new = DataFrame::new(new_cols).map_err(|e| ShellError::GenericError {
error: "Error appending dataframe".into(),
msg: format!("Unable to append dataframes: {e}"),
span: Some(span),
help: None,
inner: vec![],
})?;
Ok(NuDataFrame::new(false, df_new))

View File

@ -57,7 +57,10 @@ fn compute_with_value(
match right {
Value::CustomValue { val: rhs, .. } => {
let rhs = rhs.as_any().downcast_ref::<NuExpression>().ok_or_else(|| {
ShellError::DowncastNotPossible("Unable to create expression".to_string(), rhs_span)
ShellError::DowncastNotPossible {
msg: "Unable to create expression".into(),
span: rhs_span,
}
})?;
match rhs.as_ref() {

View File
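The hunk above shows the other migrated variant, `ShellError::DowncastNotPossible { msg, span }`, raised when a custom value cannot be downcast to the concrete type a command expects. Below is a minimal runnable sketch of that downcast-or-error pattern; `CustomValue`, `NuExpression`, and `NuDataFrame` are simplified stand-ins for nushell's custom-value machinery, not its real API.

```rust
// Downcast-or-error sketch with DowncastNotPossible in its named-field form.
// All types here are simplified stand-ins, not nushell's actual definitions.

use std::any::Any;

#[derive(Debug, Clone, Copy)]
struct Span {
    start: usize,
    end: usize,
}

#[derive(Debug)]
enum ShellError {
    DowncastNotPossible { msg: String, span: Span },
}

trait CustomValue {
    fn as_any(&self) -> &dyn Any;
}

#[derive(Debug)]
struct NuExpression(String);

impl CustomValue for NuExpression {
    fn as_any(&self) -> &dyn Any {
        self
    }
}

#[derive(Debug)]
struct NuDataFrame;

impl CustomValue for NuDataFrame {
    fn as_any(&self) -> &dyn Any {
        self
    }
}

// Try to view a dyn custom value as the concrete type we expect, producing
// the structured error when the downcast fails.
fn expect_expression(val: &dyn CustomValue, span: Span) -> Result<&NuExpression, ShellError> {
    val.as_any()
        .downcast_ref::<NuExpression>()
        .ok_or_else(|| ShellError::DowncastNotPossible {
            msg: "Unable to create expression".into(),
            span,
        })
}

fn main() {
    let span = Span { start: 0, end: 1 };
    println!("{:?}", expect_expression(&NuExpression("col(a)".into()), span).is_ok());
    println!("{:?}", expect_expression(&NuDataFrame, span));
}
```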

@ -299,7 +299,11 @@ pub fn expr_to_value(expr: &Expr, span: Span) -> Result<Value, ShellError> {
},
span,
)),
Expr::Take { expr, idx } => Ok(Value::record(
Expr::Gather {
expr,
idx,
returns_scalar: _,
} => Ok(Value::record(
record! {
"expr" => expr_to_value(expr.as_ref(), span)?,
"idx" => expr_to_value(idx.as_ref(), span)?,
@ -401,7 +405,6 @@ pub fn expr_to_value(expr: &Expr, span: Span) -> Result<Value, ShellError> {
Expr::Window {
function,
partition_by,
order_by,
options,
} => {
let partition_by: Result<Vec<Value>, ShellError> = partition_by
@ -409,22 +412,21 @@ pub fn expr_to_value(expr: &Expr, span: Span) -> Result<Value, ShellError> {
.map(|e| expr_to_value(e, span))
.collect();
let order_by = order_by
.as_ref()
.map(|e| expr_to_value(e.as_ref(), span))
.transpose()?
.unwrap_or_else(|| Value::nothing(span));
Ok(Value::record(
record! {
"function" => expr_to_value(function, span)?,
"partition_by" => Value::list(partition_by?, span),
"order_by" => order_by,
"options" => Value::string(format!("{options:?}"), span),
},
span,
))
}
Expr::SubPlan(_, _) => Err(ShellError::UnsupportedInput {
msg: "Expressions of type SubPlan are not yet supported".to_string(),
input: format!("Expression is {expr:?}"),
msg_span: span,
input_span: Span::unknown(),
}),
// the parameter polars_plan::dsl::selector::Selector is not publicly exposed.
// I am not sure what we can meaningfully do with this at this time.
Expr::Selector(_) => Err(ShellError::UnsupportedInput {

View File

@ -104,14 +104,12 @@ impl NuLazyFrame {
self.lazy
.expect("No empty lazy for collect")
.collect()
.map_err(|e| {
ShellError::GenericError(
"Error collecting lazy frame".to_string(),
e.to_string(),
Some(span),
None,
Vec::new(),
)
.map_err(|e| ShellError::GenericError {
error: "Error collecting lazy frame".into(),
msg: e.to_string(),
span: Some(span),
help: None,
inner: vec![],
})
.map(|df| NuDataFrame {
df,

View File

@ -11,15 +11,13 @@ pub(crate) fn convert_columns(
) -> Result<(Vec<Spanned<String>>, Span), ShellError> {
// First column span
let mut col_span = columns
.get(0)
.ok_or_else(|| {
ShellError::GenericError(
"Empty column list".into(),
"Empty list found for command".into(),
Some(span),
None,
Vec::new(),
)
.first()
.ok_or_else(|| ShellError::GenericError {
error: "Empty column list".into(),
msg: "Empty list found for command".into(),
span: Some(span),
help: None,
inner: vec![],
})
.map(|v| v.span())?;
@ -32,13 +30,13 @@ pub(crate) fn convert_columns(
col_span = span_join(&[col_span, span]);
Ok(Spanned { item: val, span })
}
_ => Err(ShellError::GenericError(
"Incorrect column format".into(),
"Only string as column name".into(),
Some(span),
None,
Vec::new(),
)),
_ => Err(ShellError::GenericError {
error: "Incorrect column format".into(),
msg: "Only string as column name".into(),
span: Some(span),
help: None,
inner: vec![],
}),
}
})
.collect::<Result<Vec<Spanned<String>>, _>>()?;
@ -54,15 +52,13 @@ pub(crate) fn convert_columns_string(
) -> Result<(Vec<String>, Span), ShellError> {
// First column span
let mut col_span = columns
.get(0)
.ok_or_else(|| {
ShellError::GenericError(
"Empty column list".into(),
"Empty list found for command".into(),
Some(span),
None,
Vec::new(),
)
.first()
.ok_or_else(|| ShellError::GenericError {
error: "Empty column list".into(),
msg: "Empty list found for command".into(),
span: Some(span),
help: None,
inner: vec![],
})
.map(|v| v.span())?;
@ -75,13 +71,13 @@ pub(crate) fn convert_columns_string(
col_span = span_join(&[col_span, span]);
Ok(val)
}
_ => Err(ShellError::GenericError(
"Incorrect column format".into(),
"Only string as column name".into(),
Some(span),
None,
Vec::new(),
)),
_ => Err(ShellError::GenericError {
error: "Incorrect column format".into(),
msg: "Only string as column name".into(),
span: Some(span),
help: None,
inner: vec![],
}),
}
})
.collect::<Result<Vec<String>, _>>()?;

View File
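Besides the error-variant migration, the file above swaps `.get(0)` for `.first()` when taking the first column span. Both return `Option<&T>`; `first()` simply states the intent and matches the idiom clippy nudges toward. A tiny sketch under those assumptions, with a hypothetical stand-in error type rather than nushell's `ShellError`:

```rust
// Hypothetical stand-ins illustrating the `.get(0)` -> `.first()` cleanup.

#[derive(Debug)]
struct EmptyColumnList;

fn first_column_span(spans: &[(usize, usize)]) -> Result<(usize, usize), EmptyColumnList> {
    spans
        .first() // previously: spans.get(0)
        .copied()
        .ok_or(EmptyColumnList)
}

fn main() {
    println!("{:?}", first_column_span(&[(0, 3), (4, 7)]));
    println!("{:?}", first_column_span(&[]));
}
```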

@ -5,7 +5,7 @@ edition = "2021"
license = "MIT"
name = "nu-cmd-extra"
repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-extra"
version = "0.87.0"
version = "0.88.1"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -13,11 +13,11 @@ version = "0.87.0"
bench = false
[dependencies]
nu-engine = { path = "../nu-engine", version = "0.87.0" }
nu-parser = { path = "../nu-parser", version = "0.87.0" }
nu-protocol = { path = "../nu-protocol", version = "0.87.0" }
nu-cmd-base = { path = "../nu-cmd-base", version = "0.87.0" }
nu-utils = { path = "../nu-utils", version = "0.87.0" }
nu-engine = { path = "../nu-engine", version = "0.88.1" }
nu-parser = { path = "../nu-parser", version = "0.88.1" }
nu-protocol = { path = "../nu-protocol", version = "0.88.1" }
nu-cmd-base = { path = "../nu-cmd-base", version = "0.88.1" }
nu-utils = { path = "../nu-utils", version = "0.88.1" }
# Potential dependencies for extras
heck = "0.4.1"
@ -27,8 +27,8 @@ nu-ansi-term = "0.49.0"
fancy-regex = "0.11.0"
rust-embed = "8.0.0"
serde = "1.0.164"
nu-pretty-hex = { version = "0.87.0", path = "../nu-pretty-hex" }
nu-json = { version = "0.87.0", path = "../nu-json" }
nu-pretty-hex = { version = "0.88.1", path = "../nu-pretty-hex" }
nu-json = { version = "0.88.1", path = "../nu-json" }
serde_urlencoded = "0.7.1"
htmlescape = "0.3.1"
@ -37,6 +37,6 @@ extra = ["default"]
default = []
[dev-dependencies]
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.87.0" }
nu-command = { path = "../nu-command", version = "0.87.0" }
nu-test-support = { path = "../nu-test-support", version = "0.87.0" }
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.88.1" }
nu-command = { path = "../nu-command", version = "0.88.1" }
nu-test-support = { path = "../nu-test-support", version = "0.88.1" }

View File

@ -108,15 +108,15 @@ where
match rotate_result {
Ok(val) => Value::int(val, span),
Err(_) => Value::error(
ShellError::GenericError(
"Rotate left result beyond the range of 64 bit signed number".to_string(),
format!(
ShellError::GenericError {
error: "Rotate left result beyond the range of 64 bit signed number".into(),
msg: format!(
"{val} of the specified number of bytes rotate left {bits} bits exceed limit"
),
Some(span),
None,
Vec::new(),
),
span: Some(span),
help: None,
inner: vec![],
},
span,
),
}

View File

@ -112,15 +112,15 @@ where
match rotate_result {
Ok(val) => Value::int(val, span),
Err(_) => Value::error(
ShellError::GenericError(
"Rotate right result beyond the range of 64 bit signed number".to_string(),
format!(
ShellError::GenericError {
error: "Rotate right result beyond the range of 64 bit signed number".into(),
msg: format!(
"{val} of the specified number of bytes rotate right {bits} bits exceed limit"
),
Some(span),
None,
Vec::new(),
),
span: Some(span),
help: None,
inner: vec![],
},
span,
),
}

Some files were not shown because too many files have changed in this diff.