Compare commits

...

73 Commits

Author SHA1 Message Date
f4136aa3f4 Add pipeline span to metadata (#16014)
# Description

This PR makes the span of a pipeline accessible through `metadata`,
meaning it's possible to get the span of a pipeline without collecting
it.

Examples:
```nushell
ls | metadata
# => ╭────────┬────────────────────╮
# => │        │ ╭───────┬────────╮ │
# => │ span   │ │ start │ 170218 │ │
# => │        │ │ end   │ 170220 │ │
# => │        │ ╰───────┴────────╯ │
# => │ source │ ls                 │
# => ╰────────┴────────────────────╯
```

```nushell
ls | metadata access {|meta|
  error make {msg: "error", label: {text: "here", span: $meta.span}}
}
# => Error:   × error
# =>    ╭─[entry #7:1:1]
# =>  1 │ ls | metadata access {|meta|
# =>    · ─┬
# =>    ·  ╰── here
# =>  2 │   error make {msg: "error", label: {text: "here", span: $meta.span}}
# =>    ╰────
```

Here's an example that wouldn't be possible before, since you would have
to use `metadata $in` to get the span, collecting the (infinite) stream:

```nushell
generate {|x=0| {out: 0, next: 0} } | metadata access {|meta|
  # do whatever with stream
  error make {msg: "error", label: {text: "here", span: $meta.span}}
}
# => Error:   × error
# =>    ╭─[entry #16:1:1]
# =>  1 │ generate {|x=0| {out: 0, next: 0} } | metadata access {|meta|
# =>    · ────┬───
# =>    ·     ╰── here
# =>  2 │   # do whatever with stream
# =>    ╰────
```

I haven't done the tests or anything yet since I'm not sure how we feel
about having this as part of the normal metadata, rather than a new
command like `metadata span` or something. We could also have
`metadata access`-like functionality for that, potentially with an
optional closure argument.

# User-Facing Changes

* The span of a pipeline is now available through `metadata` and
`metadata access` without collecting a stream.

# Tests + Formatting

TODO

# After Submitting

N/A
2025-06-30 23:17:43 +02:00
082e8d0de8 update rust version 1.86.0 (#16077)
# Description

This PR updates nushell to use rust version 1.86.0
2025-06-30 15:28:38 +02:00
9da0f41ebb Fix easy clippy lints from latest stable (#16053)
1.88.0 was released today, clippy now lints (machine-applicable)
against:
- format strings with empty braces that could be inlined
  - easy win
- `manual_abs_diff`
- returning a stored result of the last expression.
  - this can be somewhat contentious but only touched a few places
2025-06-29 17:37:17 +02:00
372d576846 fix(std/help): add debug -v to string default parameters (#16063)
# Description
Added `debug -v` in case the default parameter is a string so that it
will not be printed literally:
- Before
```nu
  --char: <string> (default:  )
```
```nu
  --char: <string> (default:
)
```
```nu
  --char: <string> (default: abc)
```
- After
```nu
  --char: <string> (default: " ")
```
```nu
  --char: <string> (default: "\n")
```
```nu
  --char: <string> (default: "abc")
```
Other types like `int` remain unaffected.
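For reference, a minimal sketch of the effect (assuming `-v` is `debug`'s raw-value flag, as used in this PR):
```nushell
# hedged sketch: rendering a string through `debug -v` keeps
# whitespace/escapes visible instead of printing them literally
"\n" | debug -v
# => "\n"
```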
# User-Facing Changes

# Tests + Formatting

# After Submitting
2025-06-29 17:35:25 +02:00
c795f16143 If save-ing with non-existing parent dir, return directory_not_found (#15961) 2025-06-29 17:15:15 +02:00
a4a3c514ba Bump strip-ansi-escapes to deduplicate vte (#16054)
Updating here deduplicates `vte` which is also depended on by `ansitok`
from the `tabled`/zhiburt-cinematic-universe
2025-06-26 23:31:07 +02:00
5478ec44bb to <format>: preserve round float numbers' type (#16016)
- fixes #16011

# Description
`Display` implementation for `f64` omits the decimal part for round
numbers, and by using it we did the same.
This affected:
- conversions to delimited formats: `csv`, `tsv`
- textual formats: `html`, `md`, `text`
- pretty printed `json` (`--raw` was unaffected)
- how single float values are displayed in the REPL

> [!TIP]
> This PR fixes our existing json pretty printing implementation.
> We can likely switch to using serde_json's impl using its
> `PrettyFormatter` which allows arbitrary indent strings.

# User-Facing Changes
- Round trips through `csv`, `tsv`, and `json` preserve the type of
round floats.
- It's always clear whether a number is an integer or a float in the
REPL
  ```nushell
  4 / 2
  # => 2  # before: is this an int or a float?

  4 / 2
  # => 2.0  # after: clearly a float
  ``` 
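A hedged sketch of the round trips this enables:
```nushell
# round floats now keep their float-ness through text formats
2.0 | to json
# => 2.0    (before this change: 2)

[[value]; [4.0]] | to csv
# => value
# => 4.0
```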

# Tests + Formatting
Adjusted tests for the new behavior.

- 🟢 toolkit fmt
- 🟢 toolkit clippy
- 🟢 toolkit test
- 🟢 toolkit test stdlib

# After Submitting
N/A

---------

Co-authored-by: Bahex <17417311+Bahex@users.noreply.github.com>
2025-06-26 15:15:19 -05:00
6902bbe547 Add tango folder to .gitignore (#16052)
This is the folder used by `toolkit.nu` when invoking the tango commands
2025-06-26 21:43:07 +02:00
4e5da8cd91 default config: add note for figuring out datetime escape sequences (#16051)
# Description

Previously there was no hint as to which datetime escape sequences are
supported. I had to look into the source code to figure this out, which
is not great UX.
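For context, these fields take strftime-style escape sequences; a hedged sketch of what the added note points at:
```nushell
# the datetime format options accept strftime-style escape sequences
# (%Y, %m, %d, %H, %M, %S, ...)
$env.config.datetime_format.normal = '%a, %d %b %Y %H:%M:%S %z'
$env.config.datetime_format.table = '%m/%d/%y %I:%M:%S%p'
```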

# User-Facing Changes

# Tests + Formatting


# After Submitting


---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2025-06-26 21:42:06 +02:00
d248451428 Update which from 7.0.3 to 8.0.0 (#16045)

# Description
This simply updates the `which` dependency from 7.0.3 to 8.0.0, with no
code changes. See
https://github.com/harryfei/which-rs/releases/tag/8.0.0 for release
notes.

# User-Facing Changes

N/A

# Tests + Formatting

Tested with `cargo test --workspace` and `cargo run -- -c "use
toolkit.nu; toolkit test stdlib"`.

# After Submitting

N/A
2025-06-25 19:24:31 -05:00
3e758e899f update nushell to latest reedline e4221b9 (#16044)
# Description

This PR just updates nushell to the latest reedline commit e4221b9 so we
can dogfood the most recent changes.
2025-06-25 23:49:26 +02:00
f69a812055 fix(hooks): updating $env.config now correctly updates config state. (#16021)
- fixes #14946

- related #15227
- > [When I run nushell with the hook, the hook itself works as
expected, correctly detects system theme and changes
$env.config.color_config. However, it seems that the change to
$env.config.color_config is not propagated outside the
hook](https://github.com/nushell/nushell/issues/15227#issuecomment-2695287318)
- > [But it suffers from the same problem - modifications made to the
$env.config variable are not visible outside of the hook (which I'm not
sure if is correct behavior or
bug).](https://github.com/nushell/nushell/issues/15227#issuecomment-2695741542)
- > [I also managed to get it working with def --env, but there was one
more issue, I had to change $env.config.hooks.pre_prompt = [{
switch_theme }] into $env.config.hooks.pre_execution = ([ switch_theme
])](https://github.com/nushell/nushell/issues/15227#issuecomment-2704537565)
(having to use a string hook rather than a closure)

- related #11082
  > Might be possible to solve, or at least mitigate, using a similar method

# Description

Recently realized that changes made to `$env.config` in closure hooks
don't take effect whereas string hooks don't have that problem.

After some investigation:
- Hooks' environment was not preserved prior to #5982 >
[2309601](2309601dd4/crates/nu-cli/src/repl.rs (L823-L840))
- `redirect_env` which properly updates the config state was implemented
afterwards in #6355 >
[ea8b0e8](ea8b0e8a1d/crates/nu-engine/src/eval.rs (L174-L190))

Simply using `nu_engine::eval::redirect_env` for the environment update
was enough to fix the issue.

# User-Facing Changes
Hooks can update `$env.config` and the configuration change will work as
expected.
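A hedged sketch of the kind of hook this fixes (the closure body is only illustrative):
```nushell
# with this fix, a change made to $env.config inside a closure hook
# persists after the hook returns
$env.config.hooks.pre_prompt = [{||
    $env.config.table.mode = 'rounded'
}]
```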

# Tests + Formatting

- 🟢 toolkit fmt
- 🟢 toolkit clippy
- 🟢 toolkit test
- 🟢 toolkit test stdlib

# After Submitting
N/A

Co-authored-by: Bahex <17417311+Bahex@users.noreply.github.com>
2025-06-25 23:22:43 +02:00
6fba4b409e Add backtick code formatting to help (#15892)
# Description
Adds formatting for code in backticks in `help` output. If it's possible
to highlight syntax (`nu-highlight` is available and there's no invalid
syntax) then it's highlighted. If the syntax is invalid or not an
internal command, then it's dimmed and italicized, like some of the
output from `std/help`. If `use_ansi_coloring` is `false`, then we leave
the backticks alone. Here's a couple examples:


![image](https://github.com/user-attachments/assets/57eed1dd-b38c-48ef-92c6-3f805392487c)


![image](https://github.com/user-attachments/assets/a0efa0d7-fc11-4702-973b-a0b448c383e0)

(note on this one: usually we can highlight partial commands, like `get`
in the `select` help page which is invalid according to `nu-check` but
is still properly highlighted, however `where` is special cased and just
typing `where` with no row condition is highlighted with the garbage
style so `where` alone isn't highlighted here)

![image](https://github.com/user-attachments/assets/28c110c9-16c4-4890-bc74-6de0f2e6d1b8)

here's the `where` page with `$env.config.use_ansi_coloring = false`:

![image](https://github.com/user-attachments/assets/57871cc8-d509-4719-9dd4-e6f24f9d891c)


Technically, some syntax is valid but isn't really "Nushell code". For
example, the `select` help page has a line that says "Select just the
\`name\` column". If you just type `name` in the REPL, Nushell treats it
as an external command, but for the purposes of highlighting we actually
want this to fall back to the generic dimmed/italic style. This is
accomplished by temporarily setting the `shape_external` and
`shape_externalarg` color config to the generic/fallback style, and then
restoring the color config after highlighting. This is a bit hack-ish
but it seems to work pretty well.


# User-Facing Changes

- `help` command now supports code backtick formatting. Code will be
highlighted using `nu-highlight` if possible, otherwise it will fall
back to a generic format.
- Adds `--reject-garbage` flag to `nu-highlight` which will return an
error on invalid syntax (which would otherwise be highlighted with
`$env.config.color_config.shape_garbage`)
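A hedged sketch of the new flag:
```nushell
# highlights valid source; with --reject-garbage, invalid syntax produces
# an error instead of being highlighted with shape_garbage
'ls | where size > 1kb' | nu-highlight --reject-garbage
```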

# Tests + Formatting

Added tests for the regex. I don't think tests for the actual
highlighting are very necessary since the failure mode is graceful and
it would be difficult to meaningfully test.

# After Submitting

N/A

---------

Co-authored-by: Piepmatz <git+github@cptpiepmatz.de>
2025-06-25 21:26:52 +02:00
cb7ac9199d Stream lazy default output (#15955)
It was brought up in the Discord that `default { open -r foo.txt }`
results in a string instead of streaming output. This changes `default`
such that closures now stream when given simple input.

# Description
If the value isn't expected to be cached, `default` just runs the
closure without caching the value, which allows its output to be
streamed.
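A hedged sketch of the streaming behavior (the file name is illustrative):
```nushell
# the closure's output is no longer collected into a single string,
# so it can be consumed as a stream
null | default { open --raw foo.txt } | lines | first 3
```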

# User-Facing Changes


# Tests + Formatting
👍 

# After Submitting
2025-06-24 19:17:33 -04:00
a6b8e2f95c Update the behaviour how paths are interpreted in start (#16033)
Closes: https://github.com/nushell/nushell/issues/13127

# Description

This PR updates the behaviour of `start` in the following way:
instead of joining the path with the CWD, we expand the path.

Behaviour on `origin/main`:
```
nushell> ls ~/nushell-test
test.txt

nushell> start ~/nushell-test/test.txt
Error:   × Cannot find file or URL: ~/nushell-test/test.txt
...
help: Ensure the path or URL is correct and try again.
```

Behaviour in this PR:
```
nushell> ls ~/nushell-test
test.txt

nushell> start ~/nushell-test/test.txt
<opens text editor>
```

# User-Facing Changes

`start` now treats the input path differently. This is a breaking
change, I believe, although I'm not sure how breaking it would be from
the perspective of the user.

# Tests + Formatting

I've manually tested this. The test suite for `start` is broken. And
even if I fix it, I'm not sure how to test it.
I'll need to override the default command list for `start` in the
sandbox for testing.

# After Submitting

I don't think the documentation needs to be updated.
2025-06-24 17:29:10 -05:00
0b202d55f0 Add only command to std-rfc/iter (#16015)

# Description

This PR adds the `only` command to `std-rfc/iter`, which is a command I
wrote a while ago that I've found so useful that I think it could have a
place in the standard library. It acts similarly to `get 0`, but ensures
that the value actually exists, and there aren't additional values. I
find this most useful when chained with `where`, when you want to be
certain that no additional elements are accidentally selected when you
only mean to get a single element.

I'll copy the help page here for additional explanation:

> Get the only element of a list or table, ensuring it exists and there
> are no extra elements.
> 
> Similar to `first` with no arguments, but errors if there are no items,
> or additional items when there should only be one item. This can help
> avoid issues when more rows than expected match some criteria.
> 
> This command is useful when chained with `where` to ensure that only
> one row meets the given condition.
> 
> If a cell path is provided as an argument, it will be accessed after
> the first element. For example, `only foo` is roughly equivalent to
> `get 0.foo`, with the guarantee that there are no additional elements.
> 
> Note that this command currently collects streams.

> Examples:
>  
> Get the only item in a list, ensuring it exists and there's no
> additional items
> ```nushell
> [5] | only
> # => 5
> ```
> 
> Get the `name` column of the only row in a table
> ```nushell
> [{name: foo, id: 5}] | only name
> # => foo
> ```
> 
> Get the modification time of the file named foo.txt
> ```nushell
> ls | where name == "foo.txt" | only modified
> ```

Here are some additional examples showing the errors:

![image](https://github.com/user-attachments/assets/d5e6f202-db52-42e4-a2ba-fb7c4f1d530a)


![image](https://github.com/user-attachments/assets/b080da2a-7aff-48a9-a523-55c638fdcce3)

Most of the time I chain this with a simple `where`, but here's a couple
other real world examples of how I've used this:

[With `parse`, which outputs a
table](https://git.ikl.sh/132ikl/dotfiles/src/branch/main/.scripts/manage-nu#L53):
```nushell
let commit = $selection | parse "{start}.g{commit}-{end}" | only commit
```

[Ensuring that only one row in a table has a name that ends with a
certain
suffix](https://git.ikl.sh/132ikl/dotfiles/src/branch/main/.scripts/btconnect):
```nushell
$devices | where ($chosen_name ends-with $it.name) | only
```


Unfortunately to get these nice errors I had to collect the stream (and
I think the errors are more useful for this). This should be able to be
mitigated with (something like) #16014.


Putting this in `std/iter` might be pushing it, but it seems *just*
close enough that I can't really justify putting it in a different/new
module.

# User-Facing Changes
* Adds the `only` command to `std-rfc/iter`, which can be used to ensure
that a table or list only has a single element.

# Tests + Formatting

Added a few tests for `only` including error cases

# After Submitting
N/A

---------

Co-authored-by: Bahex <Bahex@users.noreply.github.com>
2025-06-23 16:29:58 -05:00
e88a6bff60 polars 0.49 upgrade (#16031)
# Description
Polars 0.49 upgrade

Co-authored-by: Jack Wright <jack.wright@nike.com>
2025-06-23 16:17:39 -05:00
a234e6ff51 feat(std/help): add is_const information (#16032)
# Description

I wanted to know if `version` is a const command and thought that it
would be in the "This command" section but it wasn't, so I added it.
```
→ help version
Display Nu version, and its build configuration.

Category: core

This command:
 Creates scope         | 
 Is built-in           | 
 Is const              | 
 Is a subcommand       | 
 Is a part of a plugin | 
 Is a custom command   | 
 Is a keyword          | 
```
2025-06-23 23:22:58 +03:00
ae0cf8780d fix(random dice): gracefully handle --sides 0 using NonZeroUsize (#16001) 2025-06-23 14:47:50 +02:00
680a2fa2aa Add loongarch64-unknown-linux-musl build target (#16020) 2025-06-23 06:22:25 +08:00
70277cc2ba fix(std/help): collect windows --help output for gui programs (#16019)
# Description
Adding to #15962, I have realized that there are Windows GUI programs
like `prismlauncher` or `firefox` that do accept the `--help` flag but
won't output on the terminal unless `collect`ed, so now it collects the
output on Windows.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2025-06-21 20:54:31 -05:00
574106bc03 allow update cells to work on single records (#16018)
# Description
The `update cells` command used to "work" on records by converting them
into single-row tables in the past, but strengthened input type checking
made it so that this no longer worked. This commit introduces correct
record -> record functionality.

# User-Facing Changes
Users can now pipe records into `update cells`. An example inspired by a
conversation in the Discord:
```nushell
> version | update cells { split row ', ' } -c [features, installed_plugins]
╭────────────────────┬──────────────────────────────────────────╮
│ version            │ 0.105.2                                  │
│ major              │ 0                                        │
│ minor              │ 105                                      │
│ patch              │ 2                                        │
│ branch             │ update-cells-record                      │
│ commit_hash        │ 4f7e9aac62                               │
│ build_os           │ macos-x86_64                             │
│ build_target       │ x86_64-apple-darwin                      │
│ rust_version       │ rustc 1.85.1 (4eb161250 2025-03-15)      │
│ rust_channel       │ 1.85.1-x86_64-apple-darwin               │
│ cargo_version      │ cargo 1.85.1 (d73d2caf9 2024-12-31)      │
│ build_time         │ 2025-06-21 12:02:06 -04:00               │
│ build_rust_channel │ debug                                    │
│ allocator          │ standard                                 │
│                    │ ╭───┬───────────────╮                    │
│ features           │ │ 0 │ default       │                    │
│                    │ │ 1 │ plugin        │                    │
│                    │ │ 2 │ rustls-tls    │                    │
│                    │ │ 3 │ sqlite        │                    │
│                    │ │ 4 │ trash-support │                    │
│                    │ ╰───┴───────────────╯                    │
│                    │ ╭───┬─────────────────╮                  │
│ installed_plugins  │ │ 0 │ formats 0.104.0 │                  │
│                    │ │ 1 │ polars 0.104.0  │                  │
│                    │ │ 2 │ query 0.104.0   │                  │
│                    │ │ 3 │ todotxt 0.3.0   │                  │
│                    │ ╰───┴─────────────────╯                  │
╰────────────────────┴──────────────────────────────────────────╯
```

# Tests + Formatting
👍. Let me know if more tests besides the new example are needed.

# After Submitting
2025-06-21 20:53:45 -05:00
2a8364d259 drop nth command supports spreadable arguments (#15897)
##  Improve `drop nth` command to support spreadable arguments

### Summary

This PR updates the `drop nth` command to support **spreadable
arguments** in a way consistent with other commands like `which`,
enabling:

```nu
[1 2 3 4 5] | drop nth 0 2 4
```

### What's Changed

* **Previously**: only a single index or a single range was accepted as
the first argument, with rest arguments ignored for ranges.

* **Now**: the command accepts any combination of:

  * Integers: to drop individual rows
  * Ranges: to drop slices of rows
  * Unbounded ranges: like `3..`, to drop from index onward

Example:

```nu
[one two three four five six] | drop nth 0 2 4..5
# drops "one", "three", "five", and "six"
```
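A hedged sketch of the unbounded-range case:
```nu
# an unbounded range drops everything from the given index onward
[one two three four five six] | drop nth 3..
# keeps only "one", "two", and "three"
```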

### Test 

Manual Test:

![nu-dron_n](https://github.com/user-attachments/assets/02f3988c-ac02-4245-967c-16a9604be406)


### Notes

As per feedback:

* We **only collect the list of indices** to drop, not the input stream.
* Unbounded ranges are handled by terminating the stream early.

Let me know if you'd like further changes.

---------

Co-authored-by: Kumar Ujjawal <kumar.ujjawal@greenpista.com>
Co-authored-by: Kumar Ujjawal <kumarujjawal@Kumars-MacBook-Air.local>
2025-06-21 15:57:14 -04:00
760c9ef2e9 fix(completion): invalid prefix for external path argument with spaces (#15998)
Fixes the default behavior of #15790 

# Description

As for the mentioned carapace version: `cat ~"/Downloads/Obsidian
Vault/"`, the problem lies in the unexpanded home directory `~`. Either
we encourage users to manually expand that in
`$env.config.completions.external.completer` or open an issue on the
carapace project.

# User-Facing Changes

bug fix

# Tests + Formatting

Adjusted

# After Submitting
2025-06-20 21:33:01 -04:00
c3079a14d9 feat(table): add 'double' table mode (#16013)
# Description

Add 'double' table mode, which is similar to `compact_double` but with
left and right border lines. This is similar to how there exist both
`single` and `compact`, but there is no `double` to complement
`compact_double`. Printing `[ { a: 1, b: 11 }, { a: 2, b: 12 } ]` looks
like this:

```
╔═══╦═══╦════╗
║ # ║ a ║ b  ║
╠═══╬═══╬════╣
║ 0 ║ 1 ║ 11 ║
║ 1 ║ 2 ║ 12 ║
╚═══╩═══╩════╝
```

The implementation is mostly a one-to-one of #15672 and #15681.

# User-Facing Changes

New value `double` to set as `$env.config.table.mode`.
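To try it out (a minimal sketch):
```nushell
$env.config.table.mode = 'double'
[ { a: 1, b: 11 }, { a: 2, b: 12 } ] | table
```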

# Tests + Formatting

Tests are added following the example of adding 'single' mode.

# After Submitting
2025-06-20 21:09:55 +02:00
4f7e9aac62 fix LS_COLORS fi=0 coloring (#16012)
# Description

fixes #16010

When `$env.LS_COLORS = 'fi=0'` and `$env.config.color_config.string =
'red'` were set, regular files without file extensions would be colored
red. Now they're colored based on the LS_COLORS definition which, in
this case, means use default colors.

This is done by checking if a style was applied from LS_COLORS; if
none was applied, we create a default nu_ansi_term style with
`Color::Default` for foreground and background.
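A hedged repro of the scenario from the linked issue:
```nushell
# with fi=0, extension-less regular files now fall back to the terminal's
# default color instead of the configured string color
$env.LS_COLORS = 'fi=0'
$env.config.color_config.string = 'red'
ls
```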

### Before

![image](https://github.com/user-attachments/assets/ff245ee9-3299-4362-9df7-95613e8972ed)

### After

![image](https://github.com/user-attachments/assets/7c3f1178-6e6b-446d-b88c-1a5b0747345d)



# User-Facing Changes

# Tests + Formatting

# After Submitting

---------

Co-authored-by: Bahex <Bahex@users.noreply.github.com>
2025-06-20 08:09:59 -05:00
7ee8aa78cc perf: better scalability of get_columns (#15780)
# Description

Use hashset for existence checking.
Still needs a vector collection to keep the column order for tables.

# User-Facing Changes

Should be None
2025-06-20 05:07:23 -05:00
d9d022733f feat(std/help): Add --help for external-commands (#15962)
# Description
I have just discovered the `std/help` command and that it can use `man`
or other programs for externals. Coming from Windows, I don't have `man`,
so what I want is just to run `external_program --help` in most cases.
This PR adds that option: if you set `$env.NU_HELPER = "--help"`, it
will run the command you passed with `--help` added as the last
argument.
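A hedged sketch of the new option (assuming `std/help` provides the `help` command in use):
```nushell
use std/help
$env.NU_HELPER = "--help"
help git   # for an external command, this now runs `git --help`
```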


![image](https://github.com/user-attachments/assets/60d25dda-718b-4cb5-b540-808de000b221)

# User-Facing Changes
None

# Tests + Formatting


# After Submitting
2025-06-20 05:06:27 -05:00
1d032ce80c Support namespaces in query xml (#16008)
Refs #15992
Refs #14457 

# Description

This PR introduces a new switch for `query xml`, `--namespaces`,
and thus allows people to use namespace prefixes in the XPath query
to query namespaced XML.

Example:
```nushell
r#'
   <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
      <rdf:Description rdf:about=""
            xmlns:dc="http://purl.org/dc/elements/1.1/">
         <dc:title>Black-breasted buzzard_AEB_IMG_7158</dc:title>
      </rdf:Description>
   </rdf:RDF>
'# | query xml --namespaces {dublincore: "http://purl.org/dc/elements/1.1/"} "//dublincore:title/text()"
```

# User-Facing Changes

New switch added to `query xml`: `query xml --namespaces {....}`

# Tests + Formatting

Pass.

# After Submitting

IIRC the commands docs on the website are automatically generated, so
nothing to do here.
2025-06-19 17:58:26 -05:00
975a89269e respect color_config.header color for record key (#16006)
# Description

This PR fixes an oversight where the record key was not being
colored with the color_config.header color when used with the `table`
command in some circumstances. The color was respected with `table -e`
but not with plain `table`.

### Before

![image](https://github.com/user-attachments/assets/a41e609f-9b3a-415b-af90-037e6ee47318)

### After

![image](https://github.com/user-attachments/assets/c3afb293-ebb3-4cb3-8ee6-4f7e2e96723b)


# User-Facing Changes

# Tests + Formatting

# After Submitting
2025-06-19 11:22:18 -05:00
db5b6c790f Fix: missing installed_plugins in version (#16004)

- related #15972

# Description

In #15972 I was very eager to remove forwarded features from `nu` to
`nu-cmd-lang`. By accident I also removed `nu-cmd-lang/plugin`. This
removed `installed_plugins` from `version`. Adding the feature back
makes it work again.
2025-06-19 07:48:20 -05:00
2bed202b82 Add backtrack named flag to parse (issue #15997) (#16000)

# Description

Addresses #15997

Adds a `--backtrack` or `-b` named flag to the `parse` command. Allows a
user to specify a max backtrack limit for fancy-regex other than the
default 1,000,000 limit.

Uses a RegexBuilder to add the manual config.
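A hedged sketch of the flag:
```nushell
# raise fancy-regex's backtrack limit above the default 1,000,000
"abc-123" | parse --backtrack 1500000 "{name}-{count}"
```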

# User-Facing Changes

Adds a new named flag `backtrack` to the `parse` command. The flag is
optional and defaults to 1,000,000.

# Tests + Formatting

Added an example test to the parse command using `--backtrack 1500000`.
2025-06-19 06:42:30 -05:00
8a0f2ca9f9 Bump calamine to 0.28 (#16003) 2025-06-19 13:37:36 +02:00
24ab294cda Use CARGO_CFG_FEATURE to get feature list in version (#15972) 2025-06-19 12:58:37 +02:00
bfa95bbd24 Clearer help section for command attributes (#15999)
Refs
https://discord.com/channels/601130461678272522/615329862395101194/1385021428314800148

# Description

Clearer command attribute section (`This command:`).


![image](https://github.com/user-attachments/assets/7f26c015-1f00-4a86-a334-c87f7756ee82)


# User-Facing Changes

This is a cosmetic change to how `std/help` shows the command
attributes.

# Tests + Formatting

Pass.
2025-06-18 20:54:50 -05:00
3f700f03ad Generalize nu_protocol::format_shell_error (#15996) 2025-06-18 22:16:01 +02:00
f0e90a3733 Move nu_command::platform::ansi to nu_command::strings::ansi (#15995) 2025-06-18 21:51:16 +02:00
cde8a629c5 Restrict config.show_banner to valid options (#15985) 2025-06-18 10:49:40 +02:00
70aa7ad993 Disallow clippy::used_underscore_binding lint (#15988) 2025-06-18 10:19:57 +02:00
29b3512494 build(deps): bump shadow-rs from 1.1.1 to 1.2.0 (#15989)
Bumps [shadow-rs](https://github.com/baoyachi/shadow-rs) from 1.1.1 to
1.2.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/baoyachi/shadow-rs/releases">shadow-rs's
releases</a>.</em></p>
<blockquote>
<h2>v1.2.0</h2>
<h2>What's Changed</h2>
<ul>
<li>add cargo_metadata crate unit test by <a
href="https://github.com/baoyachi"><code>@​baoyachi</code></a> in <a
href="https://redirect.github.com/baoyachi/shadow-rs/pull/231">baoyachi/shadow-rs#231</a></li>
<li>Update cargo_metadata requirement from 0.19.1 to 0.20.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/baoyachi/shadow-rs/pull/229">baoyachi/shadow-rs#229</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/baoyachi/shadow-rs/compare/v1.1.1...v1.2.0">https://github.com/baoyachi/shadow-rs/compare/v1.1.1...v1.2.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f0d180ac92"><code>f0d180a</code></a>
Update Cargo.toml</li>
<li><a
href="d106a172ad"><code>d106a17</code></a>
Merge pull request <a
href="https://redirect.github.com/baoyachi/shadow-rs/issues/229">#229</a>
from baoyachi/dependabot/cargo/cargo_metadata-0.20.0</li>
<li><a
href="7861af1dd0"><code>7861af1</code></a>
Merge branch 'master' into dependabot/cargo/cargo_metadata-0.20.0</li>
<li><a
href="ab73c01cd1"><code>ab73c01</code></a>
Merge pull request <a
href="https://redirect.github.com/baoyachi/shadow-rs/issues/231">#231</a>
from baoyachi/cargo_metadata</li>
<li><a
href="ff1a1dcf27"><code>ff1a1dc</code></a>
fix cargo clippy check</li>
<li><a
href="f59bceaf92"><code>f59bcea</code></a>
add cargo_metadata crate unit test</li>
<li><a
href="5c5b556400"><code>5c5b556</code></a>
Update cargo_metadata requirement from 0.19.1 to 0.20.0</li>
<li>See full diff in <a
href="https://github.com/baoyachi/shadow-rs/compare/v1.1.1...v1.2.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=shadow-rs&package-manager=cargo&previous-version=1.1.1&new-version=1.2.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-18 11:34:35 +08:00
d961ea19cc Add automatic reminder for doc_config.nu (#15984)
Inspired by https://github.com/nushell/nushell/pull/15979, this adds a
small GitHub Actions bot that detects when you make a change to the
`nu-protocol` bits of the config and reminds you to consider making a
change to the Nushell version in `doc_config.nu` as well.
2025-06-16 23:51:15 +02:00
3db9c81958 Search nested structures recursively in find command (#15850)
# Description

Instead of converting nested structures into strings and
pattern-matching the strings, the `find` command will recursively search
the nested structures for matches.

- fixes #15618 
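A hedged example of the new behavior:
```nushell
# matches inside nested records and lists are found (and highlighted)
# directly, rather than via their string representation
[{name: foo, tags: [alpha beta]} {name: bar, tags: [gamma]}] | find beta
```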

# User-Facing Changes

Text in nested structures will now be highlighted as well.

Error values will always be passed on instead of being tested against the
search term.

There will be slight changes in match behavior, such as characters that
are part of the string representations of data structures no longer
matching all nested data structures.
2025-06-16 15:29:41 -05:00
55240d98a5 Update config nu --doc to represent OSC 7 and 9;9 better (#15979)
- fixes #15975

# Description

This changes the `config nu --doc` output for OSC 7 and 9;9 to represent
better what happens on Windows machines.

This is the current behavior internally:

5be8717fe8/crates/nu-protocol/src/config/shell_integration.rs (L18-L27)

And with this PR the `config nu --doc` better reflects that behavior,
thanks to @fdncred for that idea.

# User-Facing Changes

None

# Tests + Formatting


- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting


---------

Co-authored-by: Bahex <Bahex@users.noreply.github.com>
2025-06-16 21:42:07 +02:00
fda181d566 Adjust std-rfc/clip deprecation window (#15981)
Follow-up to #15877. That PR was created before 0.105, but merged after
it was released. This PR adjusts the deprecation window from
0.105.0-0.107.0 to 0.106.0-0.108.0
2025-06-16 21:40:37 +02:00
2e484156e0 Use internal find.rs code for help --find (#15982)
# Description

Currently, `help --find` uses its own code for looking for the keyword
in a string and highlighting it. This code duplicates a lot of the logic
found in the code of `find`.

This commit re-uses the code of `find` in `help` commands instead.

# User-Facing Changes

This should not affect the behavior of `help`.
2025-06-16 21:29:32 +02:00
52604f8b00 fix(std/log): Don't assume env variables are set (#15980)
# Description
Commands in `std/log` assume the `export-env` has been run and the
relevant environment variables are set.
However, when modules/libraries import `std/log` without defining their
own `export-env` block to run `std/log`'s, logging commands will fail at
runtime.

While it's on the author of the modules to include `export-env { use
std/log [] }` in their modules, this is a very simple issue to solve and
would make the user experience smoother.
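For reference, the workaround this removes the need for looked roughly like this (a hedged sketch; `my-task` is illustrative):
```nushell
# a module previously had to re-export std/log's environment itself
export-env { use std/log [] }

export def my-task [] {
    use std/log
    log info "running my-task"
}
```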

# User-Facing Changes
`std/log` commands work without problems when their env vars are not set.

---------

Co-authored-by: Bahex <17417311+Bahex@users.noreply.github.com>
Co-authored-by: 132ikl <132@ikl.sh>
2025-06-16 12:31:17 -04:00
2fed1f5967 Update Nu for release and nightly workflow (#15969)

# Description

1. Upgrade Nushell to 0.105.1 for release and nightly workflow
2. Use `hustcer/setup-nu` Action for `windows-11-arm` runner to simplify
the workflow

# User-Facing Changes

None

# Tests + Formatting

Looks fine here:
https://github.com/hustcer/nushell/actions/runs/15657383788/job/44110173668#step:7:1357
2025-06-15 22:25:18 +08:00
5be8717fe8 Add full feature as an alternative to --all-features (#15971)
- closes #15967 

# Description


In 0.105 we introduced the feature `rustls-tls`, which is enabled by
default and uses `rustls` instead of `openssl` on Linux machines. Since
`native-tls` and `rustls-tls` cannot both be enabled at the same time,
this broke the `--all-features` flag. To provide an easy alternative, I
introduced the `full` feature here.

# User-Facing Changes

Instead of `cargo install nu --all-features`, you now can do `cargo
install nu --features full`.

# Tests + Formatting


No new tests, this is just a feature collection.
2025-06-15 13:49:58 +02:00
091d14f085 Fix table --expand case with wrapping of emoji (#15948)
close #15940
2025-06-14 18:39:39 -05:00
4c19242c0d feat(gstat): add state entry (like "Clean", "Merge", "Rebase", etc.) (#15965)
# Description

This PR adds a new `state` key to the output of `gstat` that shows the
current repo state, like "Clean", "Merge", "Rebase", etc. The full
list of possible values can be seen
[here](https://docs.rs/git2/latest/git2/enum.RepositoryState.html).

This information is somewhat useful when shown in prompt. Not often
needed, but sometimes really useful.
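A hedged sketch of reading the new key:
```nushell
# inside a clean repository
gstat | get state
# => Clean
```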

# User-Facing Changes

New key added to `gstat` output. I don't think it should cause issues to
`gstat` users.

# Tests + Formatting

I couldn't find any tests for `nu_plugin_gstat`.

# After Submitting

I couldn't find any documentation about the output of `gstat`, so I
don't think there is anything to be done here either.
2025-06-14 07:50:29 -05:00
3df0177ba5 feat: use get request by default, post if payload (#15862)
Hello, this PR resolves the second request of the issue
https://github.com/nushell/nushell/issues/10957, which involves using a
default verb based on the request. If a URL is provided, the command
will default to GET, and if data is provided, it will default to POST.
This means that the following pairs of commands are equivalent:

```
http --content-type application/json http://localhost:8000 {a:1}
http post --content-type application/json http://localhost:8000 {a:1}
```
```
http http://localhost:8000 "username"
http post http://localhost:8000 "username"
```
```
http http://localhost:8000
http get http://localhost:8000
```

The `http` command now accepts all flags of the `post` and `get`
commands. It will still display the help message if no subcommand is
provided, and the description has been updated accordingly. The logic in
the `http` command is minimal to delegate error management
responsibilities to the specific `run_get` and `run_post` functions.
2025-06-14 15:22:37 +08:00
f7888fce83 fix stor insert/delete collision (#15838)
# Description

Based on some testing in
[Discord](https://discord.com/channels/601130461678272522/1349836000804995196/1353138803640111135)
we were able to find that `insert` and `delete` happening at the same
time caused problems in the `stor` command. So, I added `conn.is_busy()`
with a sleep to try and avoid that problem.


![image](https://github.com/user-attachments/assets/e01bccab-0aaa-40ab-b0bf-25e3c72aa037)

/cc @NotTheDr01ds @132ikl 

# User-Facing Changes

# Tests + Formatting

# After Submitting
2025-06-13 16:29:07 -05:00
cf1a53143c feat(ansi): use _ in short name and rst -> reset (#15907)
# Description
I've noticed that, unlike everything else in nushell, the output of `ansi
--list` has a column named `short name` instead of `short_name`, so I
changed it. While I was at it, I also added the short name `rst` for
`reset` since it is often used.
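A hedged sketch using the renamed column and the new short name:
```nushell
# the list column is now `short_name`, and `reset` gained the short name `rst`
ansi --list | where short_name == 'rst'
```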

# User-Facing Changes
Changed the column name of `ansi --list` from `short name` to
`short_name`
2025-06-13 16:24:40 -05:00
28a94048c5 feat(format number): add --no-prefix flag (#15960)
# Description
I have added a `--no-prefix` flag to the `format number` command to not
include the `0b`, `0x` and `0o` prefixes in the output. Also, I've
changed the order in which the formats are displayed to one I thinks
makes it easier to read, with the upper and lower alternatives next to
each other.
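A hedged sketch of the flag:
```nushell
# without the flag the binary form is e.g. "0b11111111"; with it, "11111111"
255 | format number --no-prefix
```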


![image](https://github.com/user-attachments/assets/cd50631d-1b27-40d4-84d9-f2ac125586d4)

# User-Facing Changes
The formatting of floats previously did not include prefixes while
integers did. Now prefixes are on by default for both, while including
the new flag removes them. Changing the order of the record shouldn't
have any effect on previous code.

# Tests + Formatting
I have added an additional example that test this behavior.

# After Submitting

---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2025-06-13 16:13:26 -05:00
fb691c0da5 Allow polars schema --datatype-list to be used without pipeline input (#15964)
# Description
Fixes the issue of listing allowed datatypes when not being used with
dataframe pipeline input.

Co-authored-by: Jack Wright <jack.wright@nike.com>
2025-06-13 12:38:50 -07:00
7972aea530 Make polars last consistent with polars first (#15963)
# Description
`polars last` will only return one row by default, making it consistent
with `polars first`.

# User-Facing Changes
- `polars last` will only return one row by default making it consistent
with `polars first`

Co-authored-by: Jack Wright <jack.wright@nike.com>
2025-06-13 12:35:26 -07:00
aa710eeb9a Add groupby support for polars last (#15953)
# Description
Allows `polars last` to be used with group-by
```nu
> ❯ : [[a b c d]; [1 0.5 true Apple] [2 0.5 true Orange] [2 4 true Apple] [3 10 false Apple] [4 13 false Banana] [5 14 true Banana]] | polars into-df -s {a: u8, b: f32, c: bool, d: str} | polars group-by d | polars last | polars sort-by [a] | polars collect
╭───┬────────┬───┬───────┬───────╮
│ # │   d    │ a │   b   │   c   │
├───┼────────┼───┼───────┼───────┤
│ 0 │ Orange │ 2 │  0.50 │ true  │
│ 1 │ Apple  │ 3 │ 10.00 │ false │
│ 2 │ Banana │ 5 │ 14.00 │ true  │
╰───┴────────┴───┴───────┴───────╯
```

# User-Facing Changes
- `polars last` can now be used with group-by expressions

Co-authored-by: Jack Wright <jack.wright@nike.com>
2025-06-13 12:10:29 -07:00
91e843a6d4 add like, not-like to help operators (#15959)
# Description

This PR adds `like` and `not-like` to the `help operators` command. Now
it at least lists them. I wasn't sure if I should say `=~ or like` so I
just separated them with a comma.

![image](https://github.com/user-attachments/assets/1165d900-80a2-4633-9b75-109fcb617c75)


# User-Facing Changes

# Tests + Formatting

# After Submitting
2025-06-13 08:59:37 -05:00
ebcb26f9d5 Promote clip from std-rfc to std (#15877)
# Description
Promotes the clip module from `std-rfc` to `std`. Whether we want to
promote other modules as well (probably?) is up for discussion but I
thought I would get the ball rolling with this one.
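A hedged sketch of the promoted module in use:
```nushell
# clip is now importable straight from std
use std/clip
"hello" | clip copy
```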

# User-Facing Changes
* The `clip` module has been promoted from `std-rfc` to `std`. Using the
`std-rfc` version of the clip module will give a deprecation warning
instructing you to switch to the `std` version.

# Tests + Formatting
N/A

# After Submitting
N/A
2025-06-13 07:26:48 +08:00
f8b0af70ff Don't make unquoted file/dir paths absolute (#15878)
# Description

Closes #15848. Currently, we expand unquoted strings to absolute paths
if they are of type `path` or `directory`. This PR makes this no longer
happen. `~`, `.`, and `..+` are still expanded, but a path like
`.../foo/bar/..` will only be turned into `../../foo`, rather than a
full absolute path.
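A hedged sketch of the difference (`show-path` is an illustrative command):
```nushell
def show-path [p: path] { $p }

show-path .../foo/bar/..
# => ../../foo    (before this change: an absolute path under the CWD)
```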

This is mostly so that paths don't get modified before being sent to
known external commands (as in the linked issue). But also, it seems
unnecessary to make all unquoted paths absolute.

After feedback from @132ikl, this PR also makes it so that unquoted
paths are expanded at parse time, so that it matches the runtime
behavior. Previously, `path` expressions were turned into strings
verbatim, while `directory` expressions were treated as not being const.

API change: `nu_path::expansions::expand_path` is now exposed as
`nu_path::expand_path`.

# User-Facing Changes

This has the potential to silently break a lot of scripts. For example,
if someone has a command that expects an already-expanded absolute path,
changes the current working directory, and then passes the path
somewhere, they will now need to use `path expand` to expand the path
themselves before changing the current working directory.

# Tests + Formatting

Just added one test to make sure unquoted `path` arguments aren't made
absolute.

# After Submitting

This is a breaking change, so will need to be mentioned in the release
notes.
2025-06-13 07:26:01 +08:00
12465193a4 cli: Use latest specified flag value when repeated (#15919)
# Description

This PR makes the last specified CLI arguments take precedence over the
earlier ones.

Existing command line tools that align with the new behaviour include:
- `neovim`: `nvim -u a.lua -u b.lua` will use `b.lua`
- `ripgrep`: you can have `--smart-case` in your user config but
override it later with `--case-sensitive` or `--ignore-case` (not
exactly the same flag override as the one I'm talking about but I think
it's still a valid example of latter flags taking precedence over the
first ones)

I think a flag defined last can be considered an override. This allows
having a `nu` alias that includes some default config (`alias nu="nu
--config something.nu"`) while still being able to override that default
config as if using `nu` normally.
 
## Example

```sh
nu --config config1.nu --config config2.nu -c '$nu.config-path'
```
The current behavior would print `config1.nu`, and the new one would
print `config2.nu`

## Implementation

Just `.rev()` the iterator to search for arguments starting from the end
of the list. To support that I had to modify the return type of
`named_iter` (I couldn't find a more generic way than
`DoubleEndedIterator`).

# User-Facing Changes

- Users passing repeated flags and relying on nushell using the first
value will experience breakage. Given that right now there's no point in
passing a flag multiple times, I expect not many users will be affected.

# Tests + Formatting

I added a test that checks the new behavior with `--config` and
`--env-config`. I'm happy to add more cases if needed

# After Submitting
2025-06-13 07:23:38 +08:00
bd3930d00d Better error on spawn failure caused by null bytes (#15911)
# Description

When attempting to pass a null byte in a commandline argument, Nu
currently fails with:

```
> ^echo (char -i 0)
Error: nu::shell::io::invalid_input

  × I/O error
  ╰─▶   × Could not spawn foreground child

   ╭────
 1 │ crates/nu-command/src/system/run_external.rs:284:17
   · ─────────────────────────┬─────────────────────────
   ·                          ╰── Invalid input parameter
   ╰────
```

This does not explain which input parameter is invalid, or why. Since Nu
does not typically seem to escape null bytes when printing values
containing them, this can make it a bit tricky to track down the
problem.
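
For reference, one way to make a hidden nul byte visible is to look at the
raw bytes; a small sketch:

```nushell
let s = $"ab(char -i 0)cd"
$s                # prints something that just looks like "abcd"
$s | into binary  # the hex dump reveals the embedded 0x00 byte
```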

After this change, it fails with:

```
> ^echo (char -i 0)
Error: nu::shell::io::invalid_input

  × I/O error
  ╰─▶   × Could not spawn foreground child: nul byte found in provided data

   ╭────
 1 │ crates/nu-command/src/system/run_external.rs:282:17
   · ─────────────────────────┬─────────────────────────
   ·                          ╰── Invalid input parameter
   ╰────

```

which is more useful. This could be improved further, but this is niche
enough that it's probably not necessary.

This might make some other errors unnecessarily verbose but seems like
the better default. I did check that attempting to execute a
non-executable file still has a reasonable error: the error message for
that failure is not affected by this change.

It is still an "internal" error (referencing the Nu code triggering it,
not the user's input) because the `call.head` span available to this
code is for the command, not its arguments. Using it would result in

```
  × I/O error
  ╰─▶   × Could not spawn foreground child: nul byte found in provided data

   ╭─[entry #1:1:2]
 1 │ ^echo (char -i 0)
   ·  ──┬─
   ·    ╰── Invalid input parameter
   ╰────
```

which is actively misleading because "echo" does not contain the nul
byte.

# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->

# Tests + Formatting
<!--
Don't forget to add tests that cover your changes.

Make sure you've run and fixed any issues with these commands:

- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes)
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style
- `cargo test --workspace` to check that all tests pass (on Windows make
sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- `cargo run -- -c "use toolkit.nu; toolkit test stdlib"` to run the
tests for the standard library

> **Note**
> from `nushell` you can also use the `toolkit` as follows
> ```bash
> use toolkit.nu # or use an `env_change` hook to activate it
automatically
> toolkit check pr
> ```
-->

Haven't tried to write a test yet: it's tricky because the better error
message comes from the Rust stdlib (so a straightforward integration
test checking for the specific message would be brittle)...

# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
2025-06-13 07:22:37 +08:00
81e86c40e1 Try to make hide-env respects overlays (#15904)
# Description
Closes: #15755
I think it's a good feature. To achieve this, we need to get all hidden
envs (as collected by `get_hidden_env_vars`) and then restore these envs
back to the stack.

# User-Facing Changes
### Before
```nushell
> $env.foo = 'bar'
> overlay new xxx
> hide-env foo
> overlay hide xxx
> $env.foo
Error: nu::shell::column_not_found

  × Cannot find column 'foo'
   ╭─[entry #21:5:1]
 4 │ overlay hide xxx
 5 │ $env.foo
   · ────┬───┬
   ·     │   ╰── value originates here
   ·     ╰── cannot find column 'foo'
   ╰────
```

### After
```nushell
> $env.foo = 'bar'
> overlay new xxx
> hide-env foo
> overlay hide xxx
> $env.foo
bar
```

## Note
But it doesn't work if the example code is run as a script:
`nu -c "$env.foo = 'bar'; overlay new xxx; hide-env foo; overlay hide
xxx; $env.foo"`
still raises an error saying `foo` cannot be found. That's because when
the script is run all at once, the envs in the stack don't get a chance
to be merged back into `engine_state`, which only happens in the `repl`.

This introduces some inconsistency, but users mostly use overlays in the
repl, so it's good to have the feature there first.

# Tests + Formatting
Added 2 tests

# After Submitting
NaN
2025-06-13 07:22:23 +08:00
2fe25d6299 nu-table: (table -e) Reuse NuRecordsValue::width in some cases (#15902)
Just removes a few width calculations for values which will be inserted
anyhow, so it should be a bit faster (in the base case).
2025-06-13 07:22:10 +08:00
4aeede2dd5 nu-table: Remove safety-net width check (#15901)
I think we can be reasonably confident that, by this check point, we
have built a correct table. There is no point in endlessly rechecking
it.
2025-06-13 07:22:02 +08:00
0e46ef9769 build(deps): bump which from 7.0.0 to 7.0.3 (#15937)
Bumps [which](https://github.com/harryfei/which-rs) from 7.0.0 to 7.0.3.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/harryfei/which-rs/releases">which's
releases</a>.</em></p>
<blockquote>
<h2>7.0.3</h2>
<ul>
<li>Update rustix to version 1.0. Congrats to rustix on this milestone,
and thanks <a href="https://github.com/mhils"><code>@​mhils</code></a>
for this contribution to which!</li>
</ul>
<h2>7.0.2</h2>
<ul>
<li>Don't return paths containing the single dot <code>.</code>
reference to the current directory, even if the original request was
given in terms of the current directory. Thanks <a
href="https://github.com/jakobhellermann"><code>@​jakobhellermann</code></a>
for this contribution!</li>
</ul>
<h2>7.0.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Switch to <code>env_home</code> crate by <a
href="https://github.com/micolous"><code>@​micolous</code></a> in <a
href="https://redirect.github.com/harryfei/which-rs/pull/105">harryfei/which-rs#105</a></li>
<li>fixes <a
href="https://redirect.github.com/harryfei/which-rs/issues/106">#106</a>,
bump patch version by <a
href="https://github.com/Xaeroxe"><code>@​Xaeroxe</code></a> in <a
href="https://redirect.github.com/harryfei/which-rs/pull/107">harryfei/which-rs#107</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/micolous"><code>@​micolous</code></a>
made their first contribution in <a
href="https://redirect.github.com/harryfei/which-rs/pull/105">harryfei/which-rs#105</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/harryfei/which-rs/compare/7.0.0...7.0.1">https://github.com/harryfei/which-rs/compare/7.0.0...7.0.1</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/harryfei/which-rs/blob/master/CHANGELOG.md">which's
changelog</a>.</em></p>
<blockquote>
<h2>7.0.3</h2>
<ul>
<li>Update rustix to version 1.0. Congrats to rustix on this milestone,
and thanks <a href="https://github.com/mhils"><code>@​mhils</code></a>
for this contribution to which!</li>
</ul>
<h2>7.0.2</h2>
<ul>
<li>Don't return paths containing the single dot <code>.</code>
reference to the current directory, even if the original request was
given in
terms of the current directory. Thanks <a
href="https://github.com/jakobhellermann"><code>@​jakobhellermann</code></a>
for this contribution!</li>
</ul>
<h2>7.0.1</h2>
<ul>
<li>Get user home directory from <code>env_home</code> instead of
<code>home</code>. Thanks <a
href="https://github.com/micolous"><code>@​micolous</code></a> for this
contribution!</li>
<li>If home directory is unavailable, do not expand the tilde to an
empty string. Leave it as is.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1d145deef8"><code>1d145de</code></a>
release version 7.0.3</li>
<li><a
href="f5e5292234"><code>f5e5292</code></a>
fix unrelated lint error</li>
<li><a
href="4dcefa6fe9"><code>4dcefa6</code></a>
bump rustix</li>
<li><a
href="bd868818bd"><code>bd86881</code></a>
bump version, add to changelog</li>
<li><a
href="cf37760ea1"><code>cf37760</code></a>
don't run relative dot test on macos</li>
<li><a
href="f2c4bd6e8b"><code>f2c4bd6</code></a>
update target to new name for wasm32-wasip1</li>
<li><a
href="87acc088c1"><code>87acc08</code></a>
When searching for <code>./script.sh</code>, don't return
<code>/path/to/./script.sh</code></li>
<li><a
href="68acf2c456"><code>68acf2c</code></a>
Fix changelog to link to GitHub profile</li>
<li><a
href="b6754b2a56"><code>b6754b2</code></a>
Update CHANGELOG.md</li>
<li><a
href="0c63719129"><code>0c63719</code></a>
fixes <a
href="https://redirect.github.com/harryfei/which-rs/issues/106">#106</a>,
bump patch version</li>
<li>Additional commits viewable in <a
href="https://github.com/harryfei/which-rs/compare/7.0.0...7.0.3">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=which&package-manager=cargo&previous-version=7.0.0&new-version=7.0.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-13 07:21:37 +08:00
962467fdfd build(deps): bump titlecase from 3.5.0 to 3.6.0 (#15936)
Bumps [titlecase](https://github.com/wezm/titlecase) from 3.5.0 to
3.6.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/wezm/titlecase/releases">titlecase's
releases</a>.</em></p>
<blockquote>
<h2>Version 3.6.0</h2>
<ul>
<li>Support hyphenated words by <a
href="https://github.com/carlocorradini"><code>@​carlocorradini</code></a>
in <a
href="https://redirect.github.com/wezm/titlecase/pull/37">#37</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/wezm/titlecase/compare/v3.5.0...v3.6.0">https://github.com/wezm/titlecase/compare/v3.5.0...v3.6.0</a></p>
<h2>Binaries</h2>
<ul>
<li><a
href="https://releases.wezm.net/titlecase/v3.6.0/titlecase-v3.6.0-amd64-unknown-freebsd.tar.gz">FreeBSD
13+ amd64</a></li>
<li><a
href="https://releases.wezm.net/titlecase/v3.6.0/titlecase-v3.6.0-x86_64-unknown-linux-musl.tar.gz">Linux
x86_64</a></li>
<li><a
href="https://releases.wezm.net/titlecase/v3.6.0/titlecase-v3.6.0-universal-apple-darwin.tar.gz">MacOS
Universal</a></li>
<li><a
href="https://releases.wezm.net/titlecase/v3.6.0/titlecase-v3.6.0-x86_64-pc-windows-msvc.zip">Windows
x86_64</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/wezm/titlecase/blob/master/Changelog.md">titlecase's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/wezm/titlecase/releases/tag/v3.6.0">3.6.0</a></h2>
<ul>
<li>Support hypendated words <a
href="https://redirect.github.com/wezm/titlecase/pull/37">#37</a>.
Thanks <a
href="https://github.com/carlocorradini"><code>@​carlocorradini</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="83265b43ba"><code>83265b4</code></a>
Version 3.6.0</li>
<li><a
href="e49b32b262"><code>e49b32b</code></a>
Use 'contains' to check for internal characters</li>
<li><a
href="736be39991"><code>736be39</code></a>
feat: hyphen</li>
<li>See full diff in <a
href="https://github.com/wezm/titlecase/compare/v3.5.0...v3.6.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=titlecase&package-manager=cargo&previous-version=3.5.0&new-version=3.6.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-13 07:21:09 +08:00
d27232df6e build(deps): bump ansi-str from 0.8.0 to 0.9.0 (#15935)
Bumps [ansi-str](https://github.com/zhiburt/ansi-str) from 0.8.0 to
0.9.0.
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/zhiburt/ansi-str/commits">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ansi-str&package-manager=cargo&previous-version=0.8.0&new-version=0.9.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-13 07:19:47 +08:00
d7cec2088a Fix docs typo referring to non-existant Value::CustomValue (#15954)
# Description

I was messing around with custom types and noticed `nu-protocol`
referring to a `Value::CustomValue` variant that doesn't exist. Fixed it
to say `Value::Custom` instead.

# User-Facing Changes

Documentation mentions the correct variant of `Value`

# Tests + Formatting

No new tests necessary

# After Submitting
2025-06-12 13:34:52 -05:00
22d1fdcdf6 Improve precision in parsing of filesize values (#15950)
- improves rounding error reported in #15851
- my ~~discussion~~ monologue on how filesizes are parsed currently:
#15944

# Description

The issue linked above reported rounding errors when converting MiB to
GiB, which are mainly caused by the parsing of the literal.

Nushell tries to convert all filesize values to bytes, but currently
does so in 2 steps:
- first converting it to the next smaller unit in `nu-parser` (so `MiB`
to `KiB`, in this case), and truncating to an `i64` here
- then converting that to bytes in `nu-engine`, again truncating to
`i64`

In the specific example above (`95307.27MiB`), this causes 419 bytes of
rounding error. By instead directly converting to bytes while parsing,
the value is accurate (truncating those 0.52 bytes, or 4.16 bits).
Rounding error in the conversion to GiB is also several orders of
magnitude lower.
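
As a quick sanity check of the direct conversion (this is just the
arithmetic, not the parser itself):

```nushell
# 95307.27 MiB converted straight to bytes in one step
95307.27 * 1024 * 1024
# => 99936915947.52  (truncated to 99_936_915_947 B, the value the new test expects)
```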

(Note that I haven't thoroughly tested this, so I can't say with
confidence that all values would be parsed accurate to the byte.)

# User-Facing Changes

More accurate filesize values, and lower accumulated rounding error in
calculations.

# Tests + Formatting

new test: `parse_filesize` in `nu-parser` - verifies that `95307.27MiB`
is parsed correctly as `99_936_915_947B`

# After Submitting
2025-06-12 07:58:21 -05:00
ba59f71f20 bump to dev version 0.105.2 (#15952)
# Description

Bump nushell to development version.

# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->

# Tests + Formatting
<!--
Don't forget to add tests that cover your changes.

Make sure you've run and fixed any issues with these commands:

- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes)
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style
- `cargo test --workspace` to check that all tests pass (on Windows make
sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- `cargo run -- -c "use toolkit.nu; toolkit test stdlib"` to run the
tests for the standard library

> **Note**
> from `nushell` you can also use the `toolkit` as follows
> ```bash
> use toolkit.nu # or use an `env_change` hook to activate it
automatically
> toolkit check pr
> ```
-->

# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
2025-06-12 07:57:01 -05:00
2352548467 Use NUSHELL_PAT for winget publish (#15934)
<!--
if this PR closes one or more issues, you can automatically link the PR
with
them by using one of the [*linking
keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword),
e.g.
- this PR should close #xxxx
- fixes #xxxx

you can also mention related issues, PRs or discussions!
-->

# Description
<!--
Thank you for improving Nushell. Please, check our [contributing
guide](../CONTRIBUTING.md) and talk to the core team before making major
changes.

Description of your pull request goes here. **Provide examples and/or
screenshots** if your changes affect the user experience.
-->
Use NUSHELL_PAT for winget publish
2025-06-11 06:48:54 +08:00
3efbda63b8 Try to fix winget publish error (#15933)
<!--
if this PR closes one or more issues, you can automatically link the PR
with
them by using one of the [*linking
keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword),
e.g.
- this PR should close #xxxx
- fixes #xxxx

you can also mention related issues, PRs or discussions!
-->

# Description
<!--
Thank you for improving Nushell. Please, check our [contributing
guide](../CONTRIBUTING.md) and talk to the core team before making major
changes.

Description of your pull request goes here. **Provide examples and/or
screenshots** if your changes affect the user experience.
-->

Try to fix winget publish error here:
https://github.com/nushell/nushell/actions/runs/15567694761/job/43835718254#step:2:208
2025-06-11 05:27:27 +08:00
282 changed files with 3339 additions and 2255 deletions

View File

@ -0,0 +1,25 @@
name: Comment on changes to the config
on:
  pull_request_target:
    paths:
      - 'crates/nu-protocol/src/config/**'
jobs:
  comment:
    runs-on: ubuntu-latest
    steps:
      - name: Check if there is already a bot comment
        uses: peter-evans/find-comment@v3
        id: fc
        with:
          issue-number: ${{ github.event.pull_request.number }}
          comment-author: 'github-actions[bot]'
          body-includes: Hey, just a bot checking in!
      - name: Create comment if there is not
        if: steps.fc.outputs.comment-id == ''
        uses: peter-evans/create-or-update-comment@v4
        with:
          issue-number: ${{ github.event.pull_request.number }}
          body: |
            Hey, just a bot checking in! You edited files related to the configuration.
            If you changed any of the default values or added a new config option, don't forget to update the [`doc_config.nu`](https://github.com/nushell/nushell/blob/main/crates/nu-utils/src/default_files/doc_config.nu) which documents the options for our users including the defaults provided by the Rust implementation.
            If you didn't make a change here, you can just ignore me.

View File

@ -46,7 +46,7 @@ jobs:
uses: hustcer/setup-nu@v3 uses: hustcer/setup-nu@v3
if: github.repository == 'nushell/nightly' if: github.repository == 'nushell/nightly'
with: with:
version: 0.103.0 version: 0.105.1
# Synchronize the main branch of nightly repo with the main branch of Nushell official repo # Synchronize the main branch of nightly repo with the main branch of Nushell official repo
- name: Prepare for Nightly Release - name: Prepare for Nightly Release
@ -127,6 +127,7 @@ jobs:
- armv7-unknown-linux-musleabihf - armv7-unknown-linux-musleabihf
- riscv64gc-unknown-linux-gnu - riscv64gc-unknown-linux-gnu
- loongarch64-unknown-linux-gnu - loongarch64-unknown-linux-gnu
- loongarch64-unknown-linux-musl
include: include:
- target: aarch64-apple-darwin - target: aarch64-apple-darwin
os: macos-latest os: macos-latest
@ -152,6 +153,8 @@ jobs:
os: ubuntu-22.04 os: ubuntu-22.04
- target: loongarch64-unknown-linux-gnu - target: loongarch64-unknown-linux-gnu
os: ubuntu-22.04 os: ubuntu-22.04
- target: loongarch64-unknown-linux-musl
os: ubuntu-22.04
runs-on: ${{matrix.os}} runs-on: ${{matrix.os}}
steps: steps:
@ -179,36 +182,22 @@ jobs:
uses: actions-rust-lang/setup-rust-toolchain@v1 uses: actions-rust-lang/setup-rust-toolchain@v1
# WARN: Keep the rustflags to prevent from the winget submission error: `CAQuietExec: Error 0xc0000135` # WARN: Keep the rustflags to prevent from the winget submission error: `CAQuietExec: Error 0xc0000135`
with: with:
cache: false
rustflags: '' rustflags: ''
- name: Setup Nushell - name: Setup Nushell
uses: hustcer/setup-nu@v3 uses: hustcer/setup-nu@v3
if: ${{ matrix.os != 'windows-11-arm' }}
with: with:
version: 0.103.0 version: 0.105.1
- name: Release Nu Binary - name: Release Nu Binary
id: nu id: nu
if: ${{ matrix.os != 'windows-11-arm' }}
run: nu .github/workflows/release-pkg.nu run: nu .github/workflows/release-pkg.nu
env: env:
OS: ${{ matrix.os }} OS: ${{ matrix.os }}
REF: ${{ github.ref }} REF: ${{ github.ref }}
TARGET: ${{ matrix.target }} TARGET: ${{ matrix.target }}
- name: Build Nu for Windows ARM64
id: nu0
shell: pwsh
if: ${{ matrix.os == 'windows-11-arm' }}
run: |
$env:OS = 'windows'
$env:REF = '${{ github.ref }}'
$env:TARGET = '${{ matrix.target }}'
cargo build --release --all --target aarch64-pc-windows-msvc
cp ./target/${{ matrix.target }}/release/nu.exe .
./nu.exe -c 'version'
./nu.exe ${{github.workspace}}/.github/workflows/release-pkg.nu
- name: Create an Issue for Release Failure - name: Create an Issue for Release Failure
if: ${{ failure() }} if: ${{ failure() }}
uses: JasonEtco/create-an-issue@v2 uses: JasonEtco/create-an-issue@v2
@ -228,9 +217,7 @@ jobs:
prerelease: true prerelease: true
files: | files: |
${{ steps.nu.outputs.msi }} ${{ steps.nu.outputs.msi }}
${{ steps.nu0.outputs.msi }}
${{ steps.nu.outputs.archive }} ${{ steps.nu.outputs.archive }}
${{ steps.nu0.outputs.archive }}
tag_name: ${{ needs.prepare.outputs.nightly_tag }} tag_name: ${{ needs.prepare.outputs.nightly_tag }}
name: ${{ needs.prepare.outputs.build_date }}-${{ needs.prepare.outputs.nightly_tag }} name: ${{ needs.prepare.outputs.build_date }}-${{ needs.prepare.outputs.nightly_tag }}
env: env:
@ -276,7 +263,7 @@ jobs:
- name: Setup Nushell - name: Setup Nushell
uses: hustcer/setup-nu@v3 uses: hustcer/setup-nu@v3
with: with:
version: 0.103.0 version: 0.105.1
# Keep the last a few releases # Keep the last a few releases
- name: Delete Older Releases - name: Delete Older Releases

View File

@ -58,7 +58,7 @@ jobs:
- name: Setup Nushell - name: Setup Nushell
uses: hustcer/setup-nu@v3 uses: hustcer/setup-nu@v3
with: with:
version: nightly version: 0.105.1
- name: Release MSI Packages - name: Release MSI Packages
id: nu id: nu

View File

@ -99,6 +99,14 @@ if $os in ['macos-latest'] or $USE_UBUNTU {
$env.CARGO_TARGET_LOONGARCH64_UNKNOWN_LINUX_GNU_LINKER = 'loongarch64-unknown-linux-gnu-gcc' $env.CARGO_TARGET_LOONGARCH64_UNKNOWN_LINUX_GNU_LINKER = 'loongarch64-unknown-linux-gnu-gcc'
cargo-build-nu cargo-build-nu
} }
'loongarch64-unknown-linux-musl' => {
print $"(ansi g)Downloading LoongArch64 musl cross-compilation toolchain...(ansi reset)"
aria2c -q https://github.com/LoongsonLab/oscomp-toolchains-for-oskernel/releases/download/loongarch64-linux-musl-cross-gcc-13.2.0/loongarch64-linux-musl-cross.tgz
tar -xf loongarch64-linux-musl-cross.tgz
$env.PATH = ($env.PATH | split row (char esep) | prepend $'($env.PWD)/loongarch64-linux-musl-cross/bin')
$env.CARGO_TARGET_LOONGARCH64_UNKNOWN_LINUX_MUSL_LINKER = "loongarch64-linux-musl-gcc"
cargo-build-nu
}
_ => { _ => {
# musl-tools to fix 'Failed to find tool. Is `musl-gcc` installed?' # musl-tools to fix 'Failed to find tool. Is `musl-gcc` installed?'
# Actually just for x86_64-unknown-linux-musl target # Actually just for x86_64-unknown-linux-musl target

View File

@ -35,6 +35,7 @@ jobs:
- armv7-unknown-linux-musleabihf - armv7-unknown-linux-musleabihf
- riscv64gc-unknown-linux-gnu - riscv64gc-unknown-linux-gnu
- loongarch64-unknown-linux-gnu - loongarch64-unknown-linux-gnu
- loongarch64-unknown-linux-musl
include: include:
- target: aarch64-apple-darwin - target: aarch64-apple-darwin
os: macos-latest os: macos-latest
@ -60,6 +61,8 @@ jobs:
os: ubuntu-22.04 os: ubuntu-22.04
- target: loongarch64-unknown-linux-gnu - target: loongarch64-unknown-linux-gnu
os: ubuntu-22.04 os: ubuntu-22.04
- target: loongarch64-unknown-linux-musl
os: ubuntu-22.04
runs-on: ${{matrix.os}} runs-on: ${{matrix.os}}
@ -90,32 +93,17 @@ jobs:
- name: Setup Nushell - name: Setup Nushell
uses: hustcer/setup-nu@v3 uses: hustcer/setup-nu@v3
if: ${{ matrix.os != 'windows-11-arm' }}
with: with:
version: 0.103.0 version: 0.105.1
- name: Release Nu Binary - name: Release Nu Binary
id: nu id: nu
if: ${{ matrix.os != 'windows-11-arm' }}
run: nu .github/workflows/release-pkg.nu run: nu .github/workflows/release-pkg.nu
env: env:
OS: ${{ matrix.os }} OS: ${{ matrix.os }}
REF: ${{ github.ref }} REF: ${{ github.ref }}
TARGET: ${{ matrix.target }} TARGET: ${{ matrix.target }}
- name: Build Nu for Windows ARM64
id: nu0
shell: pwsh
if: ${{ matrix.os == 'windows-11-arm' }}
run: |
$env:OS = 'windows'
$env:REF = '${{ github.ref }}'
$env:TARGET = '${{ matrix.target }}'
cargo build --release --all --target aarch64-pc-windows-msvc
cp ./target/${{ matrix.target }}/release/nu.exe .
./nu.exe -c 'version'
./nu.exe ${{github.workspace}}/.github/workflows/release-pkg.nu
# WARN: Don't upgrade this action due to the release per asset issue. # WARN: Don't upgrade this action due to the release per asset issue.
# See: https://github.com/softprops/action-gh-release/issues/445 # See: https://github.com/softprops/action-gh-release/issues/445
- name: Publish Archive - name: Publish Archive
@ -125,9 +113,7 @@ jobs:
draft: true draft: true
files: | files: |
${{ steps.nu.outputs.msi }} ${{ steps.nu.outputs.msi }}
${{ steps.nu0.outputs.msi }}
${{ steps.nu.outputs.archive }} ${{ steps.nu.outputs.archive }}
${{ steps.nu0.outputs.archive }}
env: env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@ -10,6 +10,11 @@ on:
required: true required: true
type: string type: string
permissions:
  contents: write
  packages: write
  pull-requests: write
jobs: jobs:
winget: winget:
@ -25,5 +30,5 @@ jobs:
installers-regex: 'msvc\.msi$' installers-regex: 'msvc\.msi$'
version: ${{ inputs.tag_name || github.event.release.tag_name }} version: ${{ inputs.tag_name || github.event.release.tag_name }}
release-tag: ${{ inputs.tag_name || github.event.release.tag_name }} release-tag: ${{ inputs.tag_name || github.event.release.tag_name }}
token: ${{ secrets.GITHUB_TOKEN }} token: ${{ secrets.NUSHELL_PAT }}
fork-user: nushell fork-user: nushell

.gitignore (vendored) — 6 changed lines
View File

@ -32,11 +32,17 @@ unstable_cargo_features.txt
# Helix configuration folder # Helix configuration folder
.helix/* .helix/*
.helix .helix
wix/bin/
wix/obj/
wix/nu/
# Coverage tools # Coverage tools
lcov.info lcov.info
tarpaulin-report.html tarpaulin-report.html
# benchmarking
/tango
# Visual Studio # Visual Studio
.vs/* .vs/*
*.rsproj *.rsproj

Cargo.lock (generated) — 381 changed lines

File diff suppressed because it is too large.

View File

@ -10,8 +10,8 @@ homepage = "https://www.nushell.sh"
license = "MIT" license = "MIT"
name = "nu" name = "nu"
repository = "https://github.com/nushell/nushell" repository = "https://github.com/nushell/nushell"
rust-version = "1.85.1" rust-version = "1.86.0"
version = "0.105.1" version = "0.105.2"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -63,7 +63,7 @@ members = [
[workspace.dependencies] [workspace.dependencies]
alphanumeric-sort = "1.5" alphanumeric-sort = "1.5"
ansi-str = "0.8" ansi-str = "0.9"
anyhow = "1.0.82" anyhow = "1.0.82"
base64 = "0.22.1" base64 = "0.22.1"
bracoxide = "0.1.6" bracoxide = "0.1.6"
@ -71,7 +71,7 @@ brotli = "7.0"
byteorder = "1.5" byteorder = "1.5"
bytes = "1" bytes = "1"
bytesize = "1.3.3" bytesize = "1.3.3"
calamine = "0.26" calamine = "0.28"
chardetng = "0.1.17" chardetng = "0.1.17"
chrono = { default-features = false, version = "0.4.34" } chrono = { default-features = false, version = "0.4.34" }
chrono-humanize = "0.2.3" chrono-humanize = "0.2.3"
@ -156,14 +156,14 @@ serde_json = "1.0.97"
serde_urlencoded = "0.7.1" serde_urlencoded = "0.7.1"
serde_yaml = "0.9.33" serde_yaml = "0.9.33"
sha2 = "0.10" sha2 = "0.10"
strip-ansi-escapes = "0.2.0" strip-ansi-escapes = "0.2.1"
strum = "0.26" strum = "0.26"
strum_macros = "0.26" strum_macros = "0.26"
syn = "2.0" syn = "2.0"
sysinfo = "0.33" sysinfo = "0.33"
tabled = { version = "0.20", default-features = false } tabled = { version = "0.20", default-features = false }
tempfile = "3.20" tempfile = "3.20"
titlecase = "3.5" titlecase = "3.6"
toml = "0.8" toml = "0.8"
trash = "5.2" trash = "5.2"
update-informer = { version = "1.2.0", default-features = false, features = ["github", "ureq"] } update-informer = { version = "1.2.0", default-features = false, features = ["github", "ureq"] }
@ -184,7 +184,7 @@ uuid = "1.16.0"
v_htmlescape = "0.15.0" v_htmlescape = "0.15.0"
wax = "0.6" wax = "0.6"
web-time = "1.1.0" web-time = "1.1.0"
which = "7.0.0" which = "8.0.0"
windows = "0.56" windows = "0.56"
windows-sys = "0.48" windows-sys = "0.48"
winreg = "0.52" winreg = "0.52"
@ -195,27 +195,28 @@ webpki-roots = "1.0"
# Warning: workspace lints affect library code as well as tests, so don't enable lints that would be too noisy in tests like that. # Warning: workspace lints affect library code as well as tests, so don't enable lints that would be too noisy in tests like that.
# todo = "warn" # todo = "warn"
unchecked_duration_subtraction = "warn" unchecked_duration_subtraction = "warn"
used_underscore_binding = "warn"
[lints] [lints]
workspace = true workspace = true
[dependencies] [dependencies]
nu-cli = { path = "./crates/nu-cli", version = "0.105.1" } nu-cli = { path = "./crates/nu-cli", version = "0.105.2" }
nu-cmd-base = { path = "./crates/nu-cmd-base", version = "0.105.1" } nu-cmd-base = { path = "./crates/nu-cmd-base", version = "0.105.2" }
nu-cmd-lang = { path = "./crates/nu-cmd-lang", version = "0.105.1" } nu-cmd-lang = { path = "./crates/nu-cmd-lang", version = "0.105.2" }
nu-cmd-plugin = { path = "./crates/nu-cmd-plugin", version = "0.105.1", optional = true } nu-cmd-plugin = { path = "./crates/nu-cmd-plugin", version = "0.105.2", optional = true }
nu-cmd-extra = { path = "./crates/nu-cmd-extra", version = "0.105.1" } nu-cmd-extra = { path = "./crates/nu-cmd-extra", version = "0.105.2" }
nu-command = { path = "./crates/nu-command", version = "0.105.1", default-features = false, features = ["os"] } nu-command = { path = "./crates/nu-command", version = "0.105.2", default-features = false, features = ["os"] }
nu-engine = { path = "./crates/nu-engine", version = "0.105.1" } nu-engine = { path = "./crates/nu-engine", version = "0.105.2" }
nu-explore = { path = "./crates/nu-explore", version = "0.105.1" } nu-explore = { path = "./crates/nu-explore", version = "0.105.2" }
nu-lsp = { path = "./crates/nu-lsp/", version = "0.105.1" } nu-lsp = { path = "./crates/nu-lsp/", version = "0.105.2" }
nu-parser = { path = "./crates/nu-parser", version = "0.105.1" } nu-parser = { path = "./crates/nu-parser", version = "0.105.2" }
nu-path = { path = "./crates/nu-path", version = "0.105.1" } nu-path = { path = "./crates/nu-path", version = "0.105.2" }
nu-plugin-engine = { path = "./crates/nu-plugin-engine", optional = true, version = "0.105.1" } nu-plugin-engine = { path = "./crates/nu-plugin-engine", optional = true, version = "0.105.2" }
nu-protocol = { path = "./crates/nu-protocol", version = "0.105.1" } nu-protocol = { path = "./crates/nu-protocol", version = "0.105.2" }
nu-std = { path = "./crates/nu-std", version = "0.105.1" } nu-std = { path = "./crates/nu-std", version = "0.105.2" }
nu-system = { path = "./crates/nu-system", version = "0.105.1" } nu-system = { path = "./crates/nu-system", version = "0.105.2" }
nu-utils = { path = "./crates/nu-utils", version = "0.105.1" } nu-utils = { path = "./crates/nu-utils", version = "0.105.2" }
reedline = { workspace = true, features = ["bashisms", "sqlite"] } reedline = { workspace = true, features = ["bashisms", "sqlite"] }
crossterm = { workspace = true } crossterm = { workspace = true }
@ -244,9 +245,9 @@ nix = { workspace = true, default-features = false, features = [
] } ] }
[dev-dependencies] [dev-dependencies]
nu-test-support = { path = "./crates/nu-test-support", version = "0.105.1" } nu-test-support = { path = "./crates/nu-test-support", version = "0.105.2" }
nu-plugin-protocol = { path = "./crates/nu-plugin-protocol", version = "0.105.1" } nu-plugin-protocol = { path = "./crates/nu-plugin-protocol", version = "0.105.2" }
nu-plugin-core = { path = "./crates/nu-plugin-core", version = "0.105.1" } nu-plugin-core = { path = "./crates/nu-plugin-core", version = "0.105.2" }
assert_cmd = "2.0" assert_cmd = "2.0"
dirs = { workspace = true } dirs = { workspace = true }
tango-bench = "0.6" tango-bench = "0.6"
@ -257,10 +258,14 @@ serial_test = "3.2"
tempfile = { workspace = true } tempfile = { workspace = true }
[features] [features]
# Enable all features while still avoiding mutually exclusive features.
# Use this if `--all-features` fails.
full = ["plugin", "rustls-tls", "system-clipboard", "trash-support", "sqlite"]
plugin = [ plugin = [
# crates # crates
"nu-cmd-plugin", "dep:nu-cmd-plugin",
"nu-plugin-engine", "dep:nu-plugin-engine",
# features # features
"nu-cli/plugin", "nu-cli/plugin",
@ -286,21 +291,20 @@ stable = ["default"]
# Enable to statically link OpenSSL (perl is required, to build OpenSSL https://docs.rs/openssl/latest/openssl/); # Enable to statically link OpenSSL (perl is required, to build OpenSSL https://docs.rs/openssl/latest/openssl/);
# otherwise the system version will be used. Not enabled by default because it takes a while to build # otherwise the system version will be used. Not enabled by default because it takes a while to build
static-link-openssl = ["dep:openssl", "nu-cmd-lang/static-link-openssl"] static-link-openssl = ["dep:openssl"]
# Optional system clipboard support in `reedline`, this behavior has problematic compatibility with some systems. # Optional system clipboard support in `reedline`, this behavior has problematic compatibility with some systems.
# Missing X server/ Wayland can cause issues # Missing X server/ Wayland can cause issues
system-clipboard = [ system-clipboard = [
"reedline/system_clipboard", "reedline/system_clipboard",
"nu-cli/system-clipboard", "nu-cli/system-clipboard",
"nu-cmd-lang/system-clipboard",
] ]
# Stable (Default) # Stable (Default)
trash-support = ["nu-command/trash-support", "nu-cmd-lang/trash-support"] trash-support = ["nu-command/trash-support"]
# SQLite commands for nushell # SQLite commands for nushell
sqlite = ["nu-command/sqlite", "nu-cmd-lang/sqlite", "nu-std/sqlite"] sqlite = ["nu-command/sqlite", "nu-std/sqlite"]
[profile.release] [profile.release]
opt-level = "s" # Optimize for size opt-level = "s" # Optimize for size
@ -330,7 +334,7 @@ bench = false
# To use a development version of a dependency please use a global override here # To use a development version of a dependency please use a global override here
# changing versions in each sub-crate of the workspace is tedious # changing versions in each sub-crate of the workspace is tedious
[patch.crates-io] [patch.crates-io]
# reedline = { git = "https://github.com/nushell/reedline", branch = "main" } reedline = { git = "https://github.com/nushell/reedline", branch = "main" }
# nu-ansi-term = {git = "https://github.com/nushell/nu-ansi-term.git", branch = "main"} # nu-ansi-term = {git = "https://github.com/nushell/nu-ansi-term.git", branch = "main"}
# Run all benchmarks with `cargo bench` # Run all benchmarks with `cargo bench`

View File

@ -199,7 +199,7 @@ fn bench_record_nested_access(n: usize) -> impl IntoBenchmarks {
let nested_access = ".col".repeat(n); let nested_access = ".col".repeat(n);
bench_command( bench_command(
format!("record_nested_access_{n}"), format!("record_nested_access_{n}"),
format!("$record{} | ignore", nested_access), format!("$record{nested_access} | ignore"),
stack, stack,
engine, engine,
) )
@ -319,7 +319,7 @@ fn bench_eval_par_each(n: usize) -> impl IntoBenchmarks {
let stack = Stack::new(); let stack = Stack::new();
bench_command( bench_command(
format!("eval_par_each_{n}"), format!("eval_par_each_{n}"),
format!("(1..{}) | par-each -t 2 {{|_| 1 }} | ignore", n), format!("(1..{n}) | par-each -t 2 {{|_| 1 }} | ignore"),
stack, stack,
engine, engine,
) )
@ -357,7 +357,7 @@ fn encode_json(row_cnt: usize, col_cnt: usize) -> impl IntoBenchmarks {
let encoder = Rc::new(EncodingType::try_from_bytes(b"json").unwrap()); let encoder = Rc::new(EncodingType::try_from_bytes(b"json").unwrap());
[benchmark_fn( [benchmark_fn(
format!("encode_json_{}_{}", row_cnt, col_cnt), format!("encode_json_{row_cnt}_{col_cnt}"),
move |b| { move |b| {
let encoder = encoder.clone(); let encoder = encoder.clone();
let test_data = test_data.clone(); let test_data = test_data.clone();
@ -377,7 +377,7 @@ fn encode_msgpack(row_cnt: usize, col_cnt: usize) -> impl IntoBenchmarks {
let encoder = Rc::new(EncodingType::try_from_bytes(b"msgpack").unwrap()); let encoder = Rc::new(EncodingType::try_from_bytes(b"msgpack").unwrap());
[benchmark_fn( [benchmark_fn(
format!("encode_msgpack_{}_{}", row_cnt, col_cnt), format!("encode_msgpack_{row_cnt}_{col_cnt}"),
move |b| { move |b| {
let encoder = encoder.clone(); let encoder = encoder.clone();
let test_data = test_data.clone(); let test_data = test_data.clone();
@ -399,7 +399,7 @@ fn decode_json(row_cnt: usize, col_cnt: usize) -> impl IntoBenchmarks {
encoder.encode(&test_data, &mut res).unwrap(); encoder.encode(&test_data, &mut res).unwrap();
[benchmark_fn( [benchmark_fn(
format!("decode_json_{}_{}", row_cnt, col_cnt), format!("decode_json_{row_cnt}_{col_cnt}"),
move |b| { move |b| {
let res = res.clone(); let res = res.clone();
b.iter(move || { b.iter(move || {
@ -422,7 +422,7 @@ fn decode_msgpack(row_cnt: usize, col_cnt: usize) -> impl IntoBenchmarks {
encoder.encode(&test_data, &mut res).unwrap(); encoder.encode(&test_data, &mut res).unwrap();
[benchmark_fn( [benchmark_fn(
format!("decode_msgpack_{}_{}", row_cnt, col_cnt), format!("decode_msgpack_{row_cnt}_{col_cnt}"),
move |b| { move |b| {
let res = res.clone(); let res = res.clone();
b.iter(move || { b.iter(move || {

View File

@ -5,29 +5,29 @@ repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cli"
edition = "2024" edition = "2024"
license = "MIT" license = "MIT"
name = "nu-cli" name = "nu-cli"
version = "0.105.1" version = "0.105.2"
[lib] [lib]
bench = false bench = false
[dev-dependencies] [dev-dependencies]
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.105.1" } nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.105.2" }
nu-command = { path = "../nu-command", version = "0.105.1" } nu-command = { path = "../nu-command", version = "0.105.2" }
nu-std = { path = "../nu-std", version = "0.105.1" } nu-std = { path = "../nu-std", version = "0.105.2" }
nu-test-support = { path = "../nu-test-support", version = "0.105.1" } nu-test-support = { path = "../nu-test-support", version = "0.105.2" }
rstest = { workspace = true, default-features = false } rstest = { workspace = true, default-features = false }
tempfile = { workspace = true } tempfile = { workspace = true }
[dependencies] [dependencies]
nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.1" } nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.2" }
nu-engine = { path = "../nu-engine", version = "0.105.1", features = ["os"] } nu-engine = { path = "../nu-engine", version = "0.105.2", features = ["os"] }
nu-glob = { path = "../nu-glob", version = "0.105.1" } nu-glob = { path = "../nu-glob", version = "0.105.2" }
nu-path = { path = "../nu-path", version = "0.105.1" } nu-path = { path = "../nu-path", version = "0.105.2" }
nu-parser = { path = "../nu-parser", version = "0.105.1" } nu-parser = { path = "../nu-parser", version = "0.105.2" }
nu-plugin-engine = { path = "../nu-plugin-engine", version = "0.105.1", optional = true } nu-plugin-engine = { path = "../nu-plugin-engine", version = "0.105.2", optional = true }
nu-protocol = { path = "../nu-protocol", version = "0.105.1", features = ["os"] } nu-protocol = { path = "../nu-protocol", version = "0.105.2", features = ["os"] }
nu-utils = { path = "../nu-utils", version = "0.105.1" } nu-utils = { path = "../nu-utils", version = "0.105.2" }
nu-color-config = { path = "../nu-color-config", version = "0.105.1" } nu-color-config = { path = "../nu-color-config", version = "0.105.2" }
nu-ansi-term = { workspace = true } nu-ansi-term = { workspace = true }
reedline = { workspace = true, features = ["bashisms", "sqlite"] } reedline = { workspace = true, features = ["bashisms", "sqlite"] }

View File

@ -118,7 +118,7 @@ fn get_suggestions_by_value(
|| s.chars() || s.chars()
.any(|c: char| !(c.is_ascii_alphabetic() || ['_', '-'].contains(&c))) .any(|c: char| !(c.is_ascii_alphabetic() || ['_', '-'].contains(&c)))
{ {
format!("{:?}", s) format!("{s:?}")
} else { } else {
s s
}; };

View File

@ -52,7 +52,7 @@ impl CommandCompletion {
continue; continue;
}; };
let value = if matched_internal(&name) { let value = if matched_internal(&name) {
format!("^{}", name) format!("^{name}")
} else { } else {
name.clone() name.clone()
}; };

View File

@ -176,7 +176,7 @@ impl NuCompleter {
&mut working_set, &mut working_set,
Some("completer"), Some("completer"),
// Add a placeholder `a` to the end // Add a placeholder `a` to the end
format!("{}a", line).as_bytes(), format!("{line}a").as_bytes(),
false, false,
); );
self.fetch_completions_by_block(block, &working_set, pos, offset, line, true) self.fetch_completions_by_block(block, &working_set, pos, offset, line, true)
@ -466,6 +466,14 @@ impl NuCompleter {
return suggestions; return suggestions;
} }
} }
// for external path arguments with spaces, please check issue #15790
if suggestions.is_empty() {
let (new_span, prefix) =
strip_placeholder_if_any(working_set, &span, strip);
let ctx = Context::new(working_set, new_span, prefix, offset);
suggestions.extend(self.process_completion(&mut FileCompletion, &ctx));
return suggestions;
}
break; break;
} }
} }
@ -842,7 +850,7 @@ mod completer_tests {
for (line, has_result, begins_with, expected_values) in dataset { for (line, has_result, begins_with, expected_values) in dataset {
let result = completer.fetch_completions_at(line, line.len()); let result = completer.fetch_completions_at(line, line.len());
// Test whether the result is empty or not // Test whether the result is empty or not
assert_eq!(!result.is_empty(), has_result, "line: {}", line); assert_eq!(!result.is_empty(), has_result, "line: {line}");
// Test whether the result begins with the expected value // Test whether the result begins with the expected value
result result
@ -857,8 +865,7 @@ mod completer_tests {
.filter(|x| *x) .filter(|x| *x)
.count(), .count(),
expected_values.len(), expected_values.len(),
"line: {}", "line: {line}"
line
); );
} }
} }

View File

@ -314,7 +314,7 @@ pub fn escape_path(path: String) -> String {
if path.contains('\'') { if path.contains('\'') {
// decide to use double quotes // decide to use double quotes
// Path as Debug will do the escaping for `"`, `\` // Path as Debug will do the escaping for `"`, `\`
format!("{:?}", path) format!("{path:?}")
} else { } else {
format!("'{path}'") format!("'{path}'")
} }

View File

@ -129,7 +129,7 @@ impl Completer for DotNuCompletion {
.take_while(|c| "`'\"".contains(*c)) .take_while(|c| "`'\"".contains(*c))
.collect::<String>(); .collect::<String>();
for path in ["std", "std-rfc"] { for path in ["std", "std-rfc"] {
let path = format!("{}{}", surround_prefix, path); let path = format!("{surround_prefix}{path}");
matcher.add( matcher.add(
path.clone(), path.clone(),
FileSuggestion { FileSuggestion {
@ -146,7 +146,7 @@ impl Completer for DotNuCompletion {
for sub_vp_id in sub_paths { for sub_vp_id in sub_paths {
let (path, sub_vp) = working_set.get_virtual_path(*sub_vp_id); let (path, sub_vp) = working_set.get_virtual_path(*sub_vp_id);
let path = path let path = path
.strip_prefix(&format!("{}/", base_dir)) .strip_prefix(&format!("{base_dir}/"))
.unwrap_or(path) .unwrap_or(path)
.to_string(); .to_string();
matcher.add( matcher.add(

View File

@ -1,4 +1,4 @@
use nu_engine::documentation::{HelpStyle, get_flags_section}; use nu_engine::documentation::{FormatterValue, HelpStyle, get_flags_section};
use nu_protocol::{Config, engine::EngineState, levenshtein_distance}; use nu_protocol::{Config, engine::EngineState, levenshtein_distance};
use nu_utils::IgnoreCaseExt; use nu_utils::IgnoreCaseExt;
use reedline::{Completer, Suggestion}; use reedline::{Completer, Suggestion};
@ -66,8 +66,11 @@ impl NuHelpCompleter {
let _ = write!(long_desc, "Usage:\r\n > {}\r\n", sig.call_signature()); let _ = write!(long_desc, "Usage:\r\n > {}\r\n", sig.call_signature());
if !sig.named.is_empty() { if !sig.named.is_empty() {
long_desc.push_str(&get_flags_section(&sig, &help_style, |v| { long_desc.push_str(&get_flags_section(&sig, &help_style, |v| match v {
v.to_parsable_string(", ", &self.config) FormatterValue::DefaultValue(value) => {
value.to_parsable_string(", ", &self.config)
}
FormatterValue::CodeString(text) => text.to_string(),
})) }))
} }

View File

@ -3,6 +3,8 @@ use std::sync::Arc;
use nu_engine::command_prelude::*; use nu_engine::command_prelude::*;
use reedline::{Highlighter, StyledText}; use reedline::{Highlighter, StyledText};
use crate::syntax_highlight::highlight_syntax;
#[derive(Clone)] #[derive(Clone)]
pub struct NuHighlight; pub struct NuHighlight;
@ -14,6 +16,11 @@ impl Command for NuHighlight {
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("nu-highlight") Signature::build("nu-highlight")
.category(Category::Strings) .category(Category::Strings)
.switch(
"reject-garbage",
"Return an error if invalid syntax (garbage) was encountered",
Some('r'),
)
.input_output_types(vec![(Type::String, Type::String)]) .input_output_types(vec![(Type::String, Type::String)])
} }
@ -32,19 +39,33 @@ impl Command for NuHighlight {
call: &Call, call: &Call,
input: PipelineData, input: PipelineData,
) -> Result<PipelineData, ShellError> { ) -> Result<PipelineData, ShellError> {
let reject_garbage = call.has_flag(engine_state, stack, "reject-garbage")?;
let head = call.head; let head = call.head;
let signals = engine_state.signals(); let signals = engine_state.signals();
let highlighter = crate::NuHighlighter { let engine_state = Arc::new(engine_state.clone());
engine_state: Arc::new(engine_state.clone()), let stack = Arc::new(stack.clone());
stack: Arc::new(stack.clone()),
};
input.map( input.map(
move |x| match x.coerce_into_string() { move |x| match x.coerce_into_string() {
Ok(line) => { Ok(line) => {
let highlights = highlighter.highlight(&line, line.len()); let result = highlight_syntax(&engine_state, &stack, &line, line.len());
let highlights = match (reject_garbage, result.found_garbage) {
(false, _) => result.text,
(true, None) => result.text,
(true, Some(span)) => {
let error = ShellError::OutsideSpannedLabeledError {
src: line,
error: "encountered invalid syntax while highlighting".into(),
msg: "invalid syntax".into(),
span,
};
return Value::error(error, head);
}
};
Value::string(highlights.render_simple(), head) Value::string(highlights.render_simple(), head)
} }
Err(err) => Value::error(err, head), Err(err) => Value::error(err, head),

View File

@ -22,8 +22,8 @@ use nu_color_config::StyleComputer;
use nu_engine::env_to_strings; use nu_engine::env_to_strings;
use nu_engine::exit::cleanup_exit; use nu_engine::exit::cleanup_exit;
use nu_parser::{lex, parse, trim_quotes_str}; use nu_parser::{lex, parse, trim_quotes_str};
use nu_protocol::shell_error;
use nu_protocol::shell_error::io::IoError; use nu_protocol::shell_error::io::IoError;
use nu_protocol::{BannerKind, shell_error};
use nu_protocol::{ use nu_protocol::{
HistoryConfig, HistoryFileFormat, PipelineData, ShellError, Span, Spanned, Value, HistoryConfig, HistoryFileFormat, PipelineData, ShellError, Span, Spanned, Value,
config::NuCursorShape, config::NuCursorShape,
@ -145,8 +145,8 @@ pub fn evaluate_repl(
if load_std_lib.is_none() { if load_std_lib.is_none() {
match engine_state.get_config().show_banner { match engine_state.get_config().show_banner {
Value::Bool { val: false, .. } => {} BannerKind::None => {}
Value::String { ref val, .. } if val == "short" => { BannerKind::Short => {
eval_source( eval_source(
engine_state, engine_state,
&mut unique_stack, &mut unique_stack,
@ -156,7 +156,7 @@ pub fn evaluate_repl(
false, false,
); );
} }
_ => { BannerKind::Full => {
eval_source( eval_source(
engine_state, engine_state,
&mut unique_stack, &mut unique_stack,
@ -239,7 +239,7 @@ fn escape_special_vscode_bytes(input: &str) -> Result<String, ShellError> {
match byte { match byte {
// Escape bytes below 0x20 // Escape bytes below 0x20
b if b < 0x20 => format!("\\x{:02X}", byte).into_bytes(), b if b < 0x20 => format!("\\x{byte:02X}").into_bytes(),
// Escape semicolon as \x3B // Escape semicolon as \x3B
b';' => "\\x3B".to_string().into_bytes(), b';' => "\\x3B".to_string().into_bytes(),
// Escape backslash as \\ // Escape backslash as \\
@ -1097,8 +1097,7 @@ fn run_shell_integration_osc633(
// If we're in vscode, run their specific ansi escape sequence. // If we're in vscode, run their specific ansi escape sequence.
// This is helpful for ctrl+g to change directories in the terminal. // This is helpful for ctrl+g to change directories in the terminal.
run_ansi_sequence(&format!( run_ansi_sequence(&format!(
"{}{}{}", "{VSCODE_CWD_PROPERTY_MARKER_PREFIX}{path}{VSCODE_CWD_PROPERTY_MARKER_SUFFIX}"
VSCODE_CWD_PROPERTY_MARKER_PREFIX, path, VSCODE_CWD_PROPERTY_MARKER_SUFFIX
)); ));
perf!( perf!(
@ -1114,10 +1113,7 @@ fn run_shell_integration_osc633(
//OSC 633 ; E ; <commandline> [; <nonce] ST - Explicitly set the command line with an optional nonce. //OSC 633 ; E ; <commandline> [; <nonce] ST - Explicitly set the command line with an optional nonce.
run_ansi_sequence(&format!( run_ansi_sequence(&format!(
"{}{}{}", "{VSCODE_COMMANDLINE_MARKER_PREFIX}{replaced_cmd_text}{VSCODE_COMMANDLINE_MARKER_SUFFIX}"
VSCODE_COMMANDLINE_MARKER_PREFIX,
replaced_cmd_text,
VSCODE_COMMANDLINE_MARKER_SUFFIX
)); ));
} }
} }
@ -1493,7 +1489,7 @@ mod test_auto_cd {
// Parse the input. It must be an auto-cd operation. // Parse the input. It must be an auto-cd operation.
let op = parse_operation(input.to_string(), &engine_state, &stack).unwrap(); let op = parse_operation(input.to_string(), &engine_state, &stack).unwrap();
let ReplOperation::AutoCd { cwd, target, span } = op else { let ReplOperation::AutoCd { cwd, target, span } = op else {
panic!("'{}' was not parsed into an auto-cd operation", input) panic!("'{input}' was not parsed into an auto-cd operation")
}; };
// Perform the auto-cd operation. // Perform the auto-cd operation.


@ -17,12 +17,32 @@ pub struct NuHighlighter {
} }
impl Highlighter for NuHighlighter { impl Highlighter for NuHighlighter {
fn highlight(&self, line: &str, _cursor: usize) -> StyledText { fn highlight(&self, line: &str, cursor: usize) -> StyledText {
let result = highlight_syntax(&self.engine_state, &self.stack, line, cursor);
result.text
}
}
/// Result of a syntax highlight operation
#[derive(Default)]
pub(crate) struct HighlightResult {
/// The highlighted text
pub(crate) text: StyledText,
/// The span of any garbage that was highlighted
pub(crate) found_garbage: Option<Span>,
}
pub(crate) fn highlight_syntax(
engine_state: &EngineState,
stack: &Stack,
line: &str,
cursor: usize,
) -> HighlightResult {
trace!("highlighting: {}", line); trace!("highlighting: {}", line);
let config = self.stack.get_config(&self.engine_state); let config = stack.get_config(engine_state);
let highlight_resolved_externals = config.highlight_resolved_externals; let highlight_resolved_externals = config.highlight_resolved_externals;
let mut working_set = StateWorkingSet::new(&self.engine_state); let mut working_set = StateWorkingSet::new(engine_state);
let block = parse(&mut working_set, None, line.as_bytes(), false); let block = parse(&mut working_set, None, line.as_bytes(), false);
let (shapes, global_span_offset) = { let (shapes, global_span_offset) = {
let mut shapes = flatten_block(&working_set, &block); let mut shapes = flatten_block(&working_set, &block);
@ -35,11 +55,8 @@ impl Highlighter for NuHighlighter {
working_set.get_span_contents(Span::new(span.start, span.end)); working_set.get_span_contents(Span::new(span.start, span.end));
let str_word = String::from_utf8_lossy(str_contents).to_string(); let str_word = String::from_utf8_lossy(str_contents).to_string();
let paths = env::path_str(&self.engine_state, &self.stack, *span).ok(); let paths = env::path_str(engine_state, stack, *span).ok();
#[allow(deprecated)] let res = if let Ok(cwd) = engine_state.cwd(Some(stack)) {
let res = if let Ok(cwd) =
env::current_dir_str(&self.engine_state, &self.stack)
{
which::which_in(str_word, paths.as_ref(), cwd).ok() which::which_in(str_word, paths.as_ref(), cwd).ok()
} else { } else {
which::which_in_global(str_word, paths.as_ref()) which::which_in_global(str_word, paths.as_ref())
@ -52,13 +69,13 @@ impl Highlighter for NuHighlighter {
} }
} }
} }
(shapes, self.engine_state.next_span_start()) (shapes, engine_state.next_span_start())
}; };
let mut output = StyledText::default(); let mut result = HighlightResult::default();
let mut last_seen_span = global_span_offset; let mut last_seen_span = global_span_offset;
let global_cursor_offset = _cursor + global_span_offset; let global_cursor_offset = cursor + global_span_offset;
let matching_brackets_pos = find_matching_brackets( let matching_brackets_pos = find_matching_brackets(
line, line,
&working_set, &working_set,
@ -80,18 +97,28 @@ impl Highlighter for NuHighlighter {
let gap = line let gap = line
[(last_seen_span - global_span_offset)..(shape.0.start - global_span_offset)] [(last_seen_span - global_span_offset)..(shape.0.start - global_span_offset)]
.to_string(); .to_string();
output.push((Style::new(), gap)); result.text.push((Style::new(), gap));
} }
let next_token = line let next_token = line
[(shape.0.start - global_span_offset)..(shape.0.end - global_span_offset)] [(shape.0.start - global_span_offset)..(shape.0.end - global_span_offset)]
.to_string(); .to_string();
let mut add_colored_token = |shape: &FlatShape, text: String| { let mut add_colored_token = |shape: &FlatShape, text: String| {
output.push((get_shape_color(shape.as_str(), &config), text)); result
.text
.push((get_shape_color(shape.as_str(), &config), text));
}; };
match shape.1 { match shape.1 {
FlatShape::Garbage => add_colored_token(&shape.1, next_token), FlatShape::Garbage => {
result.found_garbage.get_or_insert_with(|| {
Span::new(
shape.0.start - global_span_offset,
shape.0.end - global_span_offset,
)
});
add_colored_token(&shape.1, next_token)
}
FlatShape::Nothing => add_colored_token(&shape.1, next_token), FlatShape::Nothing => add_colored_token(&shape.1, next_token),
FlatShape::Binary => add_colored_token(&shape.1, next_token), FlatShape::Binary => add_colored_token(&shape.1, next_token),
FlatShape::Bool => add_colored_token(&shape.1, next_token), FlatShape::Bool => add_colored_token(&shape.1, next_token),
@ -131,7 +158,7 @@ impl Highlighter for NuHighlighter {
if highlight { if highlight {
style = get_matching_brackets_style(style, &config); style = get_matching_brackets_style(style, &config);
} }
output.push((style, text)); result.text.push((style, text));
} }
} }
@ -153,11 +180,10 @@ impl Highlighter for NuHighlighter {
let remainder = line[(last_seen_span - global_span_offset)..].to_string(); let remainder = line[(last_seen_span - global_span_offset)..].to_string();
if !remainder.is_empty() { if !remainder.is_empty() {
output.push((Style::new(), remainder)); result.text.push((Style::new(), remainder));
} }
output result
}
} }
fn split_span_by_highlight_positions( fn split_span_by_highlight_positions(
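The `found_garbage` field introduced above is filled with `get_or_insert_with`, so only the first garbage shape sets it and later garbage tokens leave it untouched. A small standalone sketch of that behaviour using a plain `Option` (the span offsets are made-up numbers):

```rust
fn main() {
    let mut found_garbage: Option<(usize, usize)> = None;
    // Pretend these are the (start, end) offsets of two garbage tokens.
    for span in [(5, 8), (12, 20)] {
        // get_or_insert_with only runs the closure while the Option is None,
        // so the first span wins and the second is ignored.
        found_garbage.get_or_insert_with(|| span);
    }
    assert_eq!(found_garbage, Some((5, 8)));
}
```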


@ -9,14 +9,16 @@ use std::{
use nu_cli::NuCompleter; use nu_cli::NuCompleter;
use nu_engine::eval_block; use nu_engine::eval_block;
use nu_parser::parse; use nu_parser::parse;
use nu_path::expand_tilde; use nu_path::{AbsolutePathBuf, expand_tilde};
use nu_protocol::{Config, PipelineData, debugger::WithoutDebug, engine::StateWorkingSet}; use nu_protocol::{Config, PipelineData, debugger::WithoutDebug, engine::StateWorkingSet};
use nu_std::load_standard_library; use nu_std::load_standard_library;
use nu_test_support::fs;
use reedline::{Completer, Suggestion}; use reedline::{Completer, Suggestion};
use rstest::{fixture, rstest}; use rstest::{fixture, rstest};
use support::{ use support::{
completions_helpers::{ completions_helpers::{
new_dotnu_engine, new_external_engine, new_partial_engine, new_quote_engine, new_dotnu_engine, new_engine_helper, new_external_engine, new_partial_engine,
new_quote_engine,
}, },
file, folder, match_suggestions, match_suggestions_by_string, new_engine, file, folder, match_suggestions, match_suggestions_by_string, new_engine,
}; };
@ -123,7 +125,7 @@ fn custom_completer_with_options(
global_opts, global_opts,
completions completions
.iter() .iter()
.map(|comp| format!("'{}'", comp)) .map(|comp| format!("'{comp}'"))
.collect::<Vec<_>>() .collect::<Vec<_>>()
.join(", "), .join(", "),
completer_opts, completer_opts,
@ -716,6 +718,16 @@ fn external_completer_fallback() {
let expected = [folder("test_a"), file("test_a_symlink"), folder("test_b")]; let expected = [folder("test_a"), file("test_a_symlink"), folder("test_b")];
let suggestions = run_external_completion(block, input); let suggestions = run_external_completion(block, input);
match_suggestions_by_string(&expected, &suggestions); match_suggestions_by_string(&expected, &suggestions);
// issue #15790
let input = "foo `dir with space/`";
let expected = vec!["`dir with space/bar baz`", "`dir with space/foo`"];
let suggestions = run_external_completion_within_pwd(
block,
input,
fs::fixtures().join("external_completions"),
);
match_suggestions(&expected, &suggestions);
} }
/// Fallback to external completions for flags of `sudo` /// Fallback to external completions for flags of `sudo`
@ -2103,11 +2115,15 @@ fn alias_of_another_alias() {
match_suggestions(&expected_paths, &suggestions) match_suggestions(&expected_paths, &suggestions)
} }
fn run_external_completion(completer: &str, input: &str) -> Vec<Suggestion> { fn run_external_completion_within_pwd(
completer: &str,
input: &str,
pwd: AbsolutePathBuf,
) -> Vec<Suggestion> {
let completer = format!("$env.config.completions.external.completer = {completer}"); let completer = format!("$env.config.completions.external.completer = {completer}");
// Create a new engine // Create a new engine
let (_, _, mut engine_state, mut stack) = new_engine(); let (_, _, mut engine_state, mut stack) = new_engine_helper(pwd);
let (block, delta) = { let (block, delta) = {
let mut working_set = StateWorkingSet::new(&engine_state); let mut working_set = StateWorkingSet::new(&engine_state);
let block = parse(&mut working_set, None, completer.as_bytes(), false); let block = parse(&mut working_set, None, completer.as_bytes(), false);
@ -2131,6 +2147,10 @@ fn run_external_completion(completer: &str, input: &str) -> Vec<Suggestion> {
completer.complete(input, input.len()) completer.complete(input, input.len())
} }
fn run_external_completion(completer: &str, input: &str) -> Vec<Suggestion> {
run_external_completion_within_pwd(completer, input, fs::fixtures().join("completions"))
}
#[test] #[test]
fn unknown_command_completion() { fn unknown_command_completion() {
let (_, _, engine, stack) = new_engine(); let (_, _, engine, stack) = new_engine();


@ -5,7 +5,7 @@ edition = "2024"
license = "MIT" license = "MIT"
name = "nu-cmd-base" name = "nu-cmd-base"
repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-base" repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-base"
version = "0.105.1" version = "0.105.2"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -13,10 +13,10 @@ version = "0.105.1"
workspace = true workspace = true
[dependencies] [dependencies]
nu-engine = { path = "../nu-engine", version = "0.105.1", default-features = false } nu-engine = { path = "../nu-engine", version = "0.105.2", default-features = false }
nu-parser = { path = "../nu-parser", version = "0.105.1" } nu-parser = { path = "../nu-parser", version = "0.105.2" }
nu-path = { path = "../nu-path", version = "0.105.1" } nu-path = { path = "../nu-path", version = "0.105.2" }
nu-protocol = { path = "../nu-protocol", version = "0.105.1", default-features = false } nu-protocol = { path = "../nu-protocol", version = "0.105.2", default-features = false }
indexmap = { workspace = true } indexmap = { workspace = true }
miette = { workspace = true } miette = { workspace = true }


@ -1,5 +1,5 @@
use miette::Result; use miette::Result;
use nu_engine::{eval_block, eval_block_with_early_return}; use nu_engine::{eval_block, eval_block_with_early_return, redirect_env};
use nu_parser::parse; use nu_parser::parse;
use nu_protocol::{ use nu_protocol::{
PipelineData, PositionalArg, ShellError, Span, Type, Value, VarId, PipelineData, PositionalArg, ShellError, Span, Type, Value, VarId,
@ -325,19 +325,7 @@ fn run_hook(
} }
// If all went fine, preserve the environment of the called block // If all went fine, preserve the environment of the called block
let caller_env_vars = stack.get_env_var_names(engine_state); redirect_env(engine_state, stack, &callee_stack);
// remove env vars that are present in the caller but not in the callee
// (the callee hid them)
for var in caller_env_vars.iter() {
if !callee_stack.has_env_var(engine_state, var) {
stack.remove_env_var(engine_state, var);
}
}
// add new env vars from callee to caller
for (var, value) in callee_stack.get_stack_env_vars() {
stack.add_env_var(var, value);
}
Ok(pipeline_data) Ok(pipeline_data)
} }
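The hand-rolled loop removed here is replaced by a call to `nu_engine::redirect_env`. A rough sketch of the same merge semantics using plain `HashMap`s instead of nushell's `Stack` types (names and values are placeholders, not the real API):

```rust
use std::collections::HashMap;

// Merge the callee's environment back into the caller's: variables the callee
// hid are dropped, everything else is copied over, overwriting the caller's value.
fn redirect_env_sketch(caller: &mut HashMap<String, String>, callee: &HashMap<String, String>) {
    caller.retain(|var, _| callee.contains_key(var));
    for (var, value) in callee {
        caller.insert(var.clone(), value.clone());
    }
}

fn main() {
    let mut caller = HashMap::from([("HIDDEN".to_string(), "1".to_string())]);
    let callee = HashMap::from([("NEW".to_string(), "2".to_string())]);
    redirect_env_sketch(&mut caller, &callee);
    assert!(!caller.contains_key("HIDDEN"));
    assert_eq!(caller["NEW"], "2");
}
```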


@ -5,7 +5,7 @@ edition = "2024"
license = "MIT" license = "MIT"
name = "nu-cmd-extra" name = "nu-cmd-extra"
repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-extra" repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-extra"
version = "0.105.1" version = "0.105.2"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -16,13 +16,13 @@ bench = false
workspace = true workspace = true
[dependencies] [dependencies]
nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.1" } nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.2" }
nu-engine = { path = "../nu-engine", version = "0.105.1", default-features = false } nu-engine = { path = "../nu-engine", version = "0.105.2", default-features = false }
nu-json = { version = "0.105.1", path = "../nu-json" } nu-json = { version = "0.105.2", path = "../nu-json" }
nu-parser = { path = "../nu-parser", version = "0.105.1" } nu-parser = { path = "../nu-parser", version = "0.105.2" }
nu-pretty-hex = { version = "0.105.1", path = "../nu-pretty-hex" } nu-pretty-hex = { version = "0.105.2", path = "../nu-pretty-hex" }
nu-protocol = { path = "../nu-protocol", version = "0.105.1", default-features = false } nu-protocol = { path = "../nu-protocol", version = "0.105.2", default-features = false }
nu-utils = { path = "../nu-utils", version = "0.105.1", default-features = false } nu-utils = { path = "../nu-utils", version = "0.105.2", default-features = false }
# Potential dependencies for extras # Potential dependencies for extras
heck = { workspace = true } heck = { workspace = true }
@ -37,6 +37,6 @@ itertools = { workspace = true }
mime = { workspace = true } mime = { workspace = true }
[dev-dependencies] [dev-dependencies]
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.105.1" } nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.105.2" }
nu-command = { path = "../nu-command", version = "0.105.1" } nu-command = { path = "../nu-command", version = "0.105.2" }
nu-test-support = { path = "../nu-test-support", version = "0.105.1" } nu-test-support = { path = "../nu-test-support", version = "0.105.2" }


@ -12,7 +12,10 @@ impl Command for UpdateCells {
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("update cells") Signature::build("update cells")
.input_output_types(vec![(Type::table(), Type::table())]) .input_output_types(vec![
(Type::table(), Type::table()),
(Type::record(), Type::record()),
])
.required( .required(
"closure", "closure",
SyntaxShape::Closure(Some(vec![SyntaxShape::Any])), SyntaxShape::Closure(Some(vec![SyntaxShape::Any])),
@ -77,6 +80,15 @@ impl Command for UpdateCells {
"2021-11-18" => Value::test_string(""), "2021-11-18" => Value::test_string(""),
})])), })])),
}, },
Example {
example: r#"{a: 1, b: 2, c: 3} | update cells { $in + 10 }"#,
description: "Update each value in a record.",
result: Some(Value::test_record(record! {
"a" => Value::test_int(11),
"b" => Value::test_int(12),
"c" => Value::test_int(13),
})),
},
] ]
} }
@ -85,7 +97,7 @@ impl Command for UpdateCells {
engine_state: &EngineState, engine_state: &EngineState,
stack: &mut Stack, stack: &mut Stack,
call: &Call, call: &Call,
input: PipelineData, mut input: PipelineData,
) -> Result<PipelineData, ShellError> { ) -> Result<PipelineData, ShellError> {
let head = call.head; let head = call.head;
let closure: Closure = call.req(engine_state, stack, 0)?; let closure: Closure = call.req(engine_state, stack, 0)?;
@ -102,14 +114,51 @@ impl Command for UpdateCells {
let metadata = input.metadata(); let metadata = input.metadata();
Ok(UpdateCellIterator { match input {
PipelineData::Value(
Value::Record {
ref mut val,
internal_span,
},
..,
) => {
let val = val.to_mut();
update_record(
val,
&mut ClosureEval::new(engine_state, stack, closure),
internal_span,
columns.as_ref(),
);
Ok(input)
}
_ => Ok(UpdateCellIterator {
iter: input.into_iter(), iter: input.into_iter(),
closure: ClosureEval::new(engine_state, stack, closure), closure: ClosureEval::new(engine_state, stack, closure),
columns, columns,
span: head, span: head,
} }
.into_pipeline_data(head, engine_state.signals().clone()) .into_pipeline_data(head, engine_state.signals().clone())
.set_metadata(metadata)) .set_metadata(metadata)),
}
}
}
fn update_record(
record: &mut Record,
closure: &mut ClosureEval,
span: Span,
cols: Option<&HashSet<String>>,
) {
if let Some(columns) = cols {
for (col, val) in record.iter_mut() {
if columns.contains(col) {
*val = eval_value(closure, span, std::mem::take(val));
}
}
} else {
for (_, val) in record.iter_mut() {
*val = eval_value(closure, span, std::mem::take(val))
}
} }
} }
@ -128,18 +177,7 @@ impl Iterator for UpdateCellIterator {
let value = if let Value::Record { val, .. } = &mut value { let value = if let Value::Record { val, .. } = &mut value {
let val = val.to_mut(); let val = val.to_mut();
if let Some(columns) = &self.columns { update_record(val, &mut self.closure, self.span, self.columns.as_ref());
for (col, val) in val.iter_mut() {
if columns.contains(col) {
*val = eval_value(&mut self.closure, self.span, std::mem::take(val));
}
}
} else {
for (_, val) in val.iter_mut() {
*val = eval_value(&mut self.closure, self.span, std::mem::take(val))
}
}
value value
} else { } else {
eval_value(&mut self.closure, self.span, value) eval_value(&mut self.closure, self.span, value)
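The new `update_record` helper above is now shared by the record fast path and the streaming iterator. A rough sketch of its column-filtering behaviour with plain std collections (the record contents are placeholder data, not nushell values):

```rust
use std::collections::{HashMap, HashSet};

// Apply `f` to every value, or only to values whose column name is listed in `cols`.
fn update_record(record: &mut HashMap<String, i64>, cols: Option<&HashSet<String>>, f: impl Fn(i64) -> i64) {
    for (col, val) in record.iter_mut() {
        if cols.is_none_or(|c| c.contains(col)) {
            *val = f(*val);
        }
    }
}

fn main() {
    let mut rec = HashMap::from([("a".to_string(), 1), ("b".to_string(), 2)]);
    update_record(&mut rec, None, |v| v + 10); // like `{a: 1, b: 2} | update cells { $in + 10 }`
    assert_eq!(rec["a"], 11);
    assert_eq!(rec["b"], 12);
}
```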


@ -188,7 +188,7 @@ fn get_theme_from_asset_file(
Some(t) => t, Some(t) => t,
None => { None => {
return Err(ShellError::TypeMismatch { return Err(ShellError::TypeMismatch {
err_message: format!("Unknown HTML theme '{}'", theme_name), err_message: format!("Unknown HTML theme '{theme_name}'"),
span: theme_span, span: theme_span,
}); });
} }
@ -774,8 +774,7 @@ mod tests {
for key in required_keys { for key in required_keys {
assert!( assert!(
theme_map.contains_key(key), theme_map.contains_key(key),
"Expected theme to contain key '{}'", "Expected theme to contain key '{key}'"
key
); );
} }
} }
@ -792,15 +791,13 @@ mod tests {
if let Err(err) = result { if let Err(err) = result {
assert!( assert!(
matches!(err, ShellError::TypeMismatch { .. }), matches!(err, ShellError::TypeMismatch { .. }),
"Expected TypeMismatch error, got: {:?}", "Expected TypeMismatch error, got: {err:?}"
err
); );
if let ShellError::TypeMismatch { err_message, span } = err { if let ShellError::TypeMismatch { err_message, span } = err {
assert!( assert!(
err_message.contains("doesnt-exist"), err_message.contains("doesnt-exist"),
"Error message should mention theme name, got: {}", "Error message should mention theme name, got: {err_message}"
err_message
); );
assert_eq!(span.start, 0); assert_eq!(span.start, 0);
assert_eq!(span.end, 13); assert_eq!(span.end, 13);


@ -161,28 +161,28 @@ fn convert_to_smallest_number_type(num: i64, span: Span) -> Value {
let bytes = v.to_ne_bytes(); let bytes = v.to_ne_bytes();
let mut raw_string = "".to_string(); let mut raw_string = "".to_string();
for ch in bytes { for ch in bytes {
raw_string.push_str(&format!("{:08b} ", ch)); raw_string.push_str(&format!("{ch:08b} "));
} }
Value::string(raw_string.trim(), span) Value::string(raw_string.trim(), span)
} else if let Some(v) = num.to_i16() { } else if let Some(v) = num.to_i16() {
let bytes = v.to_ne_bytes(); let bytes = v.to_ne_bytes();
let mut raw_string = "".to_string(); let mut raw_string = "".to_string();
for ch in bytes { for ch in bytes {
raw_string.push_str(&format!("{:08b} ", ch)); raw_string.push_str(&format!("{ch:08b} "));
} }
Value::string(raw_string.trim(), span) Value::string(raw_string.trim(), span)
} else if let Some(v) = num.to_i32() { } else if let Some(v) = num.to_i32() {
let bytes = v.to_ne_bytes(); let bytes = v.to_ne_bytes();
let mut raw_string = "".to_string(); let mut raw_string = "".to_string();
for ch in bytes { for ch in bytes {
raw_string.push_str(&format!("{:08b} ", ch)); raw_string.push_str(&format!("{ch:08b} "));
} }
Value::string(raw_string.trim(), span) Value::string(raw_string.trim(), span)
} else { } else {
let bytes = num.to_ne_bytes(); let bytes = num.to_ne_bytes();
let mut raw_string = "".to_string(); let mut raw_string = "".to_string();
for ch in bytes { for ch in bytes {
raw_string.push_str(&format!("{:08b} ", ch)); raw_string.push_str(&format!("{ch:08b} "));
} }
Value::string(raw_string.trim(), span) Value::string(raw_string.trim(), span)
} }
@ -193,7 +193,7 @@ fn action(input: &Value, _args: &Arguments, span: Span) -> Value {
Value::Binary { val, .. } => { Value::Binary { val, .. } => {
let mut raw_string = "".to_string(); let mut raw_string = "".to_string();
for ch in val { for ch in val {
raw_string.push_str(&format!("{:08b} ", ch)); raw_string.push_str(&format!("{ch:08b} "));
} }
Value::string(raw_string.trim(), span) Value::string(raw_string.trim(), span)
} }
@ -204,7 +204,7 @@ fn action(input: &Value, _args: &Arguments, span: Span) -> Value {
let raw_bytes = val.as_bytes(); let raw_bytes = val.as_bytes();
let mut raw_string = "".to_string(); let mut raw_string = "".to_string();
for ch in raw_bytes { for ch in raw_bytes {
raw_string.push_str(&format!("{:08b} ", ch)); raw_string.push_str(&format!("{ch:08b} "));
} }
Value::string(raw_string.trim(), span) Value::string(raw_string.trim(), span)
} }


@ -16,6 +16,11 @@ impl Command for FormatNumber {
fn signature(&self) -> nu_protocol::Signature { fn signature(&self) -> nu_protocol::Signature {
Signature::build("format number") Signature::build("format number")
.input_output_types(vec![(Type::Number, Type::record())]) .input_output_types(vec![(Type::Number, Type::record())])
.switch(
"no-prefix",
"don't include the binary, hex or octal prefixes",
Some('n'),
)
.category(Category::Conversions) .category(Category::Conversions)
} }
@ -24,20 +29,36 @@ impl Command for FormatNumber {
} }
fn examples(&self) -> Vec<Example> { fn examples(&self) -> Vec<Example> {
vec![Example { vec![
Example {
description: "Get a record containing multiple formats for the number 42", description: "Get a record containing multiple formats for the number 42",
example: "42 | format number", example: "42 | format number",
result: Some(Value::test_record(record! { result: Some(Value::test_record(record! {
"binary" => Value::test_string("0b101010"),
"debug" => Value::test_string("42"), "debug" => Value::test_string("42"),
"display" => Value::test_string("42"), "display" => Value::test_string("42"),
"binary" => Value::test_string("0b101010"),
"lowerexp" => Value::test_string("4.2e1"), "lowerexp" => Value::test_string("4.2e1"),
"lowerhex" => Value::test_string("0x2a"),
"octal" => Value::test_string("0o52"),
"upperexp" => Value::test_string("4.2E1"), "upperexp" => Value::test_string("4.2E1"),
"lowerhex" => Value::test_string("0x2a"),
"upperhex" => Value::test_string("0x2A"), "upperhex" => Value::test_string("0x2A"),
"octal" => Value::test_string("0o52"),
})), })),
}] },
Example {
description: "Format float without prefixes",
example: "3.14 | format number --no-prefix",
result: Some(Value::test_record(record! {
"debug" => Value::test_string("3.14"),
"display" => Value::test_string("3.14"),
"binary" => Value::test_string("100000000001001000111101011100001010001111010111000010100011111"),
"lowerexp" => Value::test_string("3.14e0"),
"upperexp" => Value::test_string("3.14E0"),
"lowerhex" => Value::test_string("40091eb851eb851f"),
"upperhex" => Value::test_string("40091EB851EB851F"),
"octal" => Value::test_string("400110753412172702437"),
})),
},
]
} }
fn run( fn run(
@ -59,14 +80,24 @@ pub(crate) fn format_number(
) -> Result<PipelineData, ShellError> { ) -> Result<PipelineData, ShellError> {
let cell_paths: Vec<CellPath> = call.rest(engine_state, stack, 0)?; let cell_paths: Vec<CellPath> = call.rest(engine_state, stack, 0)?;
let args = CellPathOnlyArgs::from(cell_paths); let args = CellPathOnlyArgs::from(cell_paths);
if call.has_flag(engine_state, stack, "no-prefix")? {
operate(
action_no_prefix,
args,
input,
call.head,
engine_state.signals(),
)
} else {
operate(action, args, input, call.head, engine_state.signals()) operate(action, args, input, call.head, engine_state.signals())
} }
}
fn action(input: &Value, _args: &CellPathOnlyArgs, span: Span) -> Value { fn action(input: &Value, _args: &CellPathOnlyArgs, span: Span) -> Value {
match input { match input {
Value::Float { val, .. } => format_f64(*val, span), Value::Float { val, .. } => format_f64(*val, false, span),
Value::Int { val, .. } => format_i64(*val, span), Value::Int { val, .. } => format_i64(*val, false, span),
Value::Filesize { val, .. } => format_i64(val.get(), span), Value::Filesize { val, .. } => format_i64(val.get(), false, span),
// Propagate errors by explicitly matching them before the final case. // Propagate errors by explicitly matching them before the final case.
Value::Error { .. } => input.clone(), Value::Error { .. } => input.clone(),
other => Value::error( other => Value::error(
@ -81,33 +112,80 @@ fn action(input: &Value, _args: &CellPathOnlyArgs, span: Span) -> Value {
} }
} }
fn format_i64(num: i64, span: Span) -> Value { fn action_no_prefix(input: &Value, _args: &CellPathOnlyArgs, span: Span) -> Value {
match input {
Value::Float { val, .. } => format_f64(*val, true, span),
Value::Int { val, .. } => format_i64(*val, true, span),
Value::Filesize { val, .. } => format_i64(val.get(), true, span),
// Propagate errors by explicitly matching them before the final case.
Value::Error { .. } => input.clone(),
other => Value::error(
ShellError::OnlySupportsThisInputType {
exp_input_type: "float, int, or filesize".into(),
wrong_type: other.get_type().to_string(),
dst_span: span,
src_span: other.span(),
},
span,
),
}
}
fn format_i64(num: i64, no_prefix: bool, span: Span) -> Value {
Value::record( Value::record(
record! { record! {
"binary" => Value::string(format!("{num:#b}"), span),
"debug" => Value::string(format!("{num:#?}"), span), "debug" => Value::string(format!("{num:#?}"), span),
"display" => Value::string(format!("{num}"), span), "display" => Value::string(format!("{num}"), span),
"binary" => Value::string(
if no_prefix { format!("{num:b}") } else { format!("{num:#b}") },
span,
),
"lowerexp" => Value::string(format!("{num:#e}"), span), "lowerexp" => Value::string(format!("{num:#e}"), span),
"lowerhex" => Value::string(format!("{num:#x}"), span),
"octal" => Value::string(format!("{num:#o}"), span),
"upperexp" => Value::string(format!("{num:#E}"), span), "upperexp" => Value::string(format!("{num:#E}"), span),
"upperhex" => Value::string(format!("{num:#X}"), span), "lowerhex" => Value::string(
if no_prefix { format!("{num:x}") } else { format!("{num:#x}") },
span,
),
"upperhex" => Value::string(
if no_prefix { format!("{num:X}") } else { format!("{num:#X}") },
span,
),
"octal" => Value::string(
if no_prefix { format!("{num:o}") } else { format!("{num:#o}") },
span,
)
}, },
span, span,
) )
} }
fn format_f64(num: f64, span: Span) -> Value { fn format_f64(num: f64, no_prefix: bool, span: Span) -> Value {
Value::record( Value::record(
record! { record! {
"binary" => Value::string(format!("{:b}", num.to_bits()), span),
"debug" => Value::string(format!("{num:#?}"), span), "debug" => Value::string(format!("{num:#?}"), span),
"display" => Value::string(format!("{num}"), span), "display" => Value::string(format!("{num}"), span),
"binary" => Value::string(
if no_prefix {
format!("{:b}", num.to_bits())
} else {
format!("{:#b}", num.to_bits())
},
span,
),
"lowerexp" => Value::string(format!("{num:#e}"), span), "lowerexp" => Value::string(format!("{num:#e}"), span),
"lowerhex" => Value::string(format!("{:0x}", num.to_bits()), span),
"octal" => Value::string(format!("{:0o}", num.to_bits()), span),
"upperexp" => Value::string(format!("{num:#E}"), span), "upperexp" => Value::string(format!("{num:#E}"), span),
"upperhex" => Value::string(format!("{:0X}", num.to_bits()), span), "lowerhex" => Value::string(
if no_prefix { format!("{:x}", num.to_bits()) } else { format!("{:#x}", num.to_bits()) },
span,
),
"upperhex" => Value::string(
if no_prefix { format!("{:X}", num.to_bits()) } else { format!("{:#X}", num.to_bits()) },
span,
),
"octal" => Value::string(
if no_prefix { format!("{:o}", num.to_bits()) } else { format!("{:#o}", num.to_bits()) },
span,
)
}, },
span, span,
) )
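The `--no-prefix` switch above boils down to Rust's `#` (alternate) format flag: with `#` the radix prefix is emitted, without it only the digits are, and floats are rendered from their IEEE-754 bit pattern via `f64::to_bits`. A small standalone sketch of the flag difference (the literal values are just illustrative inputs):

```rust
fn main() {
    let num: i64 = 42;
    println!("{num:#b} / {num:b}"); // 0b101010 / 101010
    println!("{num:#x} / {num:x}"); // 0x2a / 2a
    println!("{num:#o} / {num:o}"); // 0o52 / 52

    // Floats go through their raw bit pattern, as in format_f64 above.
    let f: f64 = 3.14;
    println!("{:x}", f.to_bits()); // 40091eb851eb851f
}
```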


@ -6,7 +6,7 @@ repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-lang"
edition = "2024" edition = "2024"
license = "MIT" license = "MIT"
name = "nu-cmd-lang" name = "nu-cmd-lang"
version = "0.105.1" version = "0.105.2"
[lib] [lib]
bench = false bench = false
@ -15,17 +15,17 @@ bench = false
workspace = true workspace = true
[dependencies] [dependencies]
nu-engine = { path = "../nu-engine", version = "0.105.1", default-features = false } nu-engine = { path = "../nu-engine", version = "0.105.2", default-features = false }
nu-parser = { path = "../nu-parser", version = "0.105.1" } nu-parser = { path = "../nu-parser", version = "0.105.2" }
nu-protocol = { path = "../nu-protocol", version = "0.105.1", default-features = false } nu-protocol = { path = "../nu-protocol", version = "0.105.2", default-features = false }
nu-utils = { path = "../nu-utils", version = "0.105.1", default-features = false } nu-utils = { path = "../nu-utils", version = "0.105.2", default-features = false }
nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.1" } nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.2" }
itertools = { workspace = true } itertools = { workspace = true }
shadow-rs = { version = "1.1", default-features = false } shadow-rs = { version = "1.2", default-features = false }
[build-dependencies] [build-dependencies]
shadow-rs = { version = "1.1", default-features = false, features = ["build"] } shadow-rs = { version = "1.2", default-features = false, features = ["build"] }
[dev-dependencies] [dev-dependencies]
quickcheck = { workspace = true } quickcheck = { workspace = true }
@ -43,8 +43,3 @@ plugin = [
"nu-protocol/plugin", "nu-protocol/plugin",
"os", "os",
] ]
trash-support = []
sqlite = []
static-link-openssl = []
system-clipboard = []


@ -296,7 +296,7 @@ fn run(
} else { } else {
let value = stream.into_value(); let value = stream.into_value();
let base_description = value.get_type().to_string(); let base_description = value.get_type().to_string();
Value::string(format!("{} (stream)", base_description), head) Value::string(format!("{base_description} (stream)"), head)
} }
} }
PipelineData::Value(value, ..) => { PipelineData::Value(value, ..) => {


@ -229,7 +229,7 @@ fn make_other_error(value: &Value, throw_span: Option<Span>) -> ShellError {
error: "invalid error format.".into(), error: "invalid error format.".into(),
msg: "`$.label.start` should be smaller than `$.label.end`".into(), msg: "`$.label.start` should be smaller than `$.label.end`".into(),
span: Some(label_span), span: Some(label_span),
help: Some(format!("{} > {}", span_start, span_end)), help: Some(format!("{span_start} > {span_end}")),
inner: vec![], inner: vec![],
}; };
} }


@ -69,5 +69,5 @@ pub use return_::Return;
pub use scope::*; pub use scope::*;
pub use try_::Try; pub use try_::Try;
pub use use_::Use; pub use use_::Use;
pub use version::Version; pub use version::{VERSION_NU_FEATURES, Version};
pub use while_::While; pub use while_::While;


@ -86,12 +86,16 @@ impl Command for OverlayHide {
vec![] vec![]
}; };
// also restore env vars which has been hidden
let env_vars_to_restore = stack.get_hidden_env_vars(&overlay_name.item, engine_state);
stack.remove_overlay(&overlay_name.item); stack.remove_overlay(&overlay_name.item);
for (name, val) in env_vars_to_restore {
stack.add_env_var(name, val);
}
for (name, val) in env_vars_to_keep { for (name, val) in env_vars_to_keep {
stack.add_env_var(name, val); stack.add_env_var(name, val);
} }
Ok(PipelineData::empty()) Ok(PipelineData::empty())
} }


@ -1,11 +1,48 @@
use std::sync::OnceLock; use std::{borrow::Cow, sync::OnceLock};
use itertools::Itertools;
use nu_engine::command_prelude::*; use nu_engine::command_prelude::*;
use nu_protocol::engine::StateWorkingSet; use nu_protocol::engine::StateWorkingSet;
use shadow_rs::shadow; use shadow_rs::shadow;
shadow!(build); shadow!(build);
/// Static container for the cargo features used by the `version` command.
///
/// This `OnceLock` holds the features from `nu`.
/// When you build `nu_cmd_lang`, Cargo doesn't pass along the same features that `nu` itself uses.
/// By setting this static before calling `version`, you make it show `nu`'s features instead
/// of `nu_cmd_lang`'s.
///
/// Embedders can set this to any feature list they need, but in most cases you'll probably want to
/// pass the cargo features of your host binary.
///
/// # How to get cargo features in your build script
///
/// In your binary's build script:
/// ```rust,ignore
/// // Re-export CARGO_CFG_FEATURE to the main binary.
/// // It holds all the features that cargo sets for your binary as a comma-separated list.
/// println!(
/// "cargo:rustc-env=NU_FEATURES={}",
/// std::env::var("CARGO_CFG_FEATURE").expect("set by cargo")
/// );
/// ```
///
/// Then, before you call `version`:
/// ```rust,ignore
/// // This uses static strings, but since we're using `Cow`, you can also pass owned strings.
/// let features = env!("NU_FEATURES")
/// .split(',')
/// .map(Cow::Borrowed)
/// .collect();
///
/// nu_cmd_lang::VERSION_NU_FEATURES
/// .set(features)
/// .expect("couldn't set VERSION_NU_FEATURES");
/// ```
pub static VERSION_NU_FEATURES: OnceLock<Vec<Cow<'static, str>>> = OnceLock::new();
#[derive(Clone)] #[derive(Clone)]
pub struct Version; pub struct Version;
@ -113,7 +150,17 @@ pub fn version(engine_state: &EngineState, span: Span) -> Result<PipelineData, S
record.push( record.push(
"features", "features",
Value::string(features_enabled().join(", "), span), Value::string(
VERSION_NU_FEATURES
.get()
.as_ref()
.map(|v| v.as_slice())
.unwrap_or_default()
.iter()
.filter(|f| !f.starts_with("dep:"))
.join(", "),
span,
),
); );
#[cfg(not(feature = "plugin"))] #[cfg(not(feature = "plugin"))]
@ -164,42 +211,12 @@ fn global_allocator() -> &'static str {
"standard" "standard"
} }
fn features_enabled() -> Vec<String> {
let mut names = vec!["default".to_string()];
// NOTE: There should be another way to know features on.
#[cfg(feature = "trash-support")]
{
names.push("trash".to_string());
}
#[cfg(feature = "sqlite")]
{
names.push("sqlite".to_string());
}
#[cfg(feature = "static-link-openssl")]
{
names.push("static-link-openssl".to_string());
}
#[cfg(feature = "system-clipboard")]
{
names.push("system-clipboard".to_string());
}
names.sort();
names
}
#[cfg(test)] #[cfg(test)]
mod test { mod test {
#[test] #[test]
fn test_examples() { fn test_examples() {
use super::Version; use super::Version;
use crate::test_examples; use crate::test_examples;
test_examples(Version {}) test_examples(Version)
} }
} }


@ -221,23 +221,23 @@ impl std::fmt::Debug for DebuggableValue<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self.0 { match self.0 {
Value::Bool { val, .. } => { Value::Bool { val, .. } => {
write!(f, "{:?}", val) write!(f, "{val:?}")
} }
Value::Int { val, .. } => { Value::Int { val, .. } => {
write!(f, "{:?}", val) write!(f, "{val:?}")
} }
Value::Float { val, .. } => { Value::Float { val, .. } => {
write!(f, "{:?}f", val) write!(f, "{val:?}f")
} }
Value::Filesize { val, .. } => { Value::Filesize { val, .. } => {
write!(f, "Filesize({:?})", val) write!(f, "Filesize({val:?})")
} }
Value::Duration { val, .. } => { Value::Duration { val, .. } => {
let duration = std::time::Duration::from_nanos(*val as u64); let duration = std::time::Duration::from_nanos(*val as u64);
write!(f, "Duration({:?})", duration) write!(f, "Duration({duration:?})")
} }
Value::Date { val, .. } => { Value::Date { val, .. } => {
write!(f, "Date({:?})", val) write!(f, "Date({val:?})")
} }
Value::Range { val, .. } => match **val { Value::Range { val, .. } => match **val {
Range::IntRange(range) => match range.end() { Range::IntRange(range) => match range.end() {
@ -280,7 +280,7 @@ impl std::fmt::Debug for DebuggableValue<'_> {
}, },
}, },
Value::String { val, .. } | Value::Glob { val, .. } => { Value::String { val, .. } | Value::Glob { val, .. } => {
write!(f, "{:?}", val) write!(f, "{val:?}")
} }
Value::Record { val, .. } => { Value::Record { val, .. } => {
write!(f, "{{")?; write!(f, "{{")?;
@ -305,22 +305,22 @@ impl std::fmt::Debug for DebuggableValue<'_> {
write!(f, "]") write!(f, "]")
} }
Value::Closure { val, .. } => { Value::Closure { val, .. } => {
write!(f, "Closure({:?})", val) write!(f, "Closure({val:?})")
} }
Value::Nothing { .. } => { Value::Nothing { .. } => {
write!(f, "Nothing") write!(f, "Nothing")
} }
Value::Error { error, .. } => { Value::Error { error, .. } => {
write!(f, "Error({:?})", error) write!(f, "Error({error:?})")
} }
Value::Binary { val, .. } => { Value::Binary { val, .. } => {
write!(f, "Binary({:?})", val) write!(f, "Binary({val:?})")
} }
Value::CellPath { val, .. } => { Value::CellPath { val, .. } => {
write!(f, "CellPath({:?})", val.to_string()) write!(f, "CellPath({:?})", val.to_string())
} }
Value::Custom { val, .. } => { Value::Custom { val, .. } => {
write!(f, "CustomValue({:?})", val) write!(f, "CustomValue({val:?})")
} }
} }
} }


@ -5,7 +5,7 @@ edition = "2024"
license = "MIT" license = "MIT"
name = "nu-cmd-plugin" name = "nu-cmd-plugin"
repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-plugin" repository = "https://github.com/nushell/nushell/tree/main/crates/nu-cmd-plugin"
version = "0.105.1" version = "0.105.2"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -13,10 +13,10 @@ version = "0.105.1"
workspace = true workspace = true
[dependencies] [dependencies]
nu-engine = { path = "../nu-engine", version = "0.105.1" } nu-engine = { path = "../nu-engine", version = "0.105.2" }
nu-path = { path = "../nu-path", version = "0.105.1" } nu-path = { path = "../nu-path", version = "0.105.2" }
nu-protocol = { path = "../nu-protocol", version = "0.105.1", features = ["plugin"] } nu-protocol = { path = "../nu-protocol", version = "0.105.2", features = ["plugin"] }
nu-plugin-engine = { path = "../nu-plugin-engine", version = "0.105.1" } nu-plugin-engine = { path = "../nu-plugin-engine", version = "0.105.2" }
itertools = { workspace = true } itertools = { workspace = true }


@ -5,7 +5,7 @@ repository = "https://github.com/nushell/nushell/tree/main/crates/nu-color-confi
edition = "2024" edition = "2024"
license = "MIT" license = "MIT"
name = "nu-color-config" name = "nu-color-config"
version = "0.105.1" version = "0.105.2"
[lib] [lib]
bench = false bench = false
@ -14,12 +14,12 @@ bench = false
workspace = true workspace = true
[dependencies] [dependencies]
nu-protocol = { path = "../nu-protocol", version = "0.105.1", default-features = false } nu-protocol = { path = "../nu-protocol", version = "0.105.2", default-features = false }
nu-engine = { path = "../nu-engine", version = "0.105.1", default-features = false } nu-engine = { path = "../nu-engine", version = "0.105.2", default-features = false }
nu-json = { path = "../nu-json", version = "0.105.1" } nu-json = { path = "../nu-json", version = "0.105.2" }
nu-ansi-term = { workspace = true } nu-ansi-term = { workspace = true }
serde = { workspace = true, features = ["derive"] } serde = { workspace = true, features = ["derive"] }
[dev-dependencies] [dev-dependencies]
nu-test-support = { path = "../nu-test-support", version = "0.105.1" } nu-test-support = { path = "../nu-test-support", version = "0.105.2" }


@ -5,7 +5,7 @@ edition = "2024"
license = "MIT" license = "MIT"
name = "nu-command" name = "nu-command"
repository = "https://github.com/nushell/nushell/tree/main/crates/nu-command" repository = "https://github.com/nushell/nushell/tree/main/crates/nu-command"
version = "0.105.1" version = "0.105.2"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -16,21 +16,21 @@ bench = false
workspace = true workspace = true
[dependencies] [dependencies]
nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.1" } nu-cmd-base = { path = "../nu-cmd-base", version = "0.105.2" }
nu-color-config = { path = "../nu-color-config", version = "0.105.1" } nu-color-config = { path = "../nu-color-config", version = "0.105.2" }
nu-engine = { path = "../nu-engine", version = "0.105.1", default-features = false } nu-engine = { path = "../nu-engine", version = "0.105.2", default-features = false }
nu-glob = { path = "../nu-glob", version = "0.105.1" } nu-glob = { path = "../nu-glob", version = "0.105.2" }
nu-json = { path = "../nu-json", version = "0.105.1" } nu-json = { path = "../nu-json", version = "0.105.2" }
nu-parser = { path = "../nu-parser", version = "0.105.1" } nu-parser = { path = "../nu-parser", version = "0.105.2" }
nu-path = { path = "../nu-path", version = "0.105.1" } nu-path = { path = "../nu-path", version = "0.105.2" }
nu-pretty-hex = { path = "../nu-pretty-hex", version = "0.105.1" } nu-pretty-hex = { path = "../nu-pretty-hex", version = "0.105.2" }
nu-protocol = { path = "../nu-protocol", version = "0.105.1", default-features = false } nu-protocol = { path = "../nu-protocol", version = "0.105.2", default-features = false }
nu-system = { path = "../nu-system", version = "0.105.1" } nu-system = { path = "../nu-system", version = "0.105.2" }
nu-table = { path = "../nu-table", version = "0.105.1" } nu-table = { path = "../nu-table", version = "0.105.2" }
nu-term-grid = { path = "../nu-term-grid", version = "0.105.1" } nu-term-grid = { path = "../nu-term-grid", version = "0.105.2" }
nu-utils = { path = "../nu-utils", version = "0.105.1", default-features = false } nu-utils = { path = "../nu-utils", version = "0.105.2", default-features = false }
nu-ansi-term = { workspace = true } nu-ansi-term = { workspace = true }
nuon = { path = "../nuon", version = "0.105.1" } nuon = { path = "../nuon", version = "0.105.2" }
alphanumeric-sort = { workspace = true } alphanumeric-sort = { workspace = true }
base64 = { workspace = true } base64 = { workspace = true }
@ -226,8 +226,8 @@ sqlite = ["rusqlite"]
trash-support = ["trash"] trash-support = ["trash"]
[dev-dependencies] [dev-dependencies]
nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.105.1" } nu-cmd-lang = { path = "../nu-cmd-lang", version = "0.105.2" }
nu-test-support = { path = "../nu-test-support", version = "0.105.1" } nu-test-support = { path = "../nu-test-support", version = "0.105.2" }
dirs = { workspace = true } dirs = { workspace = true }
mockito = { workspace = true, default-features = false } mockito = { workspace = true, default-features = false }


@ -89,7 +89,7 @@ impl Command for Histogram {
"frequency-column-name can't be {}", "frequency-column-name can't be {}",
forbidden_column_names forbidden_column_names
.iter() .iter()
.map(|val| format!("'{}'", val)) .map(|val| format!("'{val}'"))
.collect::<Vec<_>>() .collect::<Vec<_>>()
.join(", ") .join(", ")
), ),


@ -142,7 +142,7 @@ fn into_binary(
} }
} }
fn action(input: &Value, _args: &Arguments, span: Span) -> Value { fn action(input: &Value, args: &Arguments, span: Span) -> Value {
let value = match input { let value = match input {
Value::Binary { .. } => input.clone(), Value::Binary { .. } => input.clone(),
Value::Int { val, .. } => Value::binary(val.to_ne_bytes().to_vec(), span), Value::Int { val, .. } => Value::binary(val.to_ne_bytes().to_vec(), span),
@ -168,7 +168,7 @@ fn action(input: &Value, _args: &Arguments, span: Span) -> Value {
), ),
}; };
if _args.compact { if args.compact {
let val_span = value.span(); let val_span = value.span();
if let Value::Binary { val, .. } = value { if let Value::Binary { val, .. } = value {
let val = if cfg!(target_endian = "little") { let val = if cfg!(target_endian = "little") {


@ -678,7 +678,7 @@ fn parse_value_from_record_as_u32(
Value::Int { val, .. } => { Value::Int { val, .. } => {
if *val < 0 || *val > u32::MAX as i64 { if *val < 0 || *val > u32::MAX as i64 {
return Err(ShellError::IncorrectValue { return Err(ShellError::IncorrectValue {
msg: format!("incorrect value for {}", col), msg: format!("incorrect value for {col}"),
val_span: *head, val_span: *head,
call_span: *span, call_span: *span,
}); });


@ -368,8 +368,7 @@ fn merge_record(record: &Record, head: Span, span: Span) -> Result<Value, ShellE
if !ALLOWED_SIGNS.contains(&val.as_str()) { if !ALLOWED_SIGNS.contains(&val.as_str()) {
let allowed_signs = ALLOWED_SIGNS.join(", "); let allowed_signs = ALLOWED_SIGNS.join(", ");
return Err(ShellError::IncorrectValue { return Err(ShellError::IncorrectValue {
msg: format!("Invalid sign. Allowed signs are {}", allowed_signs) msg: format!("Invalid sign. Allowed signs are {allowed_signs}").to_string(),
.to_string(),
val_span: sign.span(), val_span: sign.span(),
call_span: head, call_span: head,
}); });


@ -122,8 +122,8 @@ impl Table {
.conn .conn
.query_row(&table_exists_query, [], |row| row.get(0)) .query_row(&table_exists_query, [], |row| row.get(0))
.map_err(|err| ShellError::GenericError { .map_err(|err| ShellError::GenericError {
error: format!("{:#?}", err), error: format!("{err:#?}"),
msg: format!("{:#?}", err), msg: format!("{err:#?}"),
span: None, span: None,
help: None, help: None,
inner: Vec::new(), inner: Vec::new(),
@ -257,7 +257,7 @@ fn insert_in_transaction(
let insert_statement = format!( let insert_statement = format!(
"INSERT INTO [{}] ({}) VALUES ({})", "INSERT INTO [{}] ({}) VALUES ({})",
table_name, table_name,
Itertools::intersperse(val.columns().map(|c| format!("`{}`", c)), ", ".to_string()) Itertools::intersperse(val.columns().map(|c| format!("`{c}`")), ", ".to_string())
.collect::<String>(), .collect::<String>(),
Itertools::intersperse(itertools::repeat_n("?", val.len()), ", ").collect::<String>(), Itertools::intersperse(itertools::repeat_n("?", val.len()), ", ").collect::<String>(),
); );
@ -381,7 +381,7 @@ fn get_columns_with_sqlite_types(
.map(|name| (format!("`{}`", name.0), name.1)) .map(|name| (format!("`{}`", name.0), name.1))
.any(|(name, _)| name == *c) .any(|(name, _)| name == *c)
{ {
columns.push((format!("`{}`", c), nu_value_to_sqlite_type(v)?)); columns.push((format!("`{c}`"), nu_value_to_sqlite_type(v)?));
} }
} }


@ -112,16 +112,31 @@ impl SQLiteDatabase {
if self.path == PathBuf::from(MEMORY_DB) { if self.path == PathBuf::from(MEMORY_DB) {
open_connection_in_memory_custom() open_connection_in_memory_custom()
} else { } else {
Connection::open(&self.path).map_err(|e| ShellError::GenericError { let conn = Connection::open(&self.path).map_err(|e| ShellError::GenericError {
error: "Failed to open SQLite database from open_connection".into(), error: "Failed to open SQLite database from open_connection".into(),
msg: e.to_string(), msg: e.to_string(),
span: None, span: None,
help: None, help: None,
inner: vec![], inner: vec![],
}) })?;
conn.busy_handler(Some(SQLiteDatabase::sleeper))
.map_err(|e| ShellError::GenericError {
error: "Failed to set busy handler for SQLite database".into(),
msg: e.to_string(),
span: None,
help: None,
inner: vec![],
})?;
Ok(conn)
} }
} }
fn sleeper(attempts: i32) -> bool {
log::warn!("SQLITE_BUSY, retrying after 250ms (attempt {})", attempts);
std::thread::sleep(std::time::Duration::from_millis(250));
true
}
pub fn get_tables(&self, conn: &Connection) -> Result<Vec<DbTable>, SqliteError> { pub fn get_tables(&self, conn: &Connection) -> Result<Vec<DbTable>, SqliteError> {
let mut table_names = let mut table_names =
conn.prepare("SELECT name FROM sqlite_master WHERE type = 'table'")?; conn.prepare("SELECT name FROM sqlite_master WHERE type = 'table'")?;
@ -158,7 +173,7 @@ impl SQLiteDatabase {
filename: String, filename: String,
) -> Result<(), SqliteError> { ) -> Result<(), SqliteError> {
//vacuum main into 'c:\\temp\\foo.db' //vacuum main into 'c:\\temp\\foo.db'
conn.execute(&format!("vacuum main into '{}'", filename), [])?; conn.execute(&format!("vacuum main into '{filename}'"), [])?;
Ok(()) Ok(())
} }
@ -668,13 +683,23 @@ pub fn convert_sqlite_value_to_nu_value(value: ValueRef, span: Span) -> Value {
pub fn open_connection_in_memory_custom() -> Result<Connection, ShellError> { pub fn open_connection_in_memory_custom() -> Result<Connection, ShellError> {
let flags = OpenFlags::default(); let flags = OpenFlags::default();
let conn =
Connection::open_with_flags(MEMORY_DB, flags).map_err(|e| ShellError::GenericError { Connection::open_with_flags(MEMORY_DB, flags).map_err(|e| ShellError::GenericError {
error: "Failed to open SQLite custom connection in memory".into(), error: "Failed to open SQLite custom connection in memory".into(),
msg: e.to_string(), msg: e.to_string(),
span: Some(Span::test_data()), span: Some(Span::test_data()),
help: None, help: None,
inner: vec![], inner: vec![],
}) })?;
conn.busy_handler(Some(SQLiteDatabase::sleeper))
.map_err(|e| ShellError::GenericError {
error: "Failed to set busy handler for SQLite custom connection in memory".into(),
msg: e.to_string(),
span: Some(Span::test_data()),
help: None,
inner: vec![],
})?;
Ok(conn)
} }
pub fn open_connection_in_memory() -> Result<Connection, ShellError> { pub fn open_connection_in_memory() -> Result<Connection, ShellError> {
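The new `busy_handler` calls register a callback that rusqlite invokes whenever a statement hits `SQLITE_BUSY`; returning `true` tells SQLite to retry instead of surfacing the error. A minimal standalone sketch assuming the `rusqlite` crate (the 250 ms backoff mirrors the handler above):

```rust
use rusqlite::Connection;

fn sleeper(attempts: i32) -> bool {
    // Called on SQLITE_BUSY; sleep briefly and return true to keep retrying.
    eprintln!("SQLITE_BUSY, retrying after 250ms (attempt {attempts})");
    std::thread::sleep(std::time::Duration::from_millis(250));
    true
}

fn main() -> rusqlite::Result<()> {
    let conn = Connection::open_in_memory()?;
    conn.busy_handler(Some(sleeper))?;
    conn.execute_batch("CREATE TABLE t (x INTEGER); INSERT INTO t VALUES (1);")?;
    Ok(())
}
```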


@ -87,10 +87,9 @@ impl Command for Metadata {
.into_pipeline_data(), .into_pipeline_data(),
) )
} }
None => Ok( None => {
Value::record(build_metadata_record(input.metadata().as_ref(), head), head) Ok(Value::record(build_metadata_record(&input, head), head).into_pipeline_data())
.into_pipeline_data(), }
),
} }
} }
@ -116,19 +115,7 @@ fn build_metadata_record_value(
head: Span, head: Span,
) -> Value { ) -> Value {
let mut record = Record::new(); let mut record = Record::new();
record.push("span", arg.span().into_value(head));
let span = arg.span();
record.push(
"span",
Value::record(
record! {
"start" => Value::int(span.start as i64,span),
"end" => Value::int(span.end as i64, span),
},
head,
),
);
Value::record(extend_record_with_metadata(record, metadata, head), head) Value::record(extend_record_with_metadata(record, metadata, head), head)
} }


@ -42,10 +42,7 @@ impl Command for MetadataAccess {
// `ClosureEvalOnce` is not used as it uses `Stack::captures_to_stack` rather than // `ClosureEvalOnce` is not used as it uses `Stack::captures_to_stack` rather than
// `Stack::captures_to_stack_preserve_out_dest`. This command shouldn't collect streams // `Stack::captures_to_stack_preserve_out_dest`. This command shouldn't collect streams
let mut callee_stack = caller_stack.captures_to_stack_preserve_out_dest(closure.captures); let mut callee_stack = caller_stack.captures_to_stack_preserve_out_dest(closure.captures);
let metadata_record = Value::record( let metadata_record = Value::record(build_metadata_record(&input, call.head), call.head);
build_metadata_record(input.metadata().as_ref(), call.head),
call.head,
);
if let Some(var_id) = block.signature.get_positional(0).and_then(|var| var.var_id) { if let Some(var_id) = block.signature.get_positional(0).and_then(|var| var.var_id) {
callee_stack.add_var(var_id, metadata_record) callee_stack.add_var(var_id, metadata_record)
@ -58,12 +55,10 @@ impl Command for MetadataAccess {
fn examples(&self) -> Vec<Example> { fn examples(&self) -> Vec<Example> {
vec![Example { vec![Example {
description: "Access metadata and data from a stream together", description: "Access metadata and data from a stream together",
example: r#"{foo: bar} | to json --raw | metadata access {|meta| {in: $in, meta: $meta}}"#, example: r#"{foo: bar} | to json --raw | metadata access {|meta| {in: $in, content: $meta.content_type}}"#,
result: Some(Value::test_record(record! { result: Some(Value::test_record(record! {
"in" => Value::test_string(r#"{"foo":"bar"}"#), "in" => Value::test_string(r#"{"foo":"bar"}"#),
"meta" => Value::test_record(record! { "content" => Value::test_string(r#"application/json"#)
"content_type" => Value::test_string(r#"application/json"#)
})
})), })),
}] }]
} }


@ -79,15 +79,13 @@ impl Command for MetadataSet {
}, },
Example { Example {
description: "Set the metadata of a file path", description: "Set the metadata of a file path",
example: "'crates' | metadata set --datasource-filepath $'(pwd)/crates' | metadata", example: "'crates' | metadata set --datasource-filepath $'(pwd)/crates'",
result: None, result: None,
}, },
Example { Example {
description: "Set the metadata of a file path", description: "Set the metadata of a file path",
example: "'crates' | metadata set --content-type text/plain | metadata", example: "'crates' | metadata set --content-type text/plain | metadata | get content_type",
result: Some(Value::test_record(record! { result: Some(Value::test_string("text/plain")),
"content_type" => Value::test_string("text/plain"),
})),
}, },
] ]
} }


@ -1,4 +1,4 @@
use nu_protocol::{DataSource, PipelineMetadata, Record, Span, Value}; use nu_protocol::{DataSource, IntoValue, PipelineData, PipelineMetadata, Record, Span, Value};
pub fn extend_record_with_metadata( pub fn extend_record_with_metadata(
mut record: Record, mut record: Record,
@ -29,6 +29,10 @@ pub fn extend_record_with_metadata(
record record
} }
pub fn build_metadata_record(metadata: Option<&PipelineMetadata>, head: Span) -> Record { pub fn build_metadata_record(pipeline: &PipelineData, head: Span) -> Record {
extend_record_with_metadata(Record::new(), metadata, head) let mut record = Record::new();
if let Some(span) = pipeline.span() {
record.insert("span", span.into_value(head));
}
extend_record_with_metadata(record, pipeline.metadata().as_ref(), head)
} }


@ -126,10 +126,10 @@ impl Command for ViewSource {
} }
let _ = write!(&mut final_contents, "--{}", n.long); let _ = write!(&mut final_contents, "--{}", n.long);
if let Some(short) = n.short { if let Some(short) = n.short {
let _ = write!(&mut final_contents, "(-{})", short); let _ = write!(&mut final_contents, "(-{short})");
} }
if let Some(arg) = &n.arg { if let Some(arg) = &n.arg {
let _ = write!(&mut final_contents, ": {}", arg); let _ = write!(&mut final_contents, ": {arg}");
} }
final_contents.push(' '); final_contents.push(' ');
} }
@ -146,7 +146,7 @@ impl Command for ViewSource {
let mut c = 0; let mut c = 0;
for (insig, outsig) in type_signatures { for (insig, outsig) in type_signatures {
c += 1; c += 1;
let s = format!("{} -> {}", insig, outsig); let s = format!("{insig} -> {outsig}");
final_contents.push_str(&s); final_contents.push_str(&s);
if c != len { if c != len {
final_contents.push_str(", ") final_contents.push_str(", ")


@ -188,6 +188,9 @@ pub fn add_shell_command_context(mut engine_state: EngineState) -> EngineState {
// Strings // Strings
bind_command! { bind_command! {
Ansi,
AnsiLink,
AnsiStrip,
Char, Char,
Decode, Decode,
Encode, Encode,
@ -250,9 +253,6 @@ pub fn add_shell_command_context(mut engine_state: EngineState) -> EngineState {
// Platform // Platform
#[cfg(feature = "os")] #[cfg(feature = "os")]
bind_command! { bind_command! {
Ansi,
AnsiLink,
AnsiStrip,
Clear, Clear,
Du, Du,
Input, Input,


@ -112,8 +112,8 @@ impl Command for Mktemp {
.map_err(|_| ShellError::NonUtf8 { span })?, .map_err(|_| ShellError::NonUtf8 { span })?,
Err(e) => { Err(e) => {
return Err(ShellError::GenericError { return Err(ShellError::GenericError {
error: format!("{}", e), error: format!("{e}"),
msg: format!("{}", e), msg: format!("{e}"),
span: None, span: None,
help: None, help: None,
inner: vec![], inner: vec![],


@ -198,7 +198,7 @@ impl Command for Open {
let converter = exts_opt.and_then(|exts| { let converter = exts_opt.and_then(|exts| {
exts.iter().find_map(|ext| { exts.iter().find_map(|ext| {
engine_state engine_state
.find_decl(format!("from {}", ext).as_bytes(), &[]) .find_decl(format!("from {ext}").as_bytes(), &[])
.map(|id| (id, ext.to_string())) .map(|id| (id, ext.to_string()))
}) })
}); });
@ -314,7 +314,7 @@ fn extract_extensions(filename: &str) -> Vec<String> {
if current_extension.is_empty() { if current_extension.is_empty() {
current_extension.push_str(part); current_extension.push_str(part);
} else { } else {
current_extension = format!("{}.{}", part, current_extension); current_extension = format!("{part}.{current_extension}");
} }
extensions.push(current_extension.clone()); extensions.push(current_extension.clone());
} }


@ -91,7 +91,8 @@ impl Command for Save {
PipelineData::ByteStream(stream, metadata) => { PipelineData::ByteStream(stream, metadata) => {
check_saving_to_source_file(metadata.as_ref(), &path, stderr_path.as_ref())?; check_saving_to_source_file(metadata.as_ref(), &path, stderr_path.as_ref())?;
let (file, stderr_file) = get_files(&path, stderr_path.as_ref(), append, force)?; let (file, stderr_file) =
get_files(engine_state, &path, stderr_path.as_ref(), append, force)?;
let size = stream.known_size(); let size = stream.known_size();
let signals = engine_state.signals(); let signals = engine_state.signals();
@ -201,7 +202,8 @@ impl Command for Save {
stderr_path.as_ref(), stderr_path.as_ref(),
)?; )?;
let (mut file, _) = get_files(&path, stderr_path.as_ref(), append, force)?; let (mut file, _) =
get_files(engine_state, &path, stderr_path.as_ref(), append, force)?;
for val in ls { for val in ls {
file.write_all(&value_to_bytes(val)?) file.write_all(&value_to_bytes(val)?)
.map_err(&from_io_error)?; .map_err(&from_io_error)?;
@ -226,7 +228,8 @@ impl Command for Save {
input_to_bytes(input, Path::new(&path.item), raw, engine_state, stack, span)?; input_to_bytes(input, Path::new(&path.item), raw, engine_state, stack, span)?;
// Only open file after successful conversion // Only open file after successful conversion
let (mut file, _) = get_files(&path, stderr_path.as_ref(), append, force)?; let (mut file, _) =
get_files(engine_state, &path, stderr_path.as_ref(), append, force)?;
file.write_all(&bytes).map_err(&from_io_error)?; file.write_all(&bytes).map_err(&from_io_error)?;
file.flush().map_err(&from_io_error)?; file.flush().map_err(&from_io_error)?;
@ -422,13 +425,14 @@ fn prepare_path(
} }
} }
fn open_file(path: &Path, span: Span, append: bool) -> Result<File, ShellError> { fn open_file(
let file: Result<File, nu_protocol::shell_error::io::ErrorKind> = match (append, path.exists()) engine_state: &EngineState,
{ path: &Path,
(true, true) => std::fs::OpenOptions::new() span: Span,
.append(true) append: bool,
.open(path) ) -> Result<File, ShellError> {
.map_err(|err| err.into()), let file: std::io::Result<File> = match (append, path.exists()) {
(true, true) => std::fs::OpenOptions::new().append(true).open(path),
_ => { _ => {
// This is a temporary solution until `std::fs::File::create` is fixed on Windows (rust-lang/rust#134893) // This is a temporary solution until `std::fs::File::create` is fixed on Windows (rust-lang/rust#134893)
// A TOCTOU problem exists here, which may cause wrong error message to be shown // A TOCTOU problem exists here, which may cause wrong error message to be shown
@ -438,22 +442,51 @@ fn open_file(path: &Path, span: Span, append: bool) -> Result<File, ShellError>
deprecated, deprecated,
reason = "we don't get a IsADirectory error, so we need to provide it" reason = "we don't get a IsADirectory error, so we need to provide it"
)] )]
Err(nu_protocol::shell_error::io::ErrorKind::from_std( Err(std::io::ErrorKind::IsADirectory.into())
std::io::ErrorKind::IsADirectory,
))
} else { } else {
std::fs::File::create(path).map_err(|err| err.into()) std::fs::File::create(path)
} }
#[cfg(not(target_os = "windows"))] #[cfg(not(target_os = "windows"))]
std::fs::File::create(path).map_err(|err| err.into()) std::fs::File::create(path)
} }
}; };
file.map_err(|err_kind| ShellError::Io(IoError::new(err_kind, span, PathBuf::from(path)))) match file {
Ok(file) => Ok(file),
Err(err) => {
// In case of NotFound, search for the missing parent directory.
// This also presents a TOCTOU (or TOUTOC, technically?) issue.
if err.kind() == std::io::ErrorKind::NotFound {
if let Some(missing_component) =
path.ancestors().skip(1).filter(|dir| !dir.exists()).last()
{
// By looking at the postfix to remove, rather than the prefix
// to keep, we are able to handle relative paths too.
let components_to_remove = path
.strip_prefix(missing_component)
.expect("Stripping ancestor from a path should never fail")
.as_os_str()
.as_encoded_bytes();
return Err(ShellError::Io(IoError::new(
ErrorKind::DirectoryNotFound,
engine_state
.span_match_postfix(span, components_to_remove)
.map(|(pre, _post)| pre)
.unwrap_or(span),
PathBuf::from(missing_component),
)));
}
}
Err(ShellError::Io(IoError::new(err, span, PathBuf::from(path))))
}
}
} }
/// Get output file and optional stderr file /// Get output file and optional stderr file
fn get_files( fn get_files(
engine_state: &EngineState,
path: &Spanned<PathBuf>, path: &Spanned<PathBuf>,
stderr_path: Option<&Spanned<PathBuf>>, stderr_path: Option<&Spanned<PathBuf>>,
append: bool, append: bool,
@ -467,7 +500,7 @@ fn get_files(
.transpose()?; .transpose()?;
// Only if both files can be used open and possibly truncate them // Only if both files can be used open and possibly truncate them
let file = open_file(path, path_span, append)?; let file = open_file(engine_state, path, path_span, append)?;
let stderr_file = stderr_path_and_span let stderr_file = stderr_path_and_span
.map(|(stderr_path, stderr_path_span)| { .map(|(stderr_path, stderr_path_span)| {
@ -480,7 +513,7 @@ fn get_files(
inner: vec![], inner: vec![],
}) })
} else { } else {
open_file(stderr_path, stderr_path_span, append) open_file(engine_state, stderr_path, stderr_path_span, append)
} }
}) })
.transpose()?; .transpose()?;
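
Roughly, the user-facing effect of the `open_file`/`get_files` changes above, sketched in nushell (the path and the exact error wording are illustrative, not taken from the diff):

```nushell
# Assuming ./missing does not exist: `save` should now report the
# missing parent directory instead of a generic I/O "not found" error.
"hello" | save missing/out.txt
```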

View File

@ -49,7 +49,8 @@ impl Command for Start {
} }
// If it's not a URL, treat it as a file path // If it's not a URL, treat it as a file path
let cwd = engine_state.cwd(Some(stack))?; let cwd = engine_state.cwd(Some(stack))?;
let full_path = cwd.join(path_no_whitespace); let full_path = nu_path::expand_path_with(path_no_whitespace, &cwd, true);
// Check if the path exists or if it's a valid file/directory // Check if the path exists or if it's a valid file/directory
if full_path.exists() { if full_path.exists() {
open_path(full_path, engine_state, stack, path.span)?; open_path(full_path, engine_state, stack, path.span)?;
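
A hedged usage sketch for the `start` hunk above; the path is illustrative and assumes a desktop handler is configured:

```nushell
# The argument is now expanded (including `~` and relative components)
# against the current directory before the existence check.
start ~/Downloads
```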

View File

@ -272,8 +272,8 @@ impl Command for UCp {
uu_cp::Error::NotAllFilesCopied => {} uu_cp::Error::NotAllFilesCopied => {}
_ => { _ => {
return Err(ShellError::GenericError { return Err(ShellError::GenericError {
error: format!("{}", error), error: format!("{error}"),
msg: format!("{}", error), msg: format!("{error}"),
span: None, span: None,
help: None, help: None,
inner: vec![], inner: vec![],
@ -373,7 +373,7 @@ fn parse_and_set_attribute(
"xattr" => &mut attribute.xattr, "xattr" => &mut attribute.xattr,
_ => { _ => {
return Err(ShellError::IncompatibleParametersSingle { return Err(ShellError::IncompatibleParametersSingle {
msg: format!("--preserve flag got an unexpected attribute \"{}\"", val), msg: format!("--preserve flag got an unexpected attribute \"{val}\""),
span: value.span(), span: value.span(),
}); });
} }

View File

@ -77,8 +77,8 @@ impl Command for UMkdir {
for dir in directories { for dir in directories {
if let Err(error) = mkdir(&dir, IS_RECURSIVE, get_mode(), is_verbose) { if let Err(error) = mkdir(&dir, IS_RECURSIVE, get_mode(), is_verbose) {
return Err(ShellError::GenericError { return Err(ShellError::GenericError {
error: format!("{}", error), error: format!("{error}"),
msg: format!("{}", error), msg: format!("{error}"),
span: None, span: None,
help: None, help: None,
inner: vec![], inner: vec![],

View File

@ -195,8 +195,8 @@ impl Command for UMv {
}; };
if let Err(error) = uu_mv::mv(&files, &options) { if let Err(error) = uu_mv::mv(&files, &options) {
return Err(ShellError::GenericError { return Err(ShellError::GenericError {
error: format!("{}", error), error: format!("{error}"),
msg: format!("{}", error), msg: format!("{error}"),
span: None, span: None,
help: None, help: None,
inner: Vec::new(), inner: Vec::new(),

View File

@ -220,7 +220,7 @@ impl Command for UTouch {
inner: Vec::new(), inner: Vec::new(),
}, },
TouchError::InvalidDateFormat(date) => ShellError::IncorrectValue { TouchError::InvalidDateFormat(date) => ShellError::IncorrectValue {
msg: format!("Invalid date: {}", date), msg: format!("Invalid date: {date}"),
val_span: date_span.expect("touch should've been given a date"), val_span: date_span.expect("touch should've been given a date"),
call_span: call.head, call_span: call.head,
}, },

View File

@ -8,7 +8,7 @@ use notify_debouncer_full::{
use nu_engine::{ClosureEval, command_prelude::*}; use nu_engine::{ClosureEval, command_prelude::*};
use nu_protocol::{ use nu_protocol::{
engine::{Closure, StateWorkingSet}, engine::{Closure, StateWorkingSet},
format_shell_error, format_cli_error,
shell_error::io::IoError, shell_error::io::IoError,
}; };
use std::{ use std::{
@ -208,7 +208,7 @@ impl Command for Watch {
} }
Err(err) => { Err(err) => {
let working_set = StateWorkingSet::new(engine_state); let working_set = StateWorkingSet::new(engine_state);
eprintln!("{}", format_shell_error(&working_set, &err)); eprintln!("{}", format_cli_error(&working_set, &err));
} }
} }
} }

View File

@ -213,7 +213,7 @@ fn default(
|| (default_when_empty || (default_when_empty
&& matches!(input, PipelineData::Value(ref value, _) if value.is_empty())) && matches!(input, PipelineData::Value(ref value, _) if value.is_empty()))
{ {
default_value.pipeline_data() default_value.single_run_pipeline_data()
} else if default_when_empty && matches!(input, PipelineData::ListStream(..)) { } else if default_when_empty && matches!(input, PipelineData::ListStream(..)) {
let PipelineData::ListStream(ls, metadata) = input else { let PipelineData::ListStream(ls, metadata) = input else {
unreachable!() unreachable!()
@ -221,7 +221,7 @@ fn default(
let span = ls.span(); let span = ls.span();
let mut stream = ls.into_inner().peekable(); let mut stream = ls.into_inner().peekable();
if stream.peek().is_none() { if stream.peek().is_none() {
return default_value.pipeline_data(); return default_value.single_run_pipeline_data();
} }
// stream's internal state already preserves the original signals config, so if this // stream's internal state already preserves the original signals config, so if this
@ -278,8 +278,14 @@ impl DefaultValue {
} }
} }
fn pipeline_data(&mut self) -> Result<PipelineData, ShellError> { /// Used when we know the value won't need to be cached to allow streaming.
self.value().map(|x| x.into_pipeline_data()) fn single_run_pipeline_data(self) -> Result<PipelineData, ShellError> {
match self {
DefaultValue::Uncalculated(mut closure) => {
closure.item.run_with_input(PipelineData::Empty)
}
DefaultValue::Calculated(val) => Ok(val.into_pipeline_data()),
}
} }
} }
@ -328,7 +334,7 @@ fn closure_variable_warning(
Cow::Owned(s) if s.deref() == "$carapace_completer" => { Cow::Owned(s) if s.deref() == "$carapace_completer" => {
carapace_suggestion.to_string() carapace_suggestion.to_string()
} }
_ => format!("change this to {{ {} }}", span_contents).to_string(), _ => format!("change this to {{ {span_contents} }}").to_string(),
}; };
report_shell_warning( report_shell_warning(

View File

@ -1,6 +1,6 @@
use itertools::Either;
use nu_engine::command_prelude::*; use nu_engine::command_prelude::*;
use nu_protocol::{PipelineIterator, Range}; use nu_protocol::{PipelineIterator, Range};
use std::collections::VecDeque;
use std::ops::Bound; use std::ops::Bound;
#[derive(Clone)] #[derive(Clone)]
@ -18,13 +18,11 @@ impl Command for DropNth {
(Type::list(Type::Any), Type::list(Type::Any)), (Type::list(Type::Any), Type::list(Type::Any)),
]) ])
.allow_variants_without_examples(true) .allow_variants_without_examples(true)
.required( .rest(
"row number or row range", "rest",
// FIXME: we can make this accept either Int or Range when we can compose SyntaxShapes
SyntaxShape::Any, SyntaxShape::Any,
"The number of the row to drop or a range to drop consecutive rows.", "The row numbers or ranges to drop.",
) )
.rest("rest", SyntaxShape::Any, "The number of the row to drop.")
.category(Category::Filters) .category(Category::Filters)
} }
@ -103,110 +101,125 @@ impl Command for DropNth {
) -> Result<PipelineData, ShellError> { ) -> Result<PipelineData, ShellError> {
let head = call.head; let head = call.head;
let metadata = input.metadata(); let metadata = input.metadata();
let number_or_range = extract_int_or_range(engine_state, stack, call)?;
let rows = match number_or_range.item { let args: Vec<Value> = call.rest(engine_state, stack, 0)?;
Either::Left(row_number) => { if args.is_empty() {
let and_rows: Vec<Spanned<i64>> = call.rest(engine_state, stack, 1)?; return Ok(input);
let mut rows: Vec<_> = and_rows.into_iter().map(|x| x.item as usize).collect();
rows.push(row_number as usize);
rows.sort_unstable();
rows
}
Either::Right(Range::FloatRange(_)) => {
return Err(ShellError::UnsupportedInput {
msg: "float range".into(),
input: "value originates from here".into(),
msg_span: head,
input_span: number_or_range.span,
});
}
Either::Right(Range::IntRange(range)) => {
// check for negative range inputs, e.g., (2..-5)
let end_negative = match range.end() {
Bound::Included(end) | Bound::Excluded(end) => end < 0,
Bound::Unbounded => false,
};
if range.start().is_negative() || end_negative {
return Err(ShellError::UnsupportedInput {
msg: "drop nth accepts only positive ints".into(),
input: "value originates from here".into(),
msg_span: head,
input_span: number_or_range.span,
});
}
// check if the upper bound is smaller than the lower bound, e.g., do not accept 4..2
if range.step() < 0 {
return Err(ShellError::UnsupportedInput {
msg: "The upper bound needs to be equal or larger to the lower bound"
.into(),
input: "value originates from here".into(),
msg_span: head,
input_span: number_or_range.span,
});
} }
let start = range.start() as usize; let (rows_to_drop, min_unbounded_start) = get_rows_to_drop(&args, head)?;
let end = match range.end() { let input = if let Some(cutoff) = min_unbounded_start {
Bound::Included(end) => end as usize, input
Bound::Excluded(end) => (end - 1) as usize,
Bound::Unbounded => {
return Ok(input
.into_iter() .into_iter()
.take(start) .take(cutoff)
.into_pipeline_data_with_metadata( .into_pipeline_data_with_metadata(
head, head,
engine_state.signals().clone(), engine_state.signals().clone(),
metadata, metadata.clone(),
)); )
}
};
let end = if let PipelineData::Value(Value::List { vals, .. }, _) = &input {
end.min(vals.len() - 1)
} else { } else {
end input
};
(start..=end).collect()
}
}; };
Ok(DropNthIterator { Ok(DropNthIterator {
input: input.into_iter(), input: input.into_iter(),
rows, rows: rows_to_drop,
current: 0, current: 0,
} }
.into_pipeline_data_with_metadata(head, engine_state.signals().clone(), metadata)) .into_pipeline_data_with_metadata(head, engine_state.signals().clone(), metadata))
} }
} }
fn extract_int_or_range( fn get_rows_to_drop(
engine_state: &EngineState, args: &[Value],
stack: &mut Stack, head: Span,
call: &Call, ) -> Result<(VecDeque<usize>, Option<usize>), ShellError> {
) -> Result<Spanned<Either<i64, Range>>, ShellError> { let mut rows_to_drop = Vec::new();
let value: Value = call.req(engine_state, stack, 0)?; let mut min_unbounded_start: Option<usize> = None;
let int_opt = value.as_int().map(Either::Left).ok(); for value in args {
let range_opt = value.as_range().map(Either::Right).ok(); if let Ok(i) = value.as_int() {
if i < 0 {
return Err(ShellError::UnsupportedInput {
msg: "drop nth accepts only positive ints".into(),
input: "value originates from here".into(),
msg_span: head,
input_span: value.span(),
});
}
rows_to_drop.push(i as usize);
} else if let Ok(range) = value.as_range() {
match range {
Range::IntRange(range) => {
let start = range.start();
if start < 0 {
return Err(ShellError::UnsupportedInput {
msg: "drop nth accepts only positive ints".into(),
input: "value originates from here".into(),
msg_span: head,
input_span: value.span(),
});
}
int_opt match range.end() {
.or(range_opt) Bound::Included(end) => {
.ok_or_else(|| ShellError::TypeMismatch { if end < start {
err_message: "int or range".into(), return Err(ShellError::UnsupportedInput {
msg: "The upper bound must be greater than or equal to the lower bound".into(),
input: "value originates from here".into(),
msg_span: head,
input_span: value.span(),
});
}
rows_to_drop.extend((start as usize)..=(end as usize));
}
Bound::Excluded(end) => {
if end <= start {
return Err(ShellError::UnsupportedInput {
msg: "The upper bound must be greater than the lower bound"
.into(),
input: "value originates from here".into(),
msg_span: head,
input_span: value.span(),
});
}
rows_to_drop.extend((start as usize)..(end as usize));
}
Bound::Unbounded => {
let start_usize = start as usize;
min_unbounded_start = Some(
min_unbounded_start.map_or(start_usize, |s| s.min(start_usize)),
);
}
}
}
Range::FloatRange(_) => {
return Err(ShellError::UnsupportedInput {
msg: "float range not supported".into(),
input: "value originates from here".into(),
msg_span: head,
input_span: value.span(),
});
}
}
} else {
return Err(ShellError::TypeMismatch {
err_message: "Expected int or range".into(),
span: value.span(), span: value.span(),
}) });
.map(|either| Spanned { }
item: either, }
span: value.span(),
}) rows_to_drop.sort_unstable();
rows_to_drop.dedup();
Ok((VecDeque::from(rows_to_drop), min_unbounded_start))
} }
struct DropNthIterator { struct DropNthIterator {
input: PipelineIterator, input: PipelineIterator,
rows: Vec<usize>, rows: VecDeque<usize>,
current: usize, current: usize,
} }
@ -215,9 +228,9 @@ impl Iterator for DropNthIterator {
fn next(&mut self) -> Option<Self::Item> { fn next(&mut self) -> Option<Self::Item> {
loop { loop {
if let Some(row) = self.rows.first() { if let Some(row) = self.rows.front() {
if self.current == *row { if self.current == *row {
self.rows.remove(0); self.rows.pop_front();
self.current += 1; self.current += 1;
let _ = self.input.next(); let _ = self.input.next();
continue; continue;
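
A sketch of what the reworked `drop nth` signature accepts, mixing row numbers and ranges as rest arguments (the result comment is approximate):

```nushell
[one two three four five six] | drop nth 0 2 4..
# => drops rows 0 and 2, plus everything from row 4 on,
# => leaving [two, four] (approximate rendering)
```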

View File

@ -163,7 +163,12 @@ impl Command for Find {
example: r#"[["Larry", "Moe"], ["Victor", "Marina"]] | find --regex "rr""#, example: r#"[["Larry", "Moe"], ["Victor", "Marina"]] | find --regex "rr""#,
result: Some(Value::list( result: Some(Value::list(
vec![Value::list( vec![Value::list(
vec![Value::test_string("Larry"), Value::test_string("Moe")], vec![
Value::test_string(
"\u{1b}[37mLa\u{1b}[0m\u{1b}[41;37mrr\u{1b}[0m\u{1b}[37my\u{1b}[0m",
),
Value::test_string("Moe"),
],
Span::test_data(), Span::test_data(),
)], )],
Span::test_data(), Span::test_data(),
@ -344,7 +349,10 @@ fn get_match_pattern_from_arguments(
// map functions // map functions
fn highlight_matches_in_string(pattern: &MatchPattern, val: String) -> String { fn highlight_matches_in_string(pattern: &MatchPattern, val: String) -> String {
// strip haystack to remove existing ansi style if !pattern.regex.is_match(&val).unwrap_or(false) {
return val;
}
let stripped_val = nu_utils::strip_ansi_string_unlikely(val); let stripped_val = nu_utils::strip_ansi_string_unlikely(val);
let mut last_match_end = 0; let mut last_match_end = 0;
let mut highlighted = String::new(); let mut highlighted = String::new();
@ -390,7 +398,7 @@ fn highlight_matches_in_string(pattern: &MatchPattern, val: String) -> String {
highlighted highlighted
} }
fn highlight_matches_in_record_or_value( fn highlight_matches_in_value(
pattern: &MatchPattern, pattern: &MatchPattern,
value: Value, value: Value,
columns_to_search: &[String], columns_to_search: &[String],
@ -412,16 +420,16 @@ fn highlight_matches_in_record_or_value(
continue; continue;
} }
if let Value::String { val: val_str, .. } = val { *val = highlight_matches_in_value(pattern, std::mem::take(val), &[]);
if pattern.regex.is_match(val_str).unwrap_or(false) {
let val_str = std::mem::take(val_str);
*val = highlight_matches_in_string(pattern, val_str).into_value(span)
}
}
} }
Value::record(record, span) Value::record(record, span)
} }
Value::List { vals, .. } => vals
.into_iter()
.map(|item| highlight_matches_in_value(pattern, item, &[]))
.collect::<Vec<Value>>()
.into_value(span),
Value::String { val, .. } => highlight_matches_in_string(pattern, val).into_value(span), Value::String { val, .. } => highlight_matches_in_string(pattern, val).into_value(span),
_ => value, _ => value,
} }
@ -444,24 +452,22 @@ fn find_in_pipelinedata(
PipelineData::Value(_, _) => input PipelineData::Value(_, _) => input
.filter( .filter(
move |value| { move |value| {
record_or_value_should_be_printed(&pattern, value, &columns_to_search, &config) value_should_be_printed(&pattern, value, &columns_to_search, &config)
!= pattern.invert
}, },
engine_state.signals(), engine_state.signals(),
)? )?
.map( .map(
move |x| { move |x| highlight_matches_in_value(&map_pattern, x, &map_columns_to_search),
highlight_matches_in_record_or_value(&map_pattern, x, &map_columns_to_search)
},
engine_state.signals(), engine_state.signals(),
), ),
PipelineData::ListStream(stream, metadata) => { PipelineData::ListStream(stream, metadata) => {
let stream = stream.modify(|iter| { let stream = stream.modify(|iter| {
iter.filter(move |value| { iter.filter(move |value| {
record_or_value_should_be_printed(&pattern, value, &columns_to_search, &config) value_should_be_printed(&pattern, value, &columns_to_search, &config)
}) != pattern.invert
.map(move |x| {
highlight_matches_in_record_or_value(&map_pattern, x, &map_columns_to_search)
}) })
.map(move |x| highlight_matches_in_value(&map_pattern, x, &map_columns_to_search))
}); });
Ok(PipelineData::ListStream(stream, metadata)) Ok(PipelineData::ListStream(stream, metadata))
@ -495,7 +501,12 @@ fn string_should_be_printed(pattern: &MatchPattern, value: &str) -> bool {
pattern.regex.is_match(value).unwrap_or(false) pattern.regex.is_match(value).unwrap_or(false)
} }
fn value_should_be_printed(pattern: &MatchPattern, value: &Value, config: &Config) -> bool { fn value_should_be_printed(
pattern: &MatchPattern,
value: &Value,
columns_to_search: &[String],
config: &Config,
) -> bool {
let lower_value = value.to_expanded_string("", config).to_lowercase(); let lower_value = value.to_expanded_string("", config).to_lowercase();
match value { match value {
@ -507,8 +518,7 @@ fn value_should_be_printed(pattern: &MatchPattern, value: &Value, config: &Confi
| Value::Range { .. } | Value::Range { .. }
| Value::Float { .. } | Value::Float { .. }
| Value::Closure { .. } | Value::Closure { .. }
| Value::Nothing { .. } | Value::Nothing { .. } => {
| Value::Error { .. } => {
if !pattern.lower_terms.is_empty() { if !pattern.lower_terms.is_empty() {
// look for exact match when searching with terms // look for exact match when searching with terms
pattern pattern
@ -519,37 +529,25 @@ fn value_should_be_printed(pattern: &MatchPattern, value: &Value, config: &Confi
string_should_be_printed(pattern, &lower_value) string_should_be_printed(pattern, &lower_value)
} }
} }
Value::Glob { .. } Value::Glob { .. } | Value::CellPath { .. } | Value::Custom { .. } => {
| Value::List { .. } string_should_be_printed(pattern, &lower_value)
| Value::CellPath { .. } }
| Value::Record { .. }
| Value::Custom { .. } => string_should_be_printed(pattern, &lower_value),
Value::String { val, .. } => string_should_be_printed(pattern, val), Value::String { val, .. } => string_should_be_printed(pattern, val),
Value::Binary { .. } => false, Value::List { vals, .. } => vals
} .iter()
} .any(|item| value_should_be_printed(pattern, item, &[], config)),
fn record_or_value_should_be_printed(
pattern: &MatchPattern,
value: &Value,
columns_to_search: &[String],
config: &Config,
) -> bool {
let match_found = match value {
Value::Record { val: record, .. } => { Value::Record { val: record, .. } => {
// Only perform column selection if given columns.
let col_select = !columns_to_search.is_empty(); let col_select = !columns_to_search.is_empty();
record.iter().any(|(col, val)| { record.iter().any(|(col, val)| {
if col_select && !columns_to_search.contains(col) { if col_select && !columns_to_search.contains(col) {
return false; return false;
} }
value_should_be_printed(pattern, val, config) value_should_be_printed(pattern, val, &[], config)
}) })
} }
_ => value_should_be_printed(pattern, value, config), Value::Binary { .. } => false,
}; Value::Error { .. } => true,
}
match_found != pattern.invert
} }
// utility // utility
@ -574,6 +572,46 @@ fn split_string_if_multiline(input: PipelineData, head_span: Span) -> PipelineDa
} }
} }
/// function for using find from other commands
pub fn find_internal(
input: PipelineData,
engine_state: &EngineState,
stack: &mut Stack,
search_term: &str,
columns_to_search: &[&str],
highlight: bool,
) -> Result<PipelineData, ShellError> {
let span = input.span().unwrap_or(Span::unknown());
let style_computer = StyleComputer::from_config(engine_state, stack);
let string_style = style_computer.compute("string", &Value::string("search result", span));
let highlight_style =
style_computer.compute("search_result", &Value::string("search result", span));
let regex_str = format!("(?i){}", escape(search_term));
let regex = Regex::new(regex_str.as_str()).map_err(|e| ShellError::TypeMismatch {
err_message: format!("invalid regex: {e}"),
span: Span::unknown(),
})?;
let pattern = MatchPattern {
regex,
lower_terms: vec![search_term.to_lowercase()],
highlight,
invert: false,
string_style,
highlight_style,
};
let columns_to_search = columns_to_search
.iter()
.map(|str| String::from(*str))
.collect();
find_in_pipelinedata(pattern, columns_to_search, engine_state, stack, input)
}
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::*; use super::*;
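
The doc example added in this hunk boils down to the following pipeline; the highlighting comment is illustrative (ANSI escapes omitted):

```nushell
[["Larry", "Moe"], ["Victor", "Marina"]] | find --regex "rr"
# => [[Larry, Moe]] with "rr" highlighted inside "Larry"
```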

View File

@ -379,10 +379,7 @@ fn merge_records(left: &Record, right: &Record, shared_key: Option<&str>) -> Rec
let k_shared = shared_key == Some(k.as_str()); let k_shared = shared_key == Some(k.as_str());
// Do not output shared join key twice // Do not output shared join key twice
if !(k_seen && k_shared) { if !(k_seen && k_shared) {
record.push( record.push(if k_seen { format!("{k}_") } else { k.clone() }, v.clone());
if k_seen { format!("{}_", k) } else { k.clone() },
v.clone(),
);
} }
} }
record record

View File

@ -70,7 +70,7 @@ pub use empty::empty;
pub use enumerate::Enumerate; pub use enumerate::Enumerate;
pub use every::Every; pub use every::Every;
pub use filter::Filter; pub use filter::Filter;
pub use find::Find; pub use find::{Find, find_internal};
pub use first::First; pub use first::First;
pub use flatten::Flatten; pub use flatten::Flatten;
pub use get::Get; pub use get::Get;

View File

@ -203,6 +203,7 @@ mod test {
use super::*; use super::*;
use crate::Reject;
use crate::{Metadata, MetadataSet}; use crate::{Metadata, MetadataSet};
#[test] #[test]
@ -221,6 +222,7 @@ mod test {
working_set.add_decl(Box::new(FromCsv {})); working_set.add_decl(Box::new(FromCsv {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -229,7 +231,7 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = r#""a,b\n1,2" | metadata set --content-type 'text/csv' --datasource-ls | from csv | metadata | $in"#; let cmd = r#""a,b\n1,2" | metadata set --content-type 'text/csv' --datasource-ls | from csv | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -248,6 +248,7 @@ fn convert_string_to_value_strict(string_input: &str, span: Span) -> Result<Valu
mod test { mod test {
use nu_cmd_lang::eval_pipeline_without_terminal_expression; use nu_cmd_lang::eval_pipeline_without_terminal_expression;
use crate::Reject;
use crate::{Metadata, MetadataSet}; use crate::{Metadata, MetadataSet};
use super::*; use super::*;
@ -268,6 +269,7 @@ mod test {
working_set.add_decl(Box::new(FromJson {})); working_set.add_decl(Box::new(FromJson {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -276,7 +278,7 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = r#"'{"a":1,"b":2}' | metadata set --content-type 'application/json' --datasource-ls | from json | metadata | $in"#; let cmd = r#"'{"a":1,"b":2}' | metadata set --content-type 'application/json' --datasource-ls | from json | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -212,7 +212,7 @@ impl From<ReadError> for ShellError {
}, },
ReadError::TypeMismatch(marker, span) => ShellError::GenericError { ReadError::TypeMismatch(marker, span) => ShellError::GenericError {
error: "Invalid marker while reading MessagePack data".into(), error: "Invalid marker while reading MessagePack data".into(),
msg: format!("unexpected {:?} in data", marker), msg: format!("unexpected {marker:?} in data"),
span: Some(span), span: Some(span),
help: None, help: None,
inner: vec![], inner: vec![],
@ -514,6 +514,7 @@ fn assert_eof(input: &mut impl io::Read, span: Span) -> Result<(), ShellError> {
mod test { mod test {
use nu_cmd_lang::eval_pipeline_without_terminal_expression; use nu_cmd_lang::eval_pipeline_without_terminal_expression;
use crate::Reject;
use crate::{Metadata, MetadataSet, ToMsgpack}; use crate::{Metadata, MetadataSet, ToMsgpack};
use super::*; use super::*;
@ -535,6 +536,7 @@ mod test {
working_set.add_decl(Box::new(FromMsgpack {})); working_set.add_decl(Box::new(FromMsgpack {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -543,7 +545,7 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = r#"{a: 1 b: 2} | to msgpack | metadata set --datasource-ls | from msgpack | metadata | $in"#; let cmd = r#"{a: 1 b: 2} | to msgpack | metadata set --datasource-ls | from msgpack | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -74,6 +74,7 @@ impl Command for FromNuon {
mod test { mod test {
use nu_cmd_lang::eval_pipeline_without_terminal_expression; use nu_cmd_lang::eval_pipeline_without_terminal_expression;
use crate::Reject;
use crate::{Metadata, MetadataSet}; use crate::{Metadata, MetadataSet};
use super::*; use super::*;
@ -94,6 +95,7 @@ mod test {
working_set.add_decl(Box::new(FromNuon {})); working_set.add_decl(Box::new(FromNuon {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -102,7 +104,7 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = r#"'[[a, b]; [1, 2]]' | metadata set --content-type 'application/x-nuon' --datasource-ls | from nuon | metadata | $in"#; let cmd = r#"'[[a, b]; [1, 2]]' | metadata set --content-type 'application/x-nuon' --datasource-ls | from nuon | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -167,7 +167,7 @@ fn parse_aligned_columns<'a>(
let headers: Vec<(String, usize)> = indices let headers: Vec<(String, usize)> = indices
.iter() .iter()
.enumerate() .enumerate()
.map(|(i, position)| (format!("column{}", i), *position)) .map(|(i, position)| (format!("column{i}"), *position))
.collect(); .collect();
construct(ls.iter().map(|s| s.to_owned()), headers) construct(ls.iter().map(|s| s.to_owned()), headers)

View File

@ -145,6 +145,7 @@ pub fn convert_string_to_value(string_input: String, span: Span) -> Result<Value
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use crate::Reject;
use crate::{Metadata, MetadataSet}; use crate::{Metadata, MetadataSet};
use super::*; use super::*;
@ -345,6 +346,7 @@ mod tests {
working_set.add_decl(Box::new(FromToml {})); working_set.add_decl(Box::new(FromToml {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -353,7 +355,7 @@ mod tests {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = r#""[a]\nb = 1\nc = 1" | metadata set --content-type 'text/x-toml' --datasource-ls | from toml | metadata | $in"#; let cmd = r#""[a]\nb = 1\nc = 1" | metadata set --content-type 'text/x-toml' --datasource-ls | from toml | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -160,6 +160,7 @@ fn from_tsv(
mod test { mod test {
use nu_cmd_lang::eval_pipeline_without_terminal_expression; use nu_cmd_lang::eval_pipeline_without_terminal_expression;
use crate::Reject;
use crate::{Metadata, MetadataSet}; use crate::{Metadata, MetadataSet};
use super::*; use super::*;
@ -180,6 +181,7 @@ mod test {
working_set.add_decl(Box::new(FromTsv {})); working_set.add_decl(Box::new(FromTsv {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -188,7 +190,7 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = r#""a\tb\n1\t2" | metadata set --content-type 'text/tab-separated-values' --datasource-ls | from tsv | metadata | $in"#; let cmd = r#""a\tb\n1\t2" | metadata set --content-type 'text/tab-separated-values' --datasource-ls | from tsv | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -252,7 +252,7 @@ fn process_xml_parse_error(source: String, err: roxmltree::Error, span: Span) ->
pos, pos,
), ),
roxmltree::Error::UnknownNamespace(prefix, pos) => { roxmltree::Error::UnknownNamespace(prefix, pos) => {
make_xml_error_spanned(format!("Unknown prefix {}", prefix), source, pos) make_xml_error_spanned(format!("Unknown prefix {prefix}"), source, pos)
} }
roxmltree::Error::UnexpectedCloseTag(expected, actual, pos) => make_xml_error_spanned( roxmltree::Error::UnexpectedCloseTag(expected, actual, pos) => make_xml_error_spanned(
format!("Unexpected close tag {actual}, expected {expected}"), format!("Unexpected close tag {actual}, expected {expected}"),
@ -370,6 +370,7 @@ fn make_xml_error_spanned(msg: impl Into<String>, src: String, pos: TextPos) ->
mod tests { mod tests {
use crate::Metadata; use crate::Metadata;
use crate::MetadataSet; use crate::MetadataSet;
use crate::Reject;
use super::*; use super::*;
@ -541,6 +542,7 @@ mod tests {
working_set.add_decl(Box::new(FromXml {})); working_set.add_decl(Box::new(FromXml {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -552,7 +554,7 @@ mod tests {
let cmd = r#"'<?xml version="1.0" encoding="UTF-8"?> let cmd = r#"'<?xml version="1.0" encoding="UTF-8"?>
<note> <note>
<remember>Event</remember> <remember>Event</remember>
</note>' | metadata set --content-type 'application/xml' --datasource-ls | from xml | metadata | $in"#; </note>' | metadata set --content-type 'application/xml' --datasource-ls | from xml | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -158,27 +158,26 @@ fn convert_yaml_value_to_nu_value(
} }
serde_yaml::Value::Tagged(t) => { serde_yaml::Value::Tagged(t) => {
let tag = &t.tag; let tag = &t.tag;
let value = match &t.value {
match &t.value {
serde_yaml::Value::String(s) => { serde_yaml::Value::String(s) => {
let val = format!("{} {}", tag, s).trim().to_string(); let val = format!("{tag} {s}").trim().to_string();
Value::string(val, span) Value::string(val, span)
} }
serde_yaml::Value::Number(n) => { serde_yaml::Value::Number(n) => {
let val = format!("{} {}", tag, n).trim().to_string(); let val = format!("{tag} {n}").trim().to_string();
Value::string(val, span) Value::string(val, span)
} }
serde_yaml::Value::Bool(b) => { serde_yaml::Value::Bool(b) => {
let val = format!("{} {}", tag, b).trim().to_string(); let val = format!("{tag} {b}").trim().to_string();
Value::string(val, span) Value::string(val, span)
} }
serde_yaml::Value::Null => { serde_yaml::Value::Null => {
let val = format!("{}", tag).trim().to_string(); let val = format!("{tag}").trim().to_string();
Value::string(val, span) Value::string(val, span)
} }
v => convert_yaml_value_to_nu_value(v, span, val_span)?, v => convert_yaml_value_to_nu_value(v, span, val_span)?,
}; }
value
} }
serde_yaml::Value::Null => Value::nothing(span), serde_yaml::Value::Null => Value::nothing(span),
x => unimplemented!("Unsupported YAML case: {:?}", x), x => unimplemented!("Unsupported YAML case: {:?}", x),
@ -244,6 +243,7 @@ fn from_yaml(input: PipelineData, head: Span) -> Result<PipelineData, ShellError
#[cfg(test)] #[cfg(test)]
mod test { mod test {
use crate::Reject;
use crate::{Metadata, MetadataSet}; use crate::{Metadata, MetadataSet};
use super::*; use super::*;
@ -410,6 +410,7 @@ mod test {
working_set.add_decl(Box::new(FromYaml {})); working_set.add_decl(Box::new(FromYaml {}));
working_set.add_decl(Box::new(Metadata {})); working_set.add_decl(Box::new(Metadata {}));
working_set.add_decl(Box::new(MetadataSet {})); working_set.add_decl(Box::new(MetadataSet {}));
working_set.add_decl(Box::new(Reject {}));
working_set.render() working_set.render()
}; };
@ -418,7 +419,7 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = r#""a: 1\nb: 2" | metadata set --content-type 'application/yaml' --datasource-ls | from yaml | metadata | $in"#; let cmd = r#""a: 1\nb: 2" | metadata set --content-type 'application/yaml' --datasource-ls | from yaml | metadata | reject span | $in"#;
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),

View File

@ -166,14 +166,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to csv | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to csv | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record(record!("content_type" => Value::test_string("text/csv"))), Value::test_string("text/csv"),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }

View File

@ -50,7 +50,7 @@ fn make_unsupported_input_error(
) -> ShellError { ) -> ShellError {
ShellError::UnsupportedInput { ShellError::UnsupportedInput {
msg: "expected table or record".to_string(), msg: "expected table or record".to_string(),
input: format!("input type: {}", r#type), input: format!("input type: {type}"),
msg_span: head, msg_span: head,
input_span: span, input_span: span,
} }

View File

@ -229,14 +229,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to json | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to json | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record(record!("content_type" => Value::test_string("application/json"))), Value::test_string("application/json"),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }
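
The adjusted assertion corresponds to a pipeline like this; the output comment is illustrative:

```nushell
{a: 1, b: 2} | to json | metadata | get content_type
# => application/json
```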

View File

@ -246,7 +246,7 @@ fn table(
escaped_rows.push(escaped_row); escaped_rows.push(escaped_row);
} }
let output_string = if (column_widths.is_empty() || column_widths.iter().all(|x| *x == 0)) if (column_widths.is_empty() || column_widths.iter().all(|x| *x == 0))
&& escaped_rows.is_empty() && escaped_rows.is_empty()
{ {
String::from("") String::from("")
@ -260,9 +260,7 @@ fn table(
) )
.trim() .trim()
.to_string() .to_string()
}; }
output_string
} }
pub fn group_by(values: PipelineData, head: Span, config: &Config) -> (PipelineData, bool) { pub fn group_by(values: PipelineData, head: Span, config: &Config) -> (PipelineData, bool) {
@ -906,14 +904,14 @@ mod tests {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to md | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to md | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record(record!("content_type" => Value::test_string("text/markdown"))), Value::test_string("text/markdown"),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }

View File

@ -348,16 +348,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to msgpack | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to msgpack | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record( Value::test_string("application/x-msgpack"),
record!("content_type" => Value::test_string("application/x-msgpack"))
),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }

View File

@ -143,14 +143,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to nuon | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to nuon | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record(record!("content_type" => Value::test_string("application/x-nuon"))), Value::test_string("application/x-nuon"),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }

View File

@ -2,6 +2,7 @@ use chrono::Datelike;
use chrono_humanize::HumanTime; use chrono_humanize::HumanTime;
use nu_engine::command_prelude::*; use nu_engine::command_prelude::*;
use nu_protocol::{ByteStream, PipelineMetadata, format_duration, shell_error::io::IoError}; use nu_protocol::{ByteStream, PipelineMetadata, format_duration, shell_error::io::IoError};
use nu_utils::ObviousFloat;
use std::io::Write; use std::io::Write;
const LINE_ENDING: &str = if cfg!(target_os = "windows") { const LINE_ENDING: &str = if cfg!(target_os = "windows") {
@ -164,7 +165,7 @@ fn local_into_string(
match value { match value {
Value::Bool { val, .. } => val.to_string(), Value::Bool { val, .. } => val.to_string(),
Value::Int { val, .. } => val.to_string(), Value::Int { val, .. } => val.to_string(),
Value::Float { val, .. } => val.to_string(), Value::Float { val, .. } => ObviousFloat(val).to_string(),
Value::Filesize { val, .. } => val.to_string(), Value::Filesize { val, .. } => val.to_string(),
Value::Duration { val, .. } => format_duration(val), Value::Duration { val, .. } => format_duration(val),
Value::Date { val, .. } => { Value::Date { val, .. } => {
@ -272,14 +273,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to text | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to text | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record(record!("content_type" => Value::test_string("text/plain"))), Value::test_string("text/plain"),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }
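
With `ObviousFloat` in `local_into_string`, round floats keep their decimal point when rendered as text; a minimal sketch (output comment approximate):

```nushell
3.0 | to text
# => 3.0   (previously rendered as "3")
```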

View File

@ -132,16 +132,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to tsv | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to tsv | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record( Value::test_string("text/tab-separated-values"),
record!("content_type" => Value::test_string("text/tab-separated-values"))
),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }

View File

@ -210,8 +210,7 @@ impl Job {
from_type: "record".into(), from_type: "record".into(),
span: entry_span, span: entry_span,
help: Some(format!( help: Some(format!(
"Invalid column \"{}\" in xml entry. Only \"{}\", \"{}\" and \"{}\" are permitted", "Invalid column \"{bad_column}\" in xml entry. Only \"{COLUMN_TAG_NAME}\", \"{COLUMN_ATTRS_NAME}\" and \"{COLUMN_CONTENT_NAME}\" are permitted"
bad_column, COLUMN_TAG_NAME, COLUMN_ATTRS_NAME, COLUMN_CONTENT_NAME
)), )),
}); });
} }
@ -399,7 +398,7 @@ impl Job {
}); });
} }
let content_text = format!("{} {}", tag, content); let content_text = format!("{tag} {content}");
// PI content must NOT be escaped // PI content must NOT be escaped
// https://www.w3.org/TR/xml/#sec-pi // https://www.w3.org/TR/xml/#sec-pi
let pi_content = BytesPI::new(content_text.as_str()); let pi_content = BytesPI::new(content_text.as_str());
@ -428,8 +427,7 @@ impl Job {
from_type: Type::record().to_string(), from_type: Type::record().to_string(),
span: tag_span, span: tag_span,
help: Some(format!( help: Some(format!(
"Incorrect tag name {}, tag name can not start with ! or ?", "Incorrect tag name {tag}, tag name can not start with ! or ?"
tag
)), )),
}); });
} }
@ -540,14 +538,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{tag: note attributes: {} content : [{tag: remember attributes: {} content : [{tag: null attributes: null content : Event}]}]} | to xml | metadata | get content_type"; let cmd = "{tag: note attributes: {} content : [{tag: remember attributes: {} content : [{tag: null attributes: null content : Event}]}]} | to xml | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record(record!("content_type" => Value::test_string("application/xml"))), Value::test_string("application/xml"),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }

View File

@ -233,14 +233,14 @@ mod test {
.merge_delta(delta) .merge_delta(delta)
.expect("Error merging delta"); .expect("Error merging delta");
let cmd = "{a: 1 b: 2} | to yaml | metadata | get content_type"; let cmd = "{a: 1 b: 2} | to yaml | metadata | get content_type | $in";
let result = eval_pipeline_without_terminal_expression( let result = eval_pipeline_without_terminal_expression(
cmd, cmd,
std::env::temp_dir().as_ref(), std::env::temp_dir().as_ref(),
&mut engine_state, &mut engine_state,
); );
assert_eq!( assert_eq!(
Value::test_record(record!("content_type" => Value::test_string("application/yaml"))), Value::test_string("application/yaml"),
result.expect("There should be a result") result.expect("There should be a result")
); );
} }

View File

@ -211,7 +211,7 @@ fn parse_closure_result(
} else { } else {
let error = ShellError::GenericError { let error = ShellError::GenericError {
error: "Invalid block return".into(), error: "Invalid block return".into(),
msg: format!("Unexpected record key '{}'", k), msg: format!("Unexpected record key '{k}'"),
span: Some(span), span: Some(span),
help: None, help: None,
inner: vec![], inner: vec![],

View File

@ -1,8 +1,5 @@
use crate::help::{help_aliases, help_commands, help_modules}; use crate::help::{help_aliases, help_commands, help_modules};
use fancy_regex::{Regex, escape};
use nu_ansi_term::Style;
use nu_engine::command_prelude::*; use nu_engine::command_prelude::*;
use nu_utils::IgnoreCaseExt;
#[derive(Clone)] #[derive(Clone)]
pub struct Help; pub struct Help;
@ -125,132 +122,3 @@ You can also learn more at https://www.nushell.sh/book/"#;
] ]
} }
} }
pub fn highlight_search_in_table(
table: Vec<Value>, // list of records
search_string: &str,
searched_cols: &[&str],
string_style: &Style,
highlight_style: &Style,
) -> Result<Vec<Value>, ShellError> {
let orig_search_string = search_string;
let search_string = search_string.to_folded_case();
let mut matches = vec![];
for mut value in table {
let Value::Record {
val: ref mut record,
..
} = value
else {
return Err(ShellError::NushellFailedSpanned {
msg: "Expected record".to_string(),
label: format!("got {}", value.get_type()),
span: value.span(),
});
};
let has_match = record.to_mut().iter_mut().try_fold(
false,
|acc: bool, (col, val)| -> Result<bool, ShellError> {
if !searched_cols.contains(&col.as_str()) {
// don't search this column
return Ok(acc);
}
let span = val.span();
if let Value::String { val: s, .. } = val {
if s.to_folded_case().contains(&search_string) {
*val = Value::string(
highlight_search_string(
s,
orig_search_string,
string_style,
highlight_style,
)?,
span,
);
return Ok(true);
}
}
// column does not contain the searched string
// ignore non-string values
Ok(acc)
},
)?;
if has_match {
matches.push(value);
}
}
Ok(matches)
}
// Highlight the search string using ANSI escape sequences and regular expressions.
pub fn highlight_search_string(
haystack: &str,
needle: &str,
string_style: &Style,
highlight_style: &Style,
) -> Result<String, ShellError> {
let escaped_needle = escape(needle);
let regex_string = format!("(?i){escaped_needle}");
let regex = match Regex::new(&regex_string) {
Ok(regex) => regex,
Err(err) => {
return Err(ShellError::GenericError {
error: "Could not compile regex".into(),
msg: err.to_string(),
span: Some(Span::test_data()),
help: None,
inner: vec![],
});
}
};
// strip haystack to remove existing ansi style
let stripped_haystack = nu_utils::strip_ansi_string_unlikely(haystack.to_string());
let mut last_match_end = 0;
let mut highlighted = String::new();
for cap in regex.captures_iter(stripped_haystack.as_ref()) {
match cap {
Ok(capture) => {
let start = match capture.get(0) {
Some(acap) => acap.start(),
None => 0,
};
let end = match capture.get(0) {
Some(acap) => acap.end(),
None => 0,
};
highlighted.push_str(
&string_style
.paint(&stripped_haystack[last_match_end..start])
.to_string(),
);
highlighted.push_str(
&highlight_style
.paint(&stripped_haystack[start..end])
.to_string(),
);
last_match_end = end;
}
Err(e) => {
return Err(ShellError::GenericError {
error: "Error with regular expression capture".into(),
msg: e.to_string(),
span: None,
help: None,
inner: vec![],
});
}
}
}
highlighted.push_str(
&string_style
.paint(&stripped_haystack[last_match_end..])
.to_string(),
);
Ok(highlighted)
}

View File

@ -1,6 +1,5 @@
use crate::help::highlight_search_in_table; use crate::filters::find_internal;
use nu_color_config::StyleComputer; use nu_engine::{command_prelude::*, get_full_help, scope::ScopeData};
use nu_engine::{command_prelude::*, scope::ScopeData};
#[derive(Clone)] #[derive(Clone)]
pub struct HelpAliases; pub struct HelpAliases;
@ -72,31 +71,20 @@ pub fn help_aliases(
let find: Option<Spanned<String>> = call.get_flag(engine_state, stack, "find")?; let find: Option<Spanned<String>> = call.get_flag(engine_state, stack, "find")?;
let rest: Vec<Spanned<String>> = call.rest(engine_state, stack, 0)?; let rest: Vec<Spanned<String>> = call.rest(engine_state, stack, 0)?;
// 🚩The following two-lines are copied from filters/find.rs:
let style_computer = StyleComputer::from_config(engine_state, stack);
// Currently, search results all use the same style.
// Also note that this sample string is passed into user-written code (the closure that may or may not be
// defined for "string").
let string_style = style_computer.compute("string", &Value::string("search result", head));
let highlight_style =
style_computer.compute("search_result", &Value::string("search result", head));
if let Some(f) = find { if let Some(f) = find {
let all_cmds_vec = build_help_aliases(engine_state, stack, head); let all_cmds_vec = build_help_aliases(engine_state, stack, head);
let found_cmds_vec = highlight_search_in_table( return find_internal(
all_cmds_vec, all_cmds_vec,
engine_state,
stack,
&f.item, &f.item,
&["name", "description"], &["name", "description"],
&string_style, true,
&highlight_style, );
)?;
return Ok(Value::list(found_cmds_vec, head).into_pipeline_data());
} }
if rest.is_empty() { if rest.is_empty() {
let found_cmds_vec = build_help_aliases(engine_state, stack, head); Ok(build_help_aliases(engine_state, stack, head))
Ok(Value::list(found_cmds_vec, head).into_pipeline_data())
} else { } else {
let mut name = String::new(); let mut name = String::new();
@ -113,50 +101,25 @@ pub fn help_aliases(
}); });
}; };
let Some(alias) = engine_state.get_decl(alias).as_alias() else { let alias = engine_state.get_decl(alias);
if alias.as_alias().is_none() {
return Err(ShellError::AliasNotFound { return Err(ShellError::AliasNotFound {
span: Span::merge_many(rest.iter().map(|s| s.span)), span: Span::merge_many(rest.iter().map(|s| s.span)),
}); });
}; };
let alias_expansion = let help = get_full_help(alias, engine_state, stack);
String::from_utf8_lossy(engine_state.get_span_contents(alias.wrapped_call.span));
let description = alias.description();
let extra_desc = alias.extra_description();
// TODO: merge this into documentation.rs at some point Ok(Value::string(help, call.head).into_pipeline_data())
const G: &str = "\x1b[32m"; // green
const C: &str = "\x1b[36m"; // cyan
const RESET: &str = "\x1b[0m"; // reset
let mut long_desc = String::new();
long_desc.push_str(description);
long_desc.push_str("\n\n");
if !extra_desc.is_empty() {
long_desc.push_str(extra_desc);
long_desc.push_str("\n\n");
}
long_desc.push_str(&format!("{G}Alias{RESET}: {C}{name}{RESET}"));
long_desc.push_str("\n\n");
long_desc.push_str(&format!("{G}Expansion{RESET}:\n {alias_expansion}"));
let config = stack.get_config(engine_state);
if !config.use_ansi_coloring.get(engine_state) {
long_desc = nu_utils::strip_ansi_string_likely(long_desc);
}
Ok(Value::string(long_desc, call.head).into_pipeline_data())
} }
} }
fn build_help_aliases(engine_state: &EngineState, stack: &Stack, span: Span) -> Vec<Value> { fn build_help_aliases(engine_state: &EngineState, stack: &Stack, span: Span) -> PipelineData {
let mut scope_data = ScopeData::new(engine_state, stack); let mut scope_data = ScopeData::new(engine_state, stack);
scope_data.populate_decls(); scope_data.populate_decls();
scope_data.collect_aliases(span) Value::list(scope_data.collect_aliases(span), span).into_pipeline_data()
} }
#[cfg(test)] #[cfg(test)]
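
A hedged sketch of the new code path: named alias lookups now go through `get_full_help` instead of hand-built ANSI output (the alias name is illustrative):

```nushell
alias ll = ls -l
help aliases ll
# => standard help page for the alias, including its expansion
```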

View File

@@ -1,5 +1,4 @@
-use crate::help::highlight_search_in_table;
-use nu_color_config::StyleComputer;
+use crate::filters::find_internal;
 use nu_engine::{command_prelude::*, get_full_help};

 #[derive(Clone)]
@@ -52,31 +51,20 @@ pub fn help_commands(
     let find: Option<Spanned<String>> = call.get_flag(engine_state, stack, "find")?;
     let rest: Vec<Spanned<String>> = call.rest(engine_state, stack, 0)?;

-    // 🚩The following two-lines are copied from filters/find.rs:
-    let style_computer = StyleComputer::from_config(engine_state, stack);
-    // Currently, search results all use the same style.
-    // Also note that this sample string is passed into user-written code (the closure that may or may not be
-    // defined for "string").
-    let string_style = style_computer.compute("string", &Value::string("search result", head));
-    let highlight_style =
-        style_computer.compute("search_result", &Value::string("search result", head));
     if let Some(f) = find {
         let all_cmds_vec = build_help_commands(engine_state, head);
-        let found_cmds_vec = highlight_search_in_table(
+        return find_internal(
             all_cmds_vec,
+            engine_state,
+            stack,
             &f.item,
             &["name", "description", "search_terms"],
-            &string_style,
-            &highlight_style,
-        )?;
-        return Ok(Value::list(found_cmds_vec, head).into_pipeline_data());
+            true,
+        );
     }

     if rest.is_empty() {
-        let found_cmds_vec = build_help_commands(engine_state, head);
-        Ok(Value::list(found_cmds_vec, head).into_pipeline_data())
+        Ok(build_help_commands(engine_state, head))
     } else {
         let mut name = String::new();
@@ -99,7 +87,7 @@ pub fn help_commands(
     }
 }

-fn build_help_commands(engine_state: &EngineState, span: Span) -> Vec<Value> {
+fn build_help_commands(engine_state: &EngineState, span: Span) -> PipelineData {
     let commands = engine_state.get_decls_sorted(false);
     let mut found_cmds_vec = Vec::new();
@@ -156,7 +144,7 @@ fn build_help_commands(engine_state: &EngineState, span: Span) -> Vec<Value> {
         for named_param in &sig.named {
             let name = if let Some(short) = named_param.short {
                 if named_param.long.is_empty() {
-                    format!("-{}", short)
+                    format!("-{short}")
                 } else {
                     format!("--{}(-{})", named_param.long, short)
                 }
@@ -215,7 +203,7 @@ fn build_help_commands(engine_state: &EngineState, span: Span) -> Vec<Value> {
         found_cmds_vec.push(Value::record(record, span));
     }

-    found_cmds_vec
+    Value::list(found_cmds_vec, span).into_pipeline_data()
 }

 #[cfg(test)]
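Alongside the search change, every `build_help_*` helper now returns `PipelineData` instead of `Vec<Value>`, so the wrapping happens once inside the helper rather than at each call site. A minimal, self-contained sketch of that conversion, assuming only the published `nu-protocol` API:

```rust
use nu_protocol::{IntoPipelineData, PipelineData, Span, Value};

// Same one-step conversion the diff uses: wrap the collected rows in a list
// Value and hand it back as PipelineData.
fn rows_into_pipeline(rows: Vec<Value>, span: Span) -> PipelineData {
    Value::list(rows, span).into_pipeline_data()
}

fn main() {
    let span = Span::unknown();
    let rows = vec![Value::string("example help row", span)];
    let _data = rows_into_pipeline(rows, span);
}
```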


@@ -1,5 +1,4 @@
-use crate::help::highlight_search_in_table;
-use nu_color_config::StyleComputer;
+use crate::filters::find_internal;
 use nu_engine::{command_prelude::*, get_full_help, scope::ScopeData};

 #[derive(Clone)]
@@ -72,31 +71,20 @@ pub fn help_externs(
     let find: Option<Spanned<String>> = call.get_flag(engine_state, stack, "find")?;
     let rest: Vec<Spanned<String>> = call.rest(engine_state, stack, 0)?;

-    // 🚩The following two-lines are copied from filters/find.rs:
-    let style_computer = StyleComputer::from_config(engine_state, stack);
-    // Currently, search results all use the same style.
-    // Also note that this sample string is passed into user-written code (the closure that may or may not be
-    // defined for "string").
-    let string_style = style_computer.compute("string", &Value::string("search result", head));
-    let highlight_style =
-        style_computer.compute("search_result", &Value::string("search result", head));
     if let Some(f) = find {
         let all_cmds_vec = build_help_externs(engine_state, stack, head);
-        let found_cmds_vec = highlight_search_in_table(
+        return find_internal(
             all_cmds_vec,
+            engine_state,
+            stack,
             &f.item,
             &["name", "description"],
-            &string_style,
-            &highlight_style,
-        )?;
-        return Ok(Value::list(found_cmds_vec, head).into_pipeline_data());
+            true,
+        );
     }

     if rest.is_empty() {
-        let found_cmds_vec = build_help_externs(engine_state, stack, head);
-        Ok(Value::list(found_cmds_vec, head).into_pipeline_data())
+        Ok(build_help_externs(engine_state, stack, head))
     } else {
         let mut name = String::new();
@@ -119,10 +107,10 @@ pub fn help_externs(
     }
 }

-fn build_help_externs(engine_state: &EngineState, stack: &Stack, span: Span) -> Vec<Value> {
+fn build_help_externs(engine_state: &EngineState, stack: &Stack, span: Span) -> PipelineData {
     let mut scope = ScopeData::new(engine_state, stack);
     scope.populate_decls();
-    scope.collect_externs(span)
+    Value::list(scope.collect_externs(span), span).into_pipeline_data()
 }

 #[cfg(test)]


@@ -1,5 +1,4 @@
-use crate::help::highlight_search_in_table;
-use nu_color_config::StyleComputer;
+use crate::filters::find_internal;
 use nu_engine::{command_prelude::*, scope::ScopeData};
 use nu_protocol::DeclId;
@@ -79,31 +78,20 @@ pub fn help_modules(
     let find: Option<Spanned<String>> = call.get_flag(engine_state, stack, "find")?;
     let rest: Vec<Spanned<String>> = call.rest(engine_state, stack, 0)?;

-    // 🚩The following two-lines are copied from filters/find.rs:
-    let style_computer = StyleComputer::from_config(engine_state, stack);
-    // Currently, search results all use the same style.
-    // Also note that this sample string is passed into user-written code (the closure that may or may not be
-    // defined for "string").
-    let string_style = style_computer.compute("string", &Value::string("search result", head));
-    let highlight_style =
-        style_computer.compute("search_result", &Value::string("search result", head));
     if let Some(f) = find {
         let all_cmds_vec = build_help_modules(engine_state, stack, head);
-        let found_cmds_vec = highlight_search_in_table(
+        return find_internal(
             all_cmds_vec,
+            engine_state,
+            stack,
             &f.item,
             &["name", "description"],
-            &string_style,
-            &highlight_style,
-        )?;
-        return Ok(Value::list(found_cmds_vec, head).into_pipeline_data());
+            true,
+        );
     }

     if rest.is_empty() {
-        let found_cmds_vec = build_help_modules(engine_state, stack, head);
-        Ok(Value::list(found_cmds_vec, head).into_pipeline_data())
+        Ok(build_help_modules(engine_state, stack, head))
     } else {
         let mut name = String::new();
@@ -239,11 +227,11 @@ pub fn help_modules(
     }
 }

-fn build_help_modules(engine_state: &EngineState, stack: &Stack, span: Span) -> Vec<Value> {
+fn build_help_modules(engine_state: &EngineState, stack: &Stack, span: Span) -> PipelineData {
     let mut scope_data = ScopeData::new(engine_state, stack);
     scope_data.populate_modules();
-    scope_data.collect_modules(span)
+    Value::list(scope_data.collect_modules(span), span).into_pipeline_data()
 }

 #[cfg(test)]


@@ -68,6 +68,29 @@ impl Command for HelpOperators {
         ]
         .into_iter()
         .map(|op| {
+            if op == Operator::Comparison(Comparison::RegexMatch) {
+                Value::record(
+                    record! {
+                        "type" => Value::string(op_type(&op), head),
+                        "operator" => Value::string("=~, like", head),
+                        "name" => Value::string(name(&op), head),
+                        "description" => Value::string(description(&op), head),
+                        "precedence" => Value::int(op.precedence().into(), head),
+                    },
+                    head,
+                )
+            } else if op == Operator::Comparison(Comparison::NotRegexMatch) {
+                Value::record(
+                    record! {
+                        "type" => Value::string(op_type(&op), head),
+                        "operator" => Value::string("!~, not-like", head),
+                        "name" => Value::string(name(&op), head),
+                        "description" => Value::string(description(&op), head),
+                        "precedence" => Value::int(op.precedence().into(), head),
+                    },
+                    head,
+                )
+            } else {
             Value::record(
                 record! {
                     "type" => Value::string(op_type(&op), head),
@@ -78,6 +101,7 @@ impl Command for HelpOperators {
                 },
                 head,
             )
+            }
         })
         .collect::<Vec<_>>();
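The two new branches duplicate the whole `record!` block just to override the `operator` column for `=~` and `!~`. A hypothetical, more compact variant (not the code in this PR) would compute only the differing column first; it assumes the same local `op_type`/`name`/`description` helpers the file already defines and a `to_string()` display form for the default case:

```rust
use nu_protocol::ast::{Comparison, Operator};
use nu_protocol::{record, Span, Value};

// Hypothetical refactor sketch: pick the display string up front, then build a
// single record instead of three near-identical ones.
fn operator_row(op: &Operator, head: Span) -> Value {
    let shown = match op {
        Operator::Comparison(Comparison::RegexMatch) => "=~, like".to_string(),
        Operator::Comparison(Comparison::NotRegexMatch) => "!~, not-like".to_string(),
        _ => op.to_string(), // assumed fallback to the operator's normal display form
    };
    Value::record(
        record! {
            "operator" => Value::string(shown, head),
            // "type", "name", "description" and "precedence" would be filled in
            // exactly as in the existing record! block.
        },
        head,
    )
}
```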


@@ -16,7 +16,6 @@ pub use help_modules::HelpModules;
 pub use help_operators::HelpOperators;
 pub use help_pipe_and_redirect::HelpPipeAndRedirect;

-pub(crate) use help_::highlight_search_in_table;
 pub(crate) use help_aliases::help_aliases;
 pub(crate) use help_commands::help_commands;
 pub(crate) use help_modules::help_modules;


@@ -80,9 +80,9 @@ pub fn http_parse_url(
 ) -> Result<(String, Url), ShellError> {
     let mut requested_url = raw_url.coerce_into_string()?;
     if requested_url.starts_with(':') {
-        requested_url = format!("http://localhost{}", requested_url);
+        requested_url = format!("http://localhost{requested_url}");
     } else if !requested_url.contains("://") {
-        requested_url = format!("http://{}", requested_url);
+        requested_url = format!("http://{requested_url}");
     }

     let url = match url::Url::parse(&requested_url) {
@@ -382,8 +382,7 @@ fn send_multipart_request(
                     "Content-Type: application/octet-stream".to_string(),
                     "Content-Transfer-Encoding: binary".to_string(),
                     format!(
-                        "Content-Disposition: form-data; name=\"{}\"; filename=\"{}\"",
-                        col, col
+                        "Content-Disposition: form-data; name=\"{col}\"; filename=\"{col}\""
                     ),
                     format!("Content-Length: {}", val.len()),
                 ];
@@ -391,7 +390,7 @@ fn send_multipart_request(
                     .add(&mut Cursor::new(val), &headers.join("\r\n"))
                     .map_err(err)?;
             } else {
-                let headers = format!(r#"Content-Disposition: form-data; name="{}""#, col);
+                let headers = format!(r#"Content-Disposition: form-data; name="{col}""#);
                 builder
                     .add(val.coerce_into_string()?.as_bytes(), &headers)
                     .map_err(err)?;
@@ -400,7 +399,7 @@ fn send_multipart_request(
         builder.finish();
         let (boundary, data) = (builder.boundary, builder.data);
-        let content_type = format!("multipart/form-data; boundary={}", boundary);
+        let content_type = format!("multipart/form-data; boundary={boundary}");

         move || req.set("Content-Type", &content_type).send_bytes(&data)
     }
@@ -703,8 +702,7 @@ fn transform_response_using_content_type(
         .expect("Failed to parse content type, and failed to default to text/plain");
     let ext = match (content_type.type_(), content_type.subtype()) {
-        (mime::TEXT, mime::PLAIN) => {
-            let path_extension = url::Url::parse(requested_url)
+        (mime::TEXT, mime::PLAIN) => url::Url::parse(requested_url)
             .map_err(|err| {
                 LabeledError::new(err.to_string())
                     .with_help("cannot parse")
@@ -720,9 +718,7 @@ fn transform_response_using_content_type(
                     PathBuf::from(name)
                         .extension()
                         .map(|name| name.to_string_lossy().to_string())
-                });
-            path_extension
-        }
+                }),
         _ => Some(content_type.subtype().to_string()),
     };
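Most hunks in this file are mechanical clippy `uninlined_format_args` fixes: bare identifiers move inside the braces, while expressions such as `val.len()` have to stay as positional arguments. A tiny standalone illustration:

```rust
fn main() {
    let requested_url = "example.com";
    // Before: format!("http://{}", requested_url)
    let url = format!("http://{requested_url}");

    let body = [0u8; 3];
    // Method calls cannot be inlined into the braces, so this form is unchanged.
    let content_length = format!("Content-Length: {}", body.len());

    println!("{url}\n{content_length}");
}
```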


@@ -141,7 +141,7 @@ struct Arguments {
     redirect: Option<Spanned<String>>,
 }

-fn run_get(
+pub fn run_get(
     engine_state: &EngineState,
     stack: &mut Stack,
     call: &Call,


@@ -1,5 +1,8 @@
 use nu_engine::{command_prelude::*, get_full_help};

+use super::get::run_get;
+use super::post::run_post;
+
 #[derive(Clone)]
 pub struct Http;
@@ -10,7 +13,76 @@ impl Command for Http {
     fn signature(&self) -> Signature {
         Signature::build("http")
-            .input_output_types(vec![(Type::Nothing, Type::String)])
+            .input_output_types(vec![(Type::Nothing, Type::Any)])
+            // common to get more than help. Get by default
+            .optional(
+                "URL",
+                SyntaxShape::String,
+                "The URL to fetch the contents from.",
+            )
+            // post
+            .optional(
+                "data",
+                SyntaxShape::Any,
+                "The contents of the post body. Required unless part of a pipeline.",
+            )
+            .named(
+                "content-type",
+                SyntaxShape::Any,
+                "the MIME type of content to post",
+                Some('t'),
+            )
+            // common
+            .named(
+                "user",
+                SyntaxShape::Any,
+                "the username when authenticating",
+                Some('u'),
+            )
+            .named(
+                "password",
+                SyntaxShape::Any,
+                "the password when authenticating",
+                Some('p'),
+            )
+            .named(
+                "max-time",
+                SyntaxShape::Duration,
+                "max duration before timeout occurs",
+                Some('m'),
+            )
+            .named(
+                "headers",
+                SyntaxShape::Any,
+                "custom headers you want to add ",
+                Some('H'),
+            )
+            .switch(
+                "raw",
+                "fetch contents as text rather than a table",
+                Some('r'),
+            )
+            .switch(
+                "insecure",
+                "allow insecure server connections when using SSL",
+                Some('k'),
+            )
+            .switch(
+                "full",
+                "returns the full response instead of only the body",
+                Some('f'),
+            )
+            .switch(
+                "allow-errors",
+                "do not fail if the server returns an error code",
+                Some('e'),
+            )
+            .named(
+                "redirect-mode",
+                SyntaxShape::String,
+                "What to do when encountering redirects. Default: 'follow'. Valid options: 'follow' ('f'), 'manual' ('m'), 'error' ('e').",
+                Some('R')
+            )
             .category(Category::Network)
     }
@@ -19,7 +91,7 @@ impl Command for Http {
     }

     fn extra_description(&self) -> &str {
-        "You must use one of the following subcommands. Using this command as-is will only produce this help message."
+        "Without a subcommand but with a URL provided, it performs a GET request by default or a POST request if data is provided. You can use one of the following subcommands. Using this command as-is will only display this help message."
     }

     fn search_terms(&self) -> Vec<&str> {
@@ -33,8 +105,41 @@ impl Command for Http {
         engine_state: &EngineState,
         stack: &mut Stack,
         call: &Call,
-        _input: PipelineData,
+        input: PipelineData,
     ) -> Result<PipelineData, ShellError> {
-        Ok(Value::string(get_full_help(self, engine_state, stack), call.head).into_pipeline_data())
+        let url = call.opt::<Value>(engine_state, stack, 0)?;
+        let data = call.opt::<Value>(engine_state, stack, 1)?;
+
+        match (url.is_some(), data.is_some()) {
+            (true, true) => run_post(engine_state, stack, call, input),
+            (true, false) => run_get(engine_state, stack, call, input),
+            (false, true) => Err(ShellError::NushellFailed {
+                msg: (String::from("Default verb is get with a payload. Impossible state")),
+            }),
+            (false, false) => Ok(Value::string(
+                get_full_help(self, engine_state, stack),
+                call.head,
+            )
+            .into_pipeline_data()),
+        }
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![
+            Example {
+                description: "Get content from example.com with default verb",
+                example: "http https://www.example.com",
+                result: None,
+            },
+            Example {
+                description: "Post content to example.com with default verb",
+                example: "http https://www.example.com 'body'",
+                result: None,
+            },
+            Example {
+                description: "Get content from example.com with explicit verb",
+                example: "http get https://www.example.com",
+                result: None,
+            },
+        ]
     }
 }
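A standalone sketch of the new dispatch: positional arguments fill left to right, so a body (positional 1) without a URL (positional 0) should be unreachable, which is presumably why that case maps to an internal `NushellFailed` error rather than a user-facing one. The strings below are placeholders for the real `run_get`/`run_post`/help paths:

```rust
// Placeholder dispatch mirroring the match added in this diff.
fn dispatch(url: Option<&str>, data: Option<&str>) -> &'static str {
    match (url.is_some(), data.is_some()) {
        (true, true) => "POST",         // http <url> <body>
        (true, false) => "GET",         // http <url>
        (false, true) => "unreachable", // body without a URL
        (false, false) => "show help",  // bare `http`
    }
}

fn main() {
    assert_eq!(dispatch(Some("https://www.example.com"), None), "GET");
    assert_eq!(dispatch(Some("https://www.example.com"), Some("body")), "POST");
    assert_eq!(dispatch(None, None), "show help");
}
```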


@@ -161,7 +161,7 @@ struct Arguments {
     redirect: Option<Spanned<String>>,
 }

-fn run_post(
+pub fn run_post(
     engine_state: &EngineState,
     stack: &mut Stack,
     call: &Call,


@@ -64,7 +64,7 @@ impl Registry for NuShellNightly {
             tag_name: String,
         }

-        let url = format!("https://api.github.com/repos/{}/releases", pkg);
+        let url = format!("https://api.github.com/repos/{pkg}/releases");
         let versions = http_client
             .add_header("Accept", "application/vnd.github.v3+json")
             .add_header("User-Agent", "update-informer")
@@ -128,7 +128,7 @@ pub fn check_for_latest_nushell_version() -> Value {
         if let Ok(Some(new_version)) = informer.check_version() {
             rec.push("current", Value::test_bool(false));
-            rec.push("latest", Value::test_string(format!("{}", new_version)));
+            rec.push("latest", Value::test_string(format!("{new_version}")));
             Value::test_record(rec)
         } else {
             rec.push("current", Value::test_bool(true));
@@ -148,7 +148,7 @@ pub fn check_for_latest_nushell_version() -> Value {
         if let Ok(Some(new_version)) = informer.check_version() {
             rec.push("current", Value::test_bool(false));
-            rec.push("latest", Value::test_string(format!("{}", new_version)));
+            rec.push("latest", Value::test_string(format!("{new_version}")));
             Value::test_record(rec)
         } else {
             rec.push("current", Value::test_bool(true));


@@ -208,7 +208,7 @@ impl EventTypeFilter {
     fn wrong_type_error(head: Span, val: &str, val_span: Span) -> ShellError {
         ShellError::UnsupportedInput {
-            msg: format!("{} is not a valid event type", val),
+            msg: format!("{val} is not a valid event type"),
             input: "value originates from here".into(),
             msg_span: head,
             input_span: val_span,


@@ -1,4 +1,3 @@
-mod ansi;
 mod clear;
 mod dir_info;
 mod input;
@@ -10,7 +9,6 @@ mod term;
 mod ulimit;
 mod whoami;

-pub use ansi::{Ansi, AnsiLink, AnsiStrip};
 pub use clear::Clear;
 pub use dir_info::{DirBuilder, DirInfo, FileInfo};
 pub use input::Input;


@@ -1,6 +1,7 @@
 use nu_engine::command_prelude::*;
 use nu_protocol::ListStream;
 use rand::random_range;
+use std::num::NonZeroUsize;

 #[derive(Clone)]
 pub struct RandomDice;
@@ -70,8 +71,16 @@ fn dice(
 ) -> Result<PipelineData, ShellError> {
     let span = call.head;

-    let dice: usize = call.get_flag(engine_state, stack, "dice")?.unwrap_or(1);
-    let sides: usize = call.get_flag(engine_state, stack, "sides")?.unwrap_or(6);
+    let sides: NonZeroUsize = call
+        .get_flag(engine_state, stack, "sides")?
+        .unwrap_or_else(|| NonZeroUsize::new(6).expect("default sides must be non-zero"));
+    let dice: NonZeroUsize = call
+        .get_flag(engine_state, stack, "dice")?
+        .unwrap_or_else(|| NonZeroUsize::new(1).expect("default dice count must be non-zero"));
+
+    let sides = sides.get();
+    let dice = dice.get();

     let iter = (0..dice).map(move |_| Value::int(random_range(1..sides + 1) as i64, span));
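A minimal sketch of the flag-type change: parsing `--dice`/`--sides` as `NonZeroUsize` rejects zero during argument conversion, so the roll below never sees an empty range. It assumes the `rand::random_range` function the file already imports (rand 0.9); everything else is standard library:

```rust
use std::num::NonZeroUsize;

fn roll(dice: Option<NonZeroUsize>, sides: Option<NonZeroUsize>) -> Vec<i64> {
    // Defaults mirror the diff: one die with six sides.
    let dice = dice.unwrap_or_else(|| NonZeroUsize::new(1).expect("non-zero")).get();
    let sides = sides.unwrap_or_else(|| NonZeroUsize::new(6).expect("non-zero")).get();
    // `1..sides + 1` behaves like an inclusive 1..=sides roll.
    (0..dice)
        .map(|_| rand::random_range(1..sides + 1) as i64)
        .collect()
}

fn main() {
    println!("{:?}", roll(None, None)); // a single d6
}
```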


@@ -135,8 +135,7 @@ fn uuid(
         _ => {
             return Err(ShellError::IncorrectValue {
                 msg: format!(
-                    "Unsupported UUID version: {}. Supported versions are 1, 3, 4, 5, and 7.",
-                    version
+                    "Unsupported UUID version: {version}. Supported versions are 1, 3, 4, 5, and 7."
                 ),
                 val_span: span,
                 call_span: span,
@@ -190,8 +189,7 @@ fn validate_flags(
         if v != 4 && v != 7 {
             return Err(ShellError::IncorrectValue {
                 msg: format!(
-                    "Unsupported UUID version: {}. Supported versions are 1, 3, 4, 5, and 7.",
-                    v
+                    "Unsupported UUID version: {v}. Supported versions are 1, 3, 4, 5, and 7."
                 ),
                 val_span: span,
                 call_span: span,
@@ -202,7 +200,7 @@ fn validate_flags(
             .is_some()
         {
             return Err(ShellError::IncompatibleParametersSingle {
-                msg: format!("version {} uuid does not take mac as a parameter", v),
+                msg: format!("version {v} uuid does not take mac as a parameter"),
                 span,
             });
         }
@@ -211,7 +209,7 @@ fn validate_flags(
             .is_some()
         {
             return Err(ShellError::IncompatibleParametersSingle {
-                msg: format!("version {} uuid does not take namespace as a parameter", v),
+                msg: format!("version {v} uuid does not take namespace as a parameter"),
                 span,
             });
         }
@@ -220,7 +218,7 @@ fn validate_flags(
             .is_some()
         {
             return Err(ShellError::IncompatibleParametersSingle {
-                msg: format!("version {} uuid does not take name as a parameter", v),
+                msg: format!("version {v} uuid does not take name as a parameter"),
                 span,
             });
         }

Some files were not shown because too many files have changed in this diff.