# Description
Bug fix: `PipelineData::check_external_failed()` was not preserving the
original `type_` and `known_size` attributes of streams that come from
child processes, so `external-command | into binary` did not work
properly and the stream always ended up with an unknown type.
# User-Facing Changes
The following test case now works as expected:
```nushell
> head -c 2 /dev/urandom | into binary
# Expected: pretty hex dump of binary
# Previous behavior: just raw binary in the terminal
```
# Tests + Formatting
Added a test to `into binary` covering this.
This reverts commit 0cfd5fbece.
The original PR broke syntax highlighting of aliases and caused
completion panics in the presence of aliases.
# Description
Part of https://github.com/nushell/nushell/issues/12963, step 2.
This PR refactors `Call` and related argument structures to remove their
dependency on `Expression::span`, which will be removed in the future.
# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->
Should be none. If you see any error messages that look broken, please
report them.
# Tests + Formatting
# After Submitting
# Description
Provides the ability to use http commands as part of a pipeline.
Additionally, this pull request extends the pipeline metadata to add a
content_type field. The content_type metadata field allows commands such
as `to json` to set the content type in the pipeline, letting the http
commands use it when making requests.
This pull request also introduces the ability to stream http requests
directly from streaming pipelines.
One other small change: Content-Type is now always set if it is passed
to the http commands, either indirectly or through the content-type
flag. Previously it was not preserved for requests that were not of type
JSON or form data.
# User-Facing Changes
* `http post`, `http put`, `http patch`, `http delete` can be used as
part of a pipeline
* `to text`, `to json`, and `from json` all set the content_type metadata
field, and the http commands utilize it when making requests.
# Description
Fixes: https://github.com/nushell/nushell/issues/12099
Currently, if a user runs `use voice.nu` while the file is unchanged, and
then runs `use voice.nu` again, nushell will reuse the cached module
directly, even if a submodule inside `voice.nu` has changed.
After discussing with @kubouch, I think it's ok to re-parse the module
file when:
1. It exports submodules which are defined by a file
2. It uses other modules which are defined by a file
## About the change:
To achieve this behavior, we need to add 2 attributes to `Module`:
1. `imported_modules`: tracks the other modules imported by the given
module, e.g., `use foo.nu`
2. `file`: the path of the module; if a module is defined by a file, it
will be `Some(path)`, otherwise it will be `None`.
After the change:
- `use voice.nu` always reads the file and parses it.
- `use voice` will still use the module saved in `EngineState`.
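A toy sketch of the resulting re-parse decision (field names follow the description above; this is not the actual `nu-protocol` `Module` type):
```rust
use std::path::PathBuf;

struct Module {
    /// Other modules this module `use`s (condition 2 above).
    imported_modules: Vec<Module>,
    /// Submodules this module exports (condition 1 above).
    exported_submodules: Vec<Module>,
    /// `Some(path)` if the module is defined by a file, otherwise `None`.
    file: Option<PathBuf>,
}

/// A cached module must be re-parsed if it exports or imports any
/// file-backed module, because those files may have changed on disk.
fn needs_reparse(module: &Module) -> bool {
    module
        .exported_submodules
        .iter()
        .chain(&module.imported_modules)
        .any(|m| m.file.is_some())
}
```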
# User-Facing Changes
`use xxx.nu` will read and parse the file if the module exports or uses
file-based submodules.
# Tests + Formatting
Done
---------
Co-authored-by: Jakub Žádník <kubouch@gmail.com>
# Description
This PR implements script and module autoloading. It does this by finding
the `$nu.vendor-autoload-dir`, listing its contents, and sorting them by
file name. These files are then evaluated in that order.
To see what's going on, you can use `--log-level warn`:
```
❯ cargo r -- --log-level warn
Finished dev [unoptimized + debuginfo] target(s) in 0.58s
Running `target\debug\nu.exe --log-level warn`
2024-06-24 09:23:20.494 PM [WARN ] nu::config_files: set_config_path() cwd: "C:\\Users\\fdncred\\source\\repos\\nushell", default_config: config.nu, key: config-path, config_file_specified: None
2024-06-24 09:23:20.495 PM [WARN ] nu::config_files: set_config_path() cwd: "C:\\Users\\fdncred\\source\\repos\\nushell", default_config: env.nu, key: env-path, config_file_specified: None
2024-06-24 09:23:20.629 PM [WARN ] nu::config_files: setup_config() config_file_specified: None, env_file_specified: None, login: false
2024-06-24 09:23:20.660 PM [WARN ] nu::config_files: read_config_file() config_file_specified: None, is_env_config: true
Hello, from env.nu
2024-06-24 09:23:20.679 PM [WARN ] nu::config_files: read_config_file() config_file_specified: None, is_env_config: false
Hello, from config.nu
Hello, from defs.nu
Activating Microsoft Visual Studio environment.
2024-06-24 09:23:21.398 PM [WARN ] nu::config_files: read_vendor_autoload_files() src\config_files.rs:234:9
2024-06-24 09:23:21.399 PM [WARN ] nu::config_files: read_vendor_autoload_files: C:\ProgramData\nushell\vendor\autoload
2024-06-24 09:23:21.399 PM [WARN ] nu::config_files: AutoLoading: "C:\\ProgramData\\nushell\\vendor\\autoload\\01_get-weather.nu"
2024-06-24 09:23:21.675 PM [WARN ] nu::config_files: AutoLoading: "C:\\ProgramData\\nushell\\vendor\\autoload\\02_temp.nu"
2024-06-24 09:23:21.817 PM [WARN ] nu_cli::repl: Terminal doesn't support use_kitty_protocol config
```
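The discovery/ordering step boils down to something like this std-only sketch (illustrative only; the real code lives in nu's config_files module and also evaluates each file after collecting them):
```rust
use std::{
    fs, io,
    path::{Path, PathBuf},
};

fn vendor_autoload_files(dir: &Path) -> io::Result<Vec<PathBuf>> {
    let mut files: Vec<PathBuf> = fs::read_dir(dir)?
        .filter_map(|entry| entry.ok().map(|e| e.path()))
        .collect();
    // Sorted by file name, which is the evaluation order shown in the log above.
    files.sort();
    Ok(files)
}
```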
# User-Facing Changes
# Tests + Formatting
# After Submitting
Fixes nushell/nushell#13207
# Description
This fixes the parsing of command usage when that command comes from a
file with CRLF line endings.
See nushell/nushell#13207 for more details.
# User-Facing Changes
Users on Windows will get correct autocompletion for `std` commands.
# Description
In #13031 I added the derive macros for `FromValue` and `IntoValue`. In
that implementation, in particular for structs with named fields, it was
not possible to omit fields while loading them from a value, when the
field is an `Option`. This PR adds extra handling for this behavior, so
if a field is an `Option` and that field is missing in the `Value`, then
the field becomes `None`. This behavior is also tested in
`nu_protocol::value::test_derive::missing_options`.
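A hand-written stand-in for what the derived code now tolerates (this is not the real macro expansion, and `Value` below is just a placeholder type):
```rust
use std::collections::HashMap;

type Value = String; // placeholder for nu_protocol::Value

struct Config {
    name: String,
    nickname: Option<String>,
}

fn from_record(record: &HashMap<String, Value>) -> Result<Config, String> {
    Ok(Config {
        // A required field still errors if it is missing.
        name: record.get("name").cloned().ok_or("missing field `name`")?,
        // An `Option<_>` field tolerates a missing key and becomes `None`.
        nickname: record.get("nickname").cloned(),
    })
}
```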
# User-Facing Changes
When using structs for options or similar, users can now simply omit
fields in the record, and the derived `from_value` method will understand
this if the struct defines that field as an `Option`.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
A showcase for this feature would be great, I tried to use the current
derive macro in a plugin of mine for a config but without this addition,
they are annoying to use. So, when this is done, I would add an example
for such plugin configs that may be loaded via `FromValue`.
# Description
This allows plugins to report their version (and potentially other
metadata in the future). The version is shown in `plugin list` and in
`version`.
The metadata is stored in the registry file, and reflects whatever was
retrieved on `plugin add`, not necessarily the running binary. This can
help you to diagnose if there's some kind of mismatch with what you
expect. We could potentially use this functionality to show a warning or
error if a plugin being run does not have the same version as what was
in the cache file, suggesting `plugin add` be run again, but I haven't
done that at this point.
It is optional, and it requires the plugin author to make some code
changes if they want to provide it, since I can't automatically
determine the version of the calling crate or anything tricky like that
to do it.
Example:
```
> plugin list | select name version is_running pid
╭───┬────────────────┬─────────┬────────────┬─────╮
│ # │ name │ version │ is_running │ pid │
├───┼────────────────┼─────────┼────────────┼─────┤
│ 0 │ example │ 0.93.1 │ false │ │
│ 1 │ gstat │ 0.93.1 │ false │ │
│ 2 │ inc │ 0.93.1 │ false │ │
│ 3 │ python_example │ 0.1.0 │ false │ │
╰───┴────────────────┴─────────┴────────────┴─────╯
```
cc @maxim-uvarov (he asked for it)
# User-Facing Changes
- `plugin list` gets a `version` column
- `version` shows plugin versions when available
- plugin authors *should* add `fn metadata()` to their `impl Plugin`,
but don't have to
# Tests + Formatting
Tested the low level stuff and also the `plugin list` column.
# After Submitting
- [ ] update plugin guide docs
- [ ] update plugin protocol docs (`Metadata` call & response)
- [ ] update plugin template (`fn metadata()` should be easy)
- [ ] release notes
# Description
This PR adds a directory to the `$nu` constant that shows where the
system-level autoload directory is located. This folder is modifiable at
compile time with environment variables.
```rust
// Create a system level directory for nushell scripts, modules, completions, etc.
// that can be changed by setting the NU_VENDOR_AUTOLOAD_DIR env var on any platform
// before nushell is compiled, OR, if NU_VENDOR_AUTOLOAD_DIR is not set on non-windows
// systems, the PREFIX env var can be set before compile and used as PREFIX/share/nushell/vendor/autoload
// pseudo code
// if env var NU_VENDOR_AUTOLOAD_DIR is set, on any platform, use it
// if not, on windows, use ALLUSERSPROFILE\nushell\vendor\autoload
// if not, on non-windows, if env var PREFIX is set, use PREFIX/share/nushell/vendor/autoload
// if not, use the default /usr/share/nushell/vendor/autoload
```
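Read as Rust, the pseudo code amounts to roughly the following (an illustrative sketch only, not the actual nu-protocol implementation):
```rust
fn default_vendor_autoload_dir() -> String {
    // NU_VENDOR_AUTOLOAD_DIR and PREFIX are read at compile time here,
    // matching the "before nushell is compiled" note above.
    if let Some(dir) = option_env!("NU_VENDOR_AUTOLOAD_DIR") {
        return dir.to_string();
    }
    if cfg!(windows) {
        // The real code bases this on ALLUSERSPROFILE; C:\ProgramData is its usual value.
        let base =
            std::env::var("ALLUSERSPROFILE").unwrap_or_else(|_| r"C:\ProgramData".to_string());
        return format!(r"{base}\nushell\vendor\autoload");
    }
    match option_env!("PREFIX") {
        Some(prefix) => format!("{prefix}/share/nushell/vendor/autoload"),
        None => "/usr/share/nushell/vendor/autoload".to_string(),
    }
}
```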
### Windows default
```nushell
❯ $nu.vendor-autoload-dir
C:\ProgramData\nushell\vendor\autoload
```
### Non-Windows default
```nushell
❯ $nu.vendor-autoload-dir
/usr/local/share/nushell/vendor/autoload
```
### Non-Windows with PREFIX set
```nushell
❯ PREFIX=/usr/bob cargo r
❯ $nu.vendor-autoload-dir
/usr/bob/share/nushell/vendor/autoload
```
### Non-Windows with NU_VENDOR_AUTOLOAD_DIR set
```nushell
❯ NU_VENDOR_AUTOLOAD_DIR=/some/other/path/nushell/stuff cargo r
❯ $nu.vendor-autoload-dir
/some/other/path/nushell/stuff
```
> [!IMPORTANT]
> To be clear, this PR does not do the auto-loading; it just sets up the
> folder to support that functionality, which can be added in a later PR.
> The PR also does not create the folder it defines. It only sets the `$nu`
> constant.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
We've had a lot of different issues and PRs related to arg handling with
externals since the rewrite of `run-external` in #12921:
- #12950
- #12955
- #13000
- #13001
- #13021
- #13027
- #13028
- #13073
Many of these are caused by the argument handling of external calls and
`run-external` being very special: the parser hands quoted strings over
to `run-external` so that it knows whether to expand tildes, globs, and
so on. This is really unusual, makes `run-external` harder to use, and
makes it harder to understand (and is probably part of the reason it was
rewritten in the first place).
This PR moves a lot more of that work over to the parser, so that by the
time `run-external` gets it, it's dealing with much more normal Nushell
values. In particular:
- Unquoted strings are handled as globs with no expand
- The unescaped-but-quoted handling of strings was removed, and the
parser constructs normal looking strings instead, removing internal
quotes so that `run-external` doesn't have to do it
- Bare word interpolation is now supported and expansion is done in this
case
- Expressions typed as `Glob` containing `Expr::StringInterpolation` now
produce `Value::Glob` instead, with the quoted status from the expr
passed through so we know if it was a bare word
- Bare word interpolation for values typed as `glob` now possible, but
not implemented
- Because expansion is now triggered by `Value::Glob(_, false)` instead
of looking at the expr, externals now support glob types
# User-Facing Changes
- Bare word interpolation works for external command options, and
otherwise embedded in other strings:
```nushell
^echo --foo=(2 + 2) # prints --foo=4
^echo -foo=$"(2 + 2)" # prints -foo=4
^echo foo="(2 + 2)" # prints (no interpolation!) foo=(2 + 2)
^echo foo,(2 + 2),bar # prints foo,4,bar
```
- Bare word interpolation expands for external command head/args:
```nushell
let name = "exa"
~/.cargo/bin/($name) # this works, and expands the tilde
^$"~/.cargo/bin/($name)" # this doesn't expand the tilde
^echo ~/($name)/* # this glob is expanded
^echo $"~/($name)/*" # this isn't expanded
```
- Ndots are now supported for the head of an external command
(`^.../foo` works)
- Glob values are now supported for head/args of an external command,
and expanded appropriately:
```nushell
^("~/.cargo/bin/exa" | into glob) # the tilde is expanded
^echo ("*.txt" | into glob) # this glob is expanded
```
- `run-external` now works more like any other command, without
expecting a special call convention
for its args:
```nushell
run-external echo "'foo'"
# before PR: 'foo'
# after PR: foo
run-external echo "*.txt"
# before PR: (glob is expanded)
# after PR: *.txt
```
# Tests + Formatting
Lots of tests added and cleaned up. Some tests that weren't active on
Windows were changed to use `nu --testbin cococo` so that they can work.
Added a Linux-only test to make sure tilde expansion of commands works,
because changing `HOME` there causes `~` to change reliably.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] release notes: make sure to mention the new syntaxes that are
supported
# Description
After discussing with @sholderbach the cumbersome usage of
`nu_protocol::Value` in Rust, I created a derive macro to simplify it.
I’ve added a new crate called `nu-derive-value`, which includes two
macros, `IntoValue` and `FromValue`. These are re-exported in
`nu-protocol`, and using them via that re-export is encouraged.
The macros ensure that all types can easily convert from and into
`Value`. For example, as a plugin author, you can define your plugin
configuration using a Rust struct and easily convert it using
`FromValue`. This makes plugin configuration less of a hassle.
I introduced the `IntoValue` trait for a standardized approach to
converting values into `Value` (and a fallible variant `TryIntoValue`).
This trait could potentially replace existing `into_value` methods.
Along with this, I've implemented `FromValue` for several standard types
and refined other implementations to use blanket implementations where
applicable.
I made these design choices with input from @devyn.
There are more improvements possible, but this is a solid start and the
PR is already quite substantial.
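A hedged usage sketch: the derive names and the `nu-protocol` re-export come from this PR, while the concrete field types and the conversion note in the comments are assumptions on my part:
```rust
use nu_protocol::{FromValue, IntoValue};

// A plugin-side configuration struct; deriving both traits lets it be
// converted from and into a `Value`.
#[derive(Debug, FromValue, IntoValue)]
struct PluginConfig {
    path: String,
    max_depth: i64,
    verbose: bool,
}

// The plugin would then use the `FromValue` conversion on the `Value` it
// receives for its configuration, and `IntoValue` for the reverse direction
// (see the trait docs for the exact method signatures).
```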
# User-Facing Changes
For `nu-protocol` users, these changes simplify the handling of
`Value`s. There are no changes for end-users of nushell itself.
# Tests + Formatting
Documenting the macros themselves is not really possible, as they cannot
really reference any other types since they are the root of the
dependency graph. The standard library has the same problem
([std::Debug](https://doc.rust-lang.org/stable/std/fmt/derive.Debug.html)).
However I documented the `FromValue` and `IntoValue` traits completely.
For testing, I made use of `proc-macro2` in the derive macro code. This
would allow testing the generated source code. Instead I just tested
that the derived functionality is correct. This is done in
`nu_protocol::value::test_derive`, as a consumer of `nu-derive-value`
needs to do the testing of the macro usage. I think that these tests
should provide a stable baseline so that users can be sure that the impl
works.
# After Submitting
With these macros available, we can probably use them in some plugin
examples to showcase their use.
# Description
This PR is an attempt to add a standard location for people to put
completions in. I saw this topic come up again recently and IIRC we
decided to create a standard location. I used the dirs-next crate to
dictate where these locations are. I know some people won't like that
but at least this gets the ball rolling in a direction that has a
standard directory.
This is what the default NU_LIB_DIRS looks like now in the
default_env.nu. It should also be like this when starting nushell with
`nu -n`
```nushell
$env.NU_LIB_DIRS = [
($nu.default-config-dir | path join 'scripts') # add <nushell-config-dir>/scripts
($nu.data-dir | path join 'completions') # default home for nushell completions
]
```
I also added these default folders to the `$nu` variable so now there is
`$nu.data-path` and `$nu.cache-path`.
## Data Dir Default
![image](https://github.com/nushell/nushell/assets/343840/aeeb7cd6-17b4-43e8-bb6f-986a0c7fce23)
While I was in there, I also decided to add a cache dir
## Cache Dir Default
![image](https://github.com/nushell/nushell/assets/343840/87dead66-4911-4f67-bfb2-acb16f386674)
### This is what the default looks like in Ubuntu.
![image](https://github.com/nushell/nushell/assets/343840/bca8eae8-8c18-47e8-b64f-3efe34f0004f)
### This is what it looks like with XDG_CACHE_HOME and XDG_DATA_HOME
overridden
```nushell
XDG_DATA_HOME=/tmp/data_home XDG_CACHE_HOME=/tmp/cache_home cargo r
```
![image](https://github.com/nushell/nushell/assets/343840/fae86d50-9821-41f1-868e-3814eca3730b)
### This is what the defaults look like in Windows (username scrubbed to
protect the innocent)
![image](https://github.com/nushell/nushell/assets/343840/3ebdb5cd-0150-448c-aff5-c57053e4788a)
How my NU_LIB_DIRS is set in the images above
```nushell
$env.NU_LIB_DIRS = [
($nu.default-config-dir | path join 'scripts') # add <nushell-config-dir>/scripts
'/Users/fdncred/src/nu_scripts'
($nu.config-path | path dirname)
($nu.data-dir | path join 'completions') # default home for nushell completions
]
```
Let the debate begin.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
#7777 removed the `--numbered` flag from `each`, `par-each`, `reduce`,
and `each while`. It was suggested at the time that it should be removed
from `for` as well, but for several reasons it wasn't.
This PR deprecates `--numbered` in anticipation of removing it in 0.96.
Note: Please review carefully, as this is my first "real" Rust/Nushell
code. I was hoping that some prior commit would be useful as a template,
but since this was an argument on a parser keyword, I didn't find much
that was useful. So I had to actually find the relevant helpers in the
code and the `nu_protocol` docs and learn how to use them - oh darn ;-)
But please make sure I did it correctly.
# User-Facing Changes
* Use of `--numbered` will result in a deprecation warning.
* Changed help example to demonstrate the new syntax.
* Help shows deprecation notice on the flag
# Description
Part of https://github.com/nushell/nushell/issues/12963, step 2.
This PR refactors the use of `expression.span` to
`expression.span_id` via a new helper `Expression::span()`. A new
`GetSpan` is added to abstract getting the span from both `EngineState`
and `StateWorkingSet`.
# User-Facing Changes
`format pattern` loses the ability to use variables in the pattern,
e.g., `... | format pattern 'value of {$it.name} is {$it.value}'`. This
is because the command did a custom parse-eval cycle, creating spans
that are not merged into the main engine state. We could clone the
engine state, add a Clone impl to StateDelta, and merge the cloned delta
into the cloned state, but IMO there is not much value in having this
ability, since we have string interpolation nowadays: `... | $"value of
($in.name) is ($in.value)"`.
# Tests + Formatting
# After Submitting
# Description
- this PR should close https://github.com/nushell/nushell/issues/12874
- fixes https://github.com/nushell/nushell/issues/12874

This fixes an issue induced by the fix for
https://github.com/nushell/nushell/issues/12369: that PR introduced a new
error on Unix systems in order to show coredump messages.
# User-Facing Changes
The coredump message reported in
https://github.com/nushell/nushell/issues/12874 is broken; this PR fixes it.
# Tests + Formatting
# After Submitting
![image](https://github.com/nushell/nushell/assets/60290287/6d8ab756-3031-4212-a5f5-5f71be3857f9)
---------
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
# Description
First part of the SpanID refactoring series. This PR adds a `SpanId` type
and a corresponding `span_id` field to `Expression`. When the parser
creates an expression, it now adds the span to an array in
`StateWorkingSet`, generates a corresponding ID, and saves that ID to the
`Expression`. The IDs are not used anywhere yet.
For the rough overall plan, see
https://github.com/nushell/nushell/issues/12963.
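A toy model of the ID-into-arena shape being introduced (the real `Span`, `SpanId`, and `StateWorkingSet` are in `nu-protocol`; this just shows the pattern):
```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Span {
    start: usize,
    end: usize,
}

#[derive(Clone, Copy, Debug, PartialEq)]
struct SpanId(usize);

#[derive(Default)]
struct WorkingSet {
    spans: Vec<Span>,
}

impl WorkingSet {
    /// Store a span and hand back its ID.
    fn add_span(&mut self, span: Span) -> SpanId {
        self.spans.push(span);
        SpanId(self.spans.len() - 1)
    }
    /// Resolve an ID back to the stored span.
    fn get_span(&self, id: SpanId) -> Span {
        self.spans[id.0]
    }
}

fn main() {
    let mut ws = WorkingSet::default();
    let id = ws.add_span(Span { start: 3, end: 9 });
    assert_eq!(ws.get_span(id), Span { start: 3, end: 9 });
}
```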
# User-Facing Changes
Hopefully none. This is only a refactor of Nushell's internals that
shouldn't have visible side effects.
# Tests + Formatting
# After Submitting
# Description
Currently, the pipeline `open --raw file | take 100` doesn't work,
since the type of the byte stream is `Unknown`, but `take` expects
`Binary` streams. This PR changes commands that expect
`ByteStreamType::Binary` to also work with `ByteStreamType::Unknown`.
This was done by adding two new methods to `ByteStreamType`:
`is_binary_coercible` and `is_string_coercible`. These return true if
the type is `Unknown` or matches the type in the method name.
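Written out from the description above (not copied from the source), the helpers look roughly like this:
```rust
#[derive(Clone, Copy, Debug, PartialEq)]
pub enum ByteStreamType {
    Binary,
    String,
    Unknown,
}

impl ByteStreamType {
    /// True for `Binary`, and for `Unknown`, which can be coerced to binary.
    pub fn is_binary_coercible(self) -> bool {
        matches!(self, ByteStreamType::Binary | ByteStreamType::Unknown)
    }

    /// True for `String`, and for `Unknown`, which can be coerced to a string.
    pub fn is_string_coercible(self) -> bool {
        matches!(self, ByteStreamType::String | ByteStreamType::Unknown)
    }
}
```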
# Description
Makes the `from json --objects` command produce a stream, and read
lazily from an input stream to produce its output.
Also added a helper, `PipelineData::get_type()`, to make it easier to
construct a wrong type error message when matching on `PipelineData`. I
expect checking `PipelineData` for either a string value or an `Unknown`
or `String` typed `ByteStream` will be very, very common. I would have
liked to have a helper that just returns a readable stream from either,
but that would either be a bespoke enum or a `Box<dyn BufRead>`, which
feels like it wouldn't be so great for performance. So instead, taking
the approach I did here is probably better - having a function that
accepts the `impl BufRead` and matching to use it.
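A minimal sketch of that "accept an `impl BufRead` and let the caller adapt" approach (function and names here are illustrative, not the actual `from json` implementation):
```rust
use std::io::{BufRead, Cursor};

/// A function taking `impl BufRead` serves both cases: the caller matches on
/// the pipeline input and passes in whichever reader it has, without boxing.
fn count_json_lines(reader: impl BufRead) -> usize {
    reader.lines().map_while(Result::ok).count()
}

fn main() {
    // A string value wrapped in a Cursor...
    let from_value = Cursor::new("{\"a\":1}\n{\"a\":2}\n");
    // ...or a byte stream reader would both be accepted here.
    println!("{}", count_json_lines(from_value));
}
```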
# User-Facing Changes
- `from json --objects` no longer collects its input, and can be used
for large datasets or streams that produce values over time.
# Tests + Formatting
All passing.
# After Submitting
- [ ] release notes
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
As discussed in https://github.com/nushell/nushell/pull/12749, we no
longer need to call `std::env::set_current_dir()` to sync `$env.PWD`
with the actual working directory. This PR removes the call from
`EngineState::merge_env()`.
# Description
1. With the `-l` flag, `debug profile` now collects files and line
numbers of profiled pipeline elements
![profiler_lines](https://github.com/nushell/nushell/assets/25571562/b400a956-d958-4aff-aa4c-7e65da3f78fa)
2. Errors from the profiled closure will be reported instead of silently
ignored.
![profiler_lines_error](https://github.com/nushell/nushell/assets/25571562/54f7ad7a-06a3-4d56-92c2-c3466917bee8)
# User-Facing Changes
New `--lines(-l)` flag to `debug profile`. The command will also fail if
the profiled closure fails, so technically it is a breaking change.
# Tests + Formatting
# After Submitting
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
- **Remove unused `pathdiff` dep in `nu-cli`**
- **Remove unused `serde_json` dep on `nu-protocol`**
- Unnecessary after moving the plugin file to msgpack (still a
dev-dependency)
# Description
Removes the old `nu-cmd-dataframe` crate in favor of the polars plugin.
As such, this PR also removes the `dataframe` feature, related CI, and
full releases of nushell.
# Description
This PR allows byte streams to optionally be colored as being
specifically binary or string data, which guarantees that they'll be
converted to `Binary` or `String` appropriately on `into_value()`,
making them compatible with `Type` guarantees. This makes them
significantly more broadly usable for command input and output.
There is still an `Unknown` type for byte streams coming from external
commands, which uses the same behavior as we previously did where it's a
string if it's UTF-8.
A small number of commands were updated to take advantage of this, just
to prove the point. I will be adding more after this merges.
# User-Facing Changes
- New types in `describe`: `string (stream)`, `binary (stream)`
- These commands now return a stream if their input was a stream:
- `into binary`
- `into string`
- `bytes collect`
- `str join`
- `first` (binary)
- `last` (binary)
- `take` (binary)
- `skip` (binary)
- Streams that are explicitly binary colored will print as a streaming
hexdump
- example:
```nushell
1.. | each { into binary } | bytes collect
```
# Tests + Formatting
I've added some tests to cover it at a basic level, and it doesn't break
anything existing, but I do think more would be nice. Some of those will
come when I modify more commands to stream.
# After Submitting
There are a few things I'm not quite satisfied with:
- **String trimming behavior.** We automatically trim newlines from
streams from external commands, but I don't think we should do this with
internal commands. If I call a command that happens to turn my string
into a stream, I don't want the newline to suddenly disappear. I changed
this to specifically do it only on `Child` and `File`, but I don't know
if this is quite right, and maybe we should bring back the old flag for
`trim_end_newline`
- **Known binary always resulting in a hexdump.** It would be nice to
have a `print --raw`, so that we can put binary data on stdout
explicitly if we want to. This PR doesn't change how external commands
work though - they still dump straight to stdout.
Otherwise, here's the normal checklist:
- [ ] release notes
- [ ] docs update for plugin protocol changes (added `type` field)
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
Changes `get_full_help` to take a `&dyn Command` instead of multiple
arguments (`&Signature`, `&Examples` `is_parser_keyword`). All of these
arguments can be gathered from a `Command`, so there is no need to pass
the pieces to `get_full_help`.
This PR also fixes an issue where the search terms are not shown if
`--help` is used on a command.
# Description
Kind of a vague title, but this PR does two main things:
1. Rather than overriding functions like `Command::is_parser_keyword`,
this PR instead changes commands to override `Command::command_type`.
The `CommandType` returned by `Command::command_type` is then used to
automatically determine whether `Command::is_parser_keyword` and the
other `is_{type}` functions should return true. These changes allow us
to remove the `CommandType::Other` case and should also guarantee that
only one of the `is_{type}` functions on `Command` will return true.
2. Uses the new, reworked `Command::command_type` function in the `scope
commands` and `which` commands.
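A toy sketch of point 1 above (the variant and method names are illustrative, not the exact nu-protocol definitions):
```rust
#[derive(Clone, Copy, PartialEq)]
enum CommandType {
    Builtin,
    Custom,
    Keyword,
    External,
    Alias,
    Plugin,
}

trait Command {
    fn command_type(&self) -> CommandType {
        CommandType::Builtin
    }

    // At most one of these returns true, since they all read the same source.
    fn is_parser_keyword(&self) -> bool {
        self.command_type() == CommandType::Keyword
    }
    fn is_plugin(&self) -> bool {
        self.command_type() == CommandType::Plugin
    }
    fn is_alias(&self) -> bool {
        self.command_type() == CommandType::Alias
    }
}
```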
# User-Facing Changes
- Breaking change for `scope commands`: multiple columns (`is_builtin`,
`is_keyword`, `is_plugin`, etc.) have been merged into the `type`
column.
- Breaking change: the `which` command can now report `plugin` or
`keyword` instead of `built-in` in the `type` column. It may also now
report `external` instead of `custom` in the `type` column for known
`extern`s.
# Description
This PR makes some commands and areas of code preserve pipeline
metadata. This is in an attempt to make the issue described in #12599
and #9456 less likely to occur; that is, the case where reading from and
writing to the same file in a pipeline results in an empty file. Since we preserve
metadata in more places now, there will be a higher chance that we
successfully detect this error case and abort the pipeline.
# Description
Forgot that I fixed this already on my branch, but when printing without
a display output hook, the implicit call to `table` gets its output
mangled with newlines (since #12774). This happens when running `nu -c`
or a script file.
Here's that fix in one PR so it can be merged easily.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# Description
This PR adds a few functions to `Span` for merging spans together:
- `Span::append`: merges two spans that are known to be in order.
- `Span::concat`: returns a span that encompasses all the spans in a
slice. The spans must be in order.
- `Span::merge`: merges two spans (no order necessary).
- `Span::merge_many`: merges an iterator of spans into a single span (no
order necessary).
These are meant to replace the free-standing `nu_protocol::span`
function.
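A stand-alone toy version of two of these helpers, just to illustrate the intended semantics (the real methods are on `nu_protocol::Span`):
```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Span {
    start: usize,
    end: usize,
}

impl Span {
    /// Merge two spans that are known to be in order.
    fn append(self, other: Span) -> Span {
        debug_assert!(self.start <= other.start && self.end <= other.end);
        Span { start: self.start, end: other.end }
    }

    /// Merge two spans in either order.
    fn merge(self, other: Span) -> Span {
        Span {
            start: self.start.min(other.start),
            end: self.end.max(other.end),
        }
    }
}
```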
The spans in a `LiteCommand` (the `parts`) should always be in order
based on the lite parser and lexer. So, the parser code sees the most
usage of `Span::append` and `Span::concat` where the order is known. In
other code areas, `Span::merge` and `Span::merge_many` are used since
the order between spans is often not known.
# Description
This PR introduces a `ByteStream` type which is a `Read`-able stream of
bytes. Internally, it has an enum over three different byte stream
sources:
```rust
pub enum ByteStreamSource {
Read(Box<dyn Read + Send + 'static>),
File(File),
Child(ChildProcess),
}
```
This is in comparison to the current `RawStream` type, which is an
`Iterator<Item = Vec<u8>>` and has to allocate for each read chunk.
Currently, `PipelineData::ExternalStream` serves a weird dual role where
it is either external command output or a wrapper around `RawStream`.
`ByteStream` makes this distinction more clear (via `ByteStreamSource`)
and replaces `PipelineData::ExternalStream` in this PR:
```rust
pub enum PipelineData {
Empty,
Value(Value, Option<PipelineMetadata>),
ListStream(ListStream, Option<PipelineMetadata>),
ByteStream(ByteStream, Option<PipelineMetadata>),
}
```
The PR is relatively large, but a decent amount of it is just repetitive
changes.
This PR fixes #7017, fixes #10763, and fixes #12369.
This PR also improves performance when piping external commands. Nushell
should, in most cases, have competitive pipeline throughput compared to,
e.g., bash.
| Command | Before (MB/s) | After (MB/s) | Bash (MB/s) |
| --- | ---: | ---: | ---: |
| `throughput \| rg 'x'` | 3059 | 3744 | 3739 |
| `throughput \| nu --testbin relay o> /dev/null` | 3508 | 8087 | 8136 |
# User-Facing Changes
- This is a breaking change for the plugin communication protocol,
because the `ExternalStreamInfo` was replaced with `ByteStreamInfo`.
Plugins now only have to deal with a single input stream, as opposed to
the previous three streams: stdout, stderr, and exit code.
- The output of `describe` has been changed for external/byte streams.
- Temporary breaking change: `bytes starts-with` no longer works with
byte streams. This is to keep the PR smaller, and `bytes ends-with`
already does not work on byte streams.
- If a process core dumped, then instead of having a `Value::Error` in
the `exit_code` column of the output returned from `complete`, it now is
a `Value::Int` with the negation of the signal number.
# After Submitting
- Update docs and book as necessary
- Release notes (e.g., plugin protocol changes)
- Adapt/convert commands to work with byte streams (high priority is
`str length`, `bytes starts-with`, and maybe `bytes ends-with`).
- Refactor the `tee` code, Devyn has already done some work on this.
---------
Co-authored-by: Devyn Cairns <devyn.cairns@gmail.com>
# Description
In order for `Stack::unwrap_unique` to work as intended, we currently
manually track all references to the parent stack and ensure that they
are cleared before calling `Stack::unwrap_unique` in the REPL. We also
only call `Stack::unwrap_unique` after all code from the current REPL
entry has finished executing. Since `Value`s cannot store `Stack`
references, then this should have worked in theory. However, we forgot
to account for threads. `run-external` (and maybe the plugin writers)
can spawn threads that clone the `Stack`, holding on to references of
the parent stack. These threads are not waited/joined upon, and so may
finish after the eval has already returned. This PR removes the
`Stack::unwrap_unique` function and associated debug assert that was
[causing
panics](https://gist.github.com/cablehead/f3d2608a1629e607c2d75290829354f7)
like @cablehead found.
# After Submitting
Make values cheaper to clone as a more robust solution to the
performance issues with cloning the stack.
---------
Co-authored-by: Wind <WindSoilder@outlook.com>
# Description
Following from #12867, this PR replaces usages of `Call::positional_nth`
with existing spans. This removes several `expect`s from the code.
Also removes the unused `positional_nth_mut` and `positional_iter_mut`.
# Description
In this PR I added two new methods to `Stack`, `stdout_file` and
`stderr_file`. These two modify the inner `StackOutDest` and set a
`File` into the `stdout` and `stderr` respectively. Different to the
`push_redirection` methods, these do not require holding a guard the
whole time, but they do require ownership of the stack.
This is primarily useful for applications that use `nu` as a language but
not as `nushell` itself.
This PR replaces my first attempt #12851 to add a way to capture
stdout/-err of external commands. Capturing the stdout without having to
write into a file is possible with crates like
[`os_pipe`](https://docs.rs/os_pipe), an example for this is given in
the doc comment of the `stdout_file` method and can be executed as a
doctest (although it doesn't validate that you actually got any data).
This implementation takes `File` as input to make it easier to implement
on different operating systems without having to worry about
`OwnedHandle` or `OwnedFd`. Also, this doesn't expose any use of `os_pipe`,
so its types don't leak into this API and the API doesn't depend on it.
As in my previous attempt, @IanManske guided me here.
# User-Facing Changes
This change has no effect on `nushell` and therefore no user-facing
changes.
# Tests + Formatting
This only exposes a new way of using already existing code and has
therefore no further testing. The doctest succeeds on my machine at
least (x86 Windows, 64 Bit).
# After Submitting
All the required documentation is already part of this PR.
This PR has two parts. The first part is the addition of the
`Stack::set_pwd()` API. It strips trailing slashes from paths for
convenience, but will reject otherwise bad paths, leaving PWD in a good
state. This should reduce the impact of faulty code incorrectly trying
to set PWD.
(https://github.com/nushell/nushell/pull/12760#issuecomment-2095393012)
The second part is implementing a PWD recovery mechanism. PWD can become
bad even when we did nothing wrong. For example, Unix allows you to
remove any directory when another process might still be using it, which
means PWD can just "disappear" under our nose. This PR makes it possible
to use `cd` to reset PWD into a good state. Here's a demonstration:
```sh
mkdir /tmp/foo
cd /tmp/foo
# delete "/tmp/foo" in a subshell, because Nushell is smart and refuse to delete PWD
nu -c 'cd /; rm -r /tmp/foo'
ls # Error: × $env.PWD points to a non-existent directory
# help: Use `cd` to reset $env.PWD into a good state
cd /
pwd # prints /
```
Also, auto-cd should be working again.
# Description
Refactors the code in `nu-cli`, `main.rs`, `run.rs`, and few others.
Namely, I added `EngineState::generate_nu_constant` function to
eliminate some duplicate code. Otherwise, I changed a bunch of areas to
return errors instead of calling `std::process::exit`.
# User-Facing Changes
Should be none.
# Description
Changes the iterator in `rm` to be an iterator over
`Result<Option<String>, ShellError>` (an optional message or error)
instead of an iterator over `Value`. Then, the iterator is consumed and
each message is printed. This allows the
`PipelineData::print_not_formatted` method to be removed.
# Description
On 64-bit platforms the current size of `Value` is 56 bytes. The
limiting variants were `Closure` and `Range`. Boxing the two reduces the
size of Value to 48 bytes. This is the minimal size possible with our
current 16-byte `Span` and any 24-byte `Vec` container which we use in
several variants. (Note the extra full 8-bytes necessary for the
discriminant or other smaller values due to the 8-byte alignment of
`usize`)
This leads to a size reduction of ~15% for `Value` and should overall
be beneficial as both `Range` and `Closure` are rarely used compared to
the primitive types or even our general container types.
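The technique is the usual "box the large variants" trick; a generic, self-contained illustration (toy types, not the actual `Value` definition):
```rust
use std::mem::size_of;

struct Big([u64; 5]); // a 40-byte payload, standing in for a large variant

enum Unboxed {
    Small(u8),
    Large(Big),
}

enum Boxed {
    Small(u8),
    Large(Box<Big>), // boxing moves the payload behind a pointer
}

fn main() {
    // On a typical 64-bit target this prints 48 and 16: the enum only needs
    // to be as large as its biggest inline variant plus the discriminant.
    println!("unboxed: {} bytes", size_of::<Unboxed>());
    println!("boxed:   {} bytes", size_of::<Boxed>());
}
```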
# User-Facing Changes
Less memory used, potential runtime benefits.
(Too late in the evening to run the benchmarks myself right now)
Refer to #12603 for part 1.
We need to be careful when migrating to the new API, because the new API
has slightly different semantics (PWD can contain symlinks). This PR
handles the "obviously safe" part of the migrations. Namely, it handles
two specific use cases:
* Passing PWD into `canonicalize_with()`
* Passing PWD into `EngineState::merge_env()`
The first case is safe because symlinks are canonicalized away. The
second case is safe because `EngineState::merge_env()` only uses PWD to
call `std::env::set_current_dir()`, which shouldn't affect Nushell. The
commit message contains detailed stats on the updated files.
Because these migrations touch a lot of files, I want to keep these PRs
small to avoid merge conflicts.
# Description
Does some misc changes to `ListStream`:
- Moves it into its own module/file separate from `RawStream`.
- `ListStream`s now have an associated `Span`.
- This required changes to `ListStreamInfo` in `nu-plugin`. Not sure if
this is a breaking change for the plugin protocol.
- Hides the internals of `ListStream` but also adds a few more methods.
- This includes two functions to more easily alter a stream (these take
a `ListStream` and return a `ListStream` instead of having to go through
the whole `into_pipeline_data(..)` route).
- `map`: takes a `FnMut(Value) -> Value`
- `modify`: takes a function to modify the inner stream.
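A toy, self-contained sketch of the two helpers listed above (the real `ListStream` and `Value` live in `nu-protocol`; the signatures here are illustrative):
```rust
struct Value(i64); // placeholder for nu_protocol::Value

struct ListStream {
    stream: Box<dyn Iterator<Item = Value> + Send + 'static>,
}

impl ListStream {
    /// Map a function over each value, keeping the result a ListStream.
    fn map(self, f: impl FnMut(Value) -> Value + Send + 'static) -> ListStream {
        ListStream { stream: Box::new(self.stream.map(f)) }
    }

    /// Hand the inner iterator to a function and rewrap whatever it returns.
    fn modify<I>(
        self,
        f: impl FnOnce(Box<dyn Iterator<Item = Value> + Send + 'static>) -> I,
    ) -> ListStream
    where
        I: Iterator<Item = Value> + Send + 'static,
    {
        ListStream { stream: Box::new(f(self.stream)) }
    }
}
```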
PR https://github.com/nushell/nushell/pull/12603 made it so that PWD can
never contain a trailing slash. However, a root path (such as `/` or
`C:\`) technically counts as "having a trailing slash", so now `cd /`
doesn't work.
I feel dumb for missing such an obvious edge case. Let's just merge this
quickly before anyone else finds out...
EDIT: It appears I'm too late.
This is the first PR towards migrating to a new `$env.PWD` API that
returns potentially un-canonicalized paths. Refer to PR #12515 for
motivations.
## New API: `EngineState::cwd()`
The goal of the new API is to cover both parse-time and runtime use
case, and avoid unintentional misuse. It takes an `Option<Stack>` as
argument, which if supplied, will search for `$env.PWD` on the stack in
additional to the engine state. I think with this design, there's less
confusion over parse-time and runtime environments. If you have access
to a stack, just supply it; otherwise supply `None`.
## Deprecation of other PWD-related APIs
Other APIs are re-implemented using `EngineState::cwd()` and properly
documented. They're marked deprecated, but their behavior is unchanged.
Unused APIs are deleted, and code that accesses `$env.PWD` directly
without using an API is rewritten.
Deprecated APIs:
* `EngineState::current_work_dir()`
* `StateWorkingSet::get_cwd()`
* `env::current_dir()`
* `env::current_dir_str()`
* `env::current_dir_const()`
* `env::current_dir_str_const()`
Other changes:
* `EngineState::get_cwd()` (deleted)
* `StateWorkingSet::list_env()` (deleted)
* `repl::do_run_cmd()` (rewritten with `env::current_dir_str()`)
## `cd` and `pwd` now use logical paths by default
This pulls the changes from PR #12515. It's currently somewhat broken
because using non-canonicalized paths exposed a bug in our path
normalization logic (Issue #12602). Once that is fixed, this should
work.
## Future plans
This PR needs some tests. Which test helpers should I use, and where
should I put those tests?
I noticed that unquoted paths are expanded within `eval_filepath()` and
`eval_directory()` before they even reach the `cd` command. This means
every path is expanded twice. Is this intended?
Once this PR lands, the plan is to review all usages of the deprecated
APIs and migrate them to `EngineState::cwd()`. In the meantime, these
usages are annotated with `#[allow(deprecated)]` to avoid breaking CI.
---------
Co-authored-by: Jakub Žádník <kubouch@gmail.com>
# Description
Removes lazy records from the language, following from the reasons
outlined in #12622. Namely, this should make semantics more clear and
will eliminate concerns regarding maintainability.
# User-Facing Changes
- Breaking change: `lazy make` is removed.
- Breaking change: `describe --collect-lazyrecords` flag is removed.
- `sys` and `debug info` now return regular records.
# After Submitting
- Update nushell book if necessary.
- Explore new `sys` and `debug info` APIs to prevent them from taking
too long (e.g., subcommands or taking an optional column/cell-path
argument).
# Description
This PR overhauls the shell_integration system by allowing individual
control over which ansi escape sequences are used. As we continue to
broaden our support for more ansi escape sequences, we can't really have
an all-or-nothing strategy. Some ansi escapes cause problems in certain
operating systems or terminals. We should allow the user to choose which
escapes they want.
TODO:
* Gather feedback
* Should osc7, osc9_9 and osc633p be mutually exclusive?
* Is the naming convention for these settings (osc2, osc7, etc.) too nerdy?
closes #11301
# User-Facing Changes
shell_integration is no longer a boolean value. This is what is
supported in the default_config.nu
```nushell
shell_integration: {
# osc2 abbreviates the path if in the home_dir, sets the tab/window title, shows the running command in the tab/window title
osc2: true
# osc7 is a way to communicate the path to the terminal, this is helpful for spawning new tabs in the same directory
osc7: true
# osc8 is also implemented as the deprecated setting ls.show_clickable_links, it shows clickable links in ls output if your terminal supports it
osc8: true
# osc9_9 is from ConEmu and is starting to get wider support. It's similar to osc7 in that it communicates the path to the terminal
osc9_9: false
# osc133 is several escapes invented by Final Term which include the supported ones below.
# 133;A - Mark prompt start
# 133;B - Mark prompt end
# 133;C - Mark pre-execution
# 133;D;exit - Mark execution finished with exit code
# This is used to enable terminals to know where the prompt is, the command is, where the command finishes, and where the output of the command is
osc133: true
# osc633 is closely related to osc133 but only exists in visual studio code (vscode) and supports their shell integration features
# 633;A - Mark prompt start
# 633;B - Mark prompt end
# 633;C - Mark pre-execution
# 633;D;exit - Mark execution finished with exit code
# 633;E - NOT IMPLEMENTED - Explicitly set the command line with an optional nonce
# 633;P;Cwd=<path> - Mark the current working directory and communicate it to the terminal
# and also helps with the run recent menu in vscode
osc633: true
# reset_application_mode is escape \x1b[?1l and was added to help ssh work better
reset_application_mode: true
}
```
# Tests + Formatting
# After Submitting
# Description
This PR adds raw string support by using `r#` at the beginning of single
quoted strings and `#` at the end.
Notice that escapes are not processed: even within single quotes, parentheses
don't mean anything and `$variables` don't mean anything. It's just a string.
```nushell
❯ echo r#'one\ntwo (blah) ($var)'#
one\ntwo (blah) ($var)
```
Notice how they work without `echo` or `print` and how they work without
carriage returns.
```nushell
❯ r#'adsfa'#
adsfa
❯ r##"asdfa'@qpejq'##
asdfa'@qpejq
❯ r#'asdfasdfasf
∙ foqwejfqo@'23rfjqf'#
```
They also have a special configurable color in the REPL (use single quotes,
though).
![image](https://github.com/nushell/nushell/assets/343840/8780e21d-de4c-45b3-9880-2425f5fe10ef)
They should work like rust raw literals and allow `r##`, `r###`,
`r####`, etc, to help with having one or many `#`'s in the middle of
your raw-string.
They should work with `let` as well.
```nushell
r#'some\nraw\nstring'# | str upcase
```
closes https://github.com/nushell/nushell/issues/5091
# User-Facing Changes
# Tests + Formatting
# After Submitting
---------
Co-authored-by: WindSoilder <WindSoilder@outlook.com>
Co-authored-by: Ian Manske <ian.manske@pm.me>
This PR changes `$env` to be **case-preserving** instead of
case-sensitive. That is, it preserves the case of the environment
variable when it is first assigned, but subsequent retrieval and update
ignores the case.
Notably, both `$env.PATH` and `$env.Path` can now be used to read or set
the environment variable, but child processes will always see the
correct case based on the platform.
Fixes #11268.
---
This feature was surprisingly simple to implement, because most of the
infrastructure to support case-insensitive cell path access already
exists. The `get` command extracts data using a cell path in a
case-insensitive way (!), but accepts a `--sensitive` flag. (I think
this should be flipped around?)
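For illustration, here is a minimal Rust sketch of the case-preserving,
case-insensitive lookup idea, using a hypothetical `EnvMap` type rather than
Nushell's actual environment code:
```rust
use std::collections::HashMap;

/// Hypothetical case-preserving environment map: keys keep the case they were
/// first inserted with, but lookups and updates ignore case.
struct EnvMap {
    // folded (lowercased) key -> (original key, value)
    inner: HashMap<String, (String, String)>,
}

impl EnvMap {
    fn new() -> Self {
        Self { inner: HashMap::new() }
    }

    fn set(&mut self, key: &str, value: &str) {
        let folded = key.to_lowercase();
        match self.inner.get_mut(&folded) {
            // Subsequent updates keep the original case of the key.
            Some((_original, v)) => *v = value.to_string(),
            None => {
                self.inner.insert(folded, (key.to_string(), value.to_string()));
            }
        }
    }

    fn get(&self, key: &str) -> Option<&str> {
        self.inner.get(&key.to_lowercase()).map(|(_, v)| v.as_str())
    }
}

fn main() {
    let mut env = EnvMap::new();
    env.set("Path", "/usr/bin");
    env.set("PATH", "/usr/local/bin:/usr/bin"); // case-insensitive update
    assert_eq!(env.get("path"), Some("/usr/local/bin:/usr/bin"));
    // Child processes would still see the original key, "Path".
    assert_eq!(env.inner.values().next().unwrap().0, "Path");
}
```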
# Description
I found a bunch of issues relating to the specialized reimplementation
of `print()` that's done in `nu-cli` and it just didn't seem necessary.
So I tried to unify the behavior reasonably. `PipelineData::print()`
already handles the call to `table` and it even has a `no_newline`
option.
One of the most major issues before was that we were using the value
iterator, and then converting to string, and then printing each with
newlines. This doesn't work well for an external stream, because its
iterator ends up creating `Value::binary()` with each buffer... so we
were doing lossy UTF-8 conversion on those and then printing them with
newlines, which was very weird:
![Screenshot_2024-04-26_02-02-29](https://github.com/nushell/nushell/assets/10729/131c2224-08ee-4582-8617-6ecbb3ce8da5)
You can see the random newline inserted in a break between buffers, but
this would be even worse if it were on a multibyte UTF-8 character. You
can produce this by writing a large amount of text to a text file, and
then doing `nu -c 'open file.txt'` - in my case I just wrote `^find .`;
it just has to be large enough to trigger a buffer break.
Using `print()` instead led to a new issue though, because it doesn't
abort on errors. This is so that certain commands can produce a stream
of errors and have those all printed. There are tests for e.g. `rm` that
depend on this behavior. I assume we want to keep that, so instead I
made my target `BufferedReader`, and had that fuse closed if an error
was encountered. I can't imagine we want to keep reading from a wrapped
I/O stream if an error occurs; more often than not the error isn't going
to magically resolve itself, it's not going to be a different error each
time, and it's just going to lead to an infinite stream of the same
error.
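For illustration, a minimal Rust sketch of the fuse-on-error idea, with
hypothetical names rather than the actual `BufferedReader` code:
```rust
use std::io::{self, Read};

/// Hypothetical adapter around a reader-backed iterator: once an I/O error is
/// yielded, the iterator "fuses closed" and never yields again, instead of
/// retrying the underlying reader forever.
struct FusedOnError<R> {
    reader: R,
    done: bool,
}

impl<R: Read> Iterator for FusedOnError<R> {
    type Item = io::Result<Vec<u8>>;

    fn next(&mut self) -> Option<Self::Item> {
        if self.done {
            return None;
        }
        let mut buf = vec![0; 8192];
        match self.reader.read(&mut buf) {
            Ok(0) => {
                self.done = true; // end of stream
                None
            }
            Ok(n) => {
                buf.truncate(n);
                Some(Ok(buf))
            }
            Err(err) => {
                self.done = true; // fuse closed: report the error once
                Some(Err(err))
            }
        }
    }
}
```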
The test that broke without that was `open . | lines`, because `lines`
doesn't fuse closed on error. But I don't know if it's expected or not
for it to do that, so I didn't target that.
I think this PR makes things better but I'll keep looking for ways to
improve on how errors and streams interact, especially trying to
eliminate cases where infinite error loops can happen.
# User-Facing Changes
- **Breaking**: `BufferedReader` changes + no more public fields
- A raw I/O stream from e.g. `open` won't produce infinite errors
anymore, but I consider that to be a plus
- the implicit `print` on script output is the same as the normal one
now
# Tests + Formatting
Everything passes but I didn't add anything specific.
# Description
I thought about bringing `nu_plugin_msgpack` in, but that is MPL with a
clause that prevents other licenses, so rather than adapt that code I
decided to take a crack at just doing it straight from `rmp` to `Value`
without any `rmpv` in the middle. It seems like it's probably faster,
though I can't say for sure how much with the plugin overhead.
@IanManske I started on a `Read` implementation for `RawStream` but just
specialized to `from msgpack` here, but I'm thinking after release maybe
we can polish it up and make it a real one. It works!
# User-Facing Changes
New commands:
- `from msgpack`
- `from msgpackz`
- `to msgpack`
- `to msgpackz`
# Tests + Formatting
Pretty thorough tests added for the format deserialization, with a
roundtrip for serialization. Some example tests too for both `from
msgpack` and `to msgpack`.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] update release notes
# Description
This PR does miscellaneous cleanup in some of the commands from
`nu-cmd-lang`.
# User-Facing Changes
None.
# After Submitting
Cleanup the other commands in `nu-cmd-lang`.
# Description
So far this seems like the winner of my poll on what the name should be.
I'll take this off draft once the poll expires, if this is indeed the
winner.
# Description
Continuing from #12568, this PR further reduces the size of `Expr` from
64 to 40 bytes. It also reduces `Expression` from 128 to 96 bytes and
`Type` from 32 to 24 bytes.
This was accomplished by:
- for `Expr` with multiple fields (e.g., `Expr::Thing(A, B, C)`),
merging the fields into new AST struct types and then boxing this struct
(e.g. `Expr::Thing(Box<ABC>)`).
- replacing `Vec<T>` with `Box<[T]>` in multiple places. `Expr`s and
`Expression`s should rarely be mutated, if at all, so this optimization
makes sense.
By reducing the size of these types, I didn't notice a large performance
improvement (at least compared to #12568). But this PR does reduce the
memory usage of nushell. My config is somewhat light so I only noticed a
difference of 1.4MiB (38.9MiB vs 37.5MiB).
---------
Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
# Description
This allows the following commands to all accept a filename instead of a
plugin name:
- `plugin use`
- `plugin rm`
- `plugin stop`
Slightly complicated because of the need to also check against
`NU_PLUGIN_DIRS`, but I also fixed some issues with that at the same
time
Requested by @fdncred
# User-Facing Changes
The new commands are updated as described.
# Tests + Formatting
Tests for `NU_PLUGIN_DIRS` handling also made more robust.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Double check new docs to make sure they describe this capability
# Description
Adds a new keyword, `plugin use`. Unlike `register`, this merely loads
the signatures from the plugin cache file. The file is configurable with
the `--plugin-config` option either to `nu` or to `plugin use` itself,
just like the other `plugin` family of commands. At the REPL, one might
do this to replace `register`:
```nushell
> plugin add ~/.cargo/bin/nu_plugin_foo
> plugin use foo
```
This will not work in a script, because `plugin use` is a keyword and
`plugin add` does not evaluate at parse time (intentionally). This means
we no longer run random binaries during parse.
The `--plugins` option has been added to allow running `nu` with certain
plugins in one step. This is used especially for the `nu_with_plugins!`
test macro, but I'd imagine is generally useful. The only weird quirk is
that it has to be a list, and we don't really do this for any of our
other CLI args at the moment.
`register` now prints a deprecation parse warning.
This should fix #11923, as we now have a complete alternative to
`register`.
# User-Facing Changes
- Add `plugin use` command
- Deprecate `register`
- Add `--plugins` option to `nu` to replace a common use of `register`
# Tests + Formatting
I think I've tested it thoroughly enough and every existing test passes.
Testing nu CLI options and alternate config files is a little hairy and
I wish there were some more generic helpers for this, so this will go on
my TODO list for refactoring.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Update plugins sections of book
- [ ] Release notes
# Description
This adds an extension trait to `Result` that wraps errors in `Spanned`,
saving the effort of calling `.map_err(|err| err.into_spanned(span))`
every time. This will hopefully make it even more likely that someone
will want to use a spanned `io::Error` and make it easier to remove the
impl for `From<io::Error> for ShellError` because that doesn't have span
information.
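A rough Rust sketch of what such an extension trait can look like; the trait
and method names here (`ErrSpanExt`, `err_span`) are placeholders, not
necessarily the ones added by this PR:
```rust
#[derive(Debug, Clone, Copy)]
struct Span {
    start: usize,
    end: usize,
}

#[derive(Debug)]
struct Spanned<T> {
    item: T,
    span: Span,
}

trait ErrSpanExt<T, E> {
    /// Wrap the error side of a Result in `Spanned`, attaching the given span.
    fn err_span(self, span: Span) -> Result<T, Spanned<E>>;
}

impl<T, E> ErrSpanExt<T, E> for Result<T, E> {
    fn err_span(self, span: Span) -> Result<T, Spanned<E>> {
        self.map_err(|item| Spanned { item, span })
    }
}

fn read_config(span: Span) -> Result<String, Spanned<std::io::Error>> {
    // `.err_span(span)` replaces `.map_err(|err| err.into_spanned(span))`.
    std::fs::read_to_string("config.nu").err_span(span)
}
```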
# Description
- Plugin signatures are now saved to `plugin.msgpackz`, which is
brotli-compressed MessagePack.
- The file is updated incrementally, rather than writing all plugin
commands in the engine every time.
- The file always contains the result of the `Signature` call to the
plugin, even if commands were removed.
- Invalid data for a particular plugin just causes an error to be
reported, but the rest of the plugins can still be parsed
# User-Facing Changes
- The plugin file has a different filename, and it's not a nushell
script.
- The default `plugin.nu` file will be automatically migrated the first
time, but not other plugin config files.
- We don't currently provide any utilities that could help edit this
file, beyond `plugin add` and `plugin rm`
- `from msgpackz`, `to msgpackz` could also help
- New commands: `plugin add`, `plugin rm`
# Tests + Formatting
Tests added for the format and for the invalid handling.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Check for documentation changes
- [ ] Definitely needs release notes
# Description
`Value` describes the types of first-class values that users and scripts
can create, manipulate, pass around, and store. However, `Block`s are
not first-class values in the language, so this PR removes it from
`Value`. This removes some unnecessary code, and this change should be
invisible to the user except for the change to `scope modules` described
below.
# User-Facing Changes
Breaking change: the output of `scope modules` was changed so that
`env_block` is now `has_env_block` which is a boolean value instead of a
`Block`.
# After Submitting
Update the language guide possibly.
# Description
As suggested by @fdncred.
It's neat that this is possible, but the particularly useful part of
this is that we can actually
test it because it doesn't have any external dependencies, unlike the
python plugin.
Right now this just implements exactly the same behavior as the python
plugin, but we could have it
exercise a few more things.
Also fixes a couple of bugs:
- `.nu` plugins were not run with `nu --stdin`, so they couldn't take
input.
- `register` couldn't be called if `--no-config-file` was set, because
it would error on trying to
update the plugin file.
# User-Facing Changes
- `nu_plugin_nu_example` plugin added.
- `register` now works in `--no-config-file` mode.
# Tests + Formatting
Tests added for `nu_plugin_nu_example`.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Add the version bump to the release script just like for python
# Description
EngineState now tracks the script currently running, instead of the
parent directory of the script. This also provides an easy way to expose
the current running script to the user (Issue #12195).
Similarly, StateWorkingSet now tracks scripts instead of directories.
`parsed_module_files` and `currently_parsed_pwd` are merged into one
variable, `scripts`, which acts like a stack for tracking the current
running script (which is on the top of the stack).
Circular import check is added for `source` operations, in addition to
module import. A simple testcase is added for circular source.
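A simplified Rust sketch of the stack-based tracking and circular-source check
described above; names and error handling are assumptions, not the actual
`StateWorkingSet` code:
```rust
use std::path::{Path, PathBuf};

/// Hypothetical stack of scripts currently being parsed; pushing a file that
/// is already on the stack indicates a circular `source`/import.
#[derive(Default)]
struct ScriptStack {
    scripts: Vec<PathBuf>,
}

impl ScriptStack {
    fn enter(&mut self, file: &Path) -> Result<(), String> {
        if self.scripts.iter().any(|s| s.as_path() == file) {
            return Err(format!("circular source detected: {}", file.display()));
        }
        self.scripts.push(file.to_path_buf());
        Ok(())
    }

    /// The script on top of the stack is the one currently running.
    fn current(&self) -> Option<&Path> {
        self.scripts.last().map(|p| p.as_path())
    }

    fn leave(&mut self) {
        self.scripts.pop();
    }
}
```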
# User-Facing Changes
It shouldn't have any user-facing changes.
# Description
Adds a `Box` around the `ImportPattern` in `Expr` which decreases the
size of `Expr` from 152 to 64 bytes (and `Expression` from 216 to 128
bytes). This seems to speed up parsing a little bit according to the
benchmarks (main is top, PR is bottom):
```
benchmarks fastest │ slowest │ median │ mean │ samples │ iters
benchmarks fastest │ slowest │ median │ mean │ samples │ iters
├─ parser_benchmarks │ │ │ │ │
├─ parser_benchmarks │ │ │ │ │
│ ├─ parse_default_config_file 2.287 ms │ 4.532 ms │ 2.311 ms │ 2.437 ms │ 100 │ 100
│ ├─ parse_default_config_file 2.255 ms │ 2.781 ms │ 2.281 ms │ 2.312 ms │ 100 │ 100
│ ╰─ parse_default_env_file 421.8 µs │ 824.6 µs │ 494.3 µs │ 527.5 µs │ 100 │ 100
│ ╰─ parse_default_env_file 402 µs │ 486.6 µs │ 414.8 µs │ 416.2 µs │ 100 │ 100
```
# Description
This PR adds a `ListItem` enum to our set of AST types. It encodes the
two possible expressions inside of list expression: a singular item or a
spread. This is similar to the existing `RecordItem` enum. Adding
`ListItem` allows us to remove the existing `Expr::Spread` case which
was previously used for list spreads. As a consequence, this guarantees
(via the type system) that spreads can only ever occur inside lists,
records, or as command args.
This PR also does a little bit of cleanup in relevant parser code.
This is good practice, as all our iterators will never return a value after
reaching `None`. The benefit should be minimal, as only `Iterator::fuse` is
directly specialized and is itself rarely used (sometimes in `itertools`
adaptors). Thus it is mostly a documentation change.
# Description
Adds support for running plugins using local socket communication
instead of stdio. This will be an optional thing that not all plugins
have to support.
This frees up stdio for use to make plugins that use stdio to create
terminal UIs, cc @amtoine, @fdncred.
This uses the [`interprocess`](https://crates.io/crates/interprocess)
crate (298 stars, MIT license, actively maintained), which seems to be
the best option for cross-platform local socket support in Rust. On
Windows, a local socket name is provided. On Unixes, it's a path. The
socket name is kept to a relatively small size because some operating
systems have pretty strict limits on the whole path (~100 chars), so on
macOS for example we prefer `/tmp/nu.{pid}.{hash64}.sock` where the hash
includes the plugin filename and timestamp to be unique enough.
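For illustration only, a small Rust sketch of how such a short socket name
could be derived from the pid plus a hash of the plugin filename and a
timestamp (not the exact code used here):
```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::time::{SystemTime, UNIX_EPOCH};

/// Illustrative only: build a short, unique-enough local socket name from the
/// pid plus a 64-bit hash of the plugin filename and a timestamp, keeping the
/// whole path well under typical OS limits (~100 chars).
fn make_socket_name(plugin_filename: &str) -> String {
    let mut hasher = DefaultHasher::new();
    plugin_filename.hash(&mut hasher);
    SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap_or_default()
        .as_nanos()
        .hash(&mut hasher);
    let hash64 = hasher.finish();
    // On Unixes this would be a path like /tmp/nu.{pid}.{hash64}.sock; on
    // Windows it would be a named-pipe style name instead.
    format!("/tmp/nu.{}.{:016x}.sock", std::process::id(), hash64)
}
```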
This also adds an API for moving plugins in and out of the foreground
group, which is relevant for Unixes where direct terminal control
depends on that.
TODO:
- [x] Generate local socket path according to OS conventions
- [x] Add support for passing `--local-socket` to the plugin executable
instead of `--stdio`, and communicating over that instead
- [x] Test plugins that were broken, including
[amtoine/nu_plugin_explore](https://github.com/amtoine/nu_plugin_explore)
- [x] Automatically upgrade to using local sockets when supported,
falling back if it doesn't work, transparently to the user without any
visible error messages
- Added protocol feature: `LocalSocket`
- [x] Reset preferred mode to `None` on `register`
- [x] Allow plugins to detect whether they're running on a local socket
and can use stdio freely, so that TUI plugins can just produce an error
message otherwise
- Implemented via `EngineInterface::is_using_stdio()`
- [x] Clean up foreground state when plugin command exits on the engine
side too, not just whole plugin
- [x] Make sure tests for failure cases work as intended
- `nu_plugin_stress_internals` added
# User-Facing Changes
- TUI plugins work
- Non-Rust plugins could optionally choose to use this
- This might behave differently, so will need to test it carefully
across different operating systems
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Document local socket option in plugin contrib docs
- [ ] Document how to do a terminal UI plugin in plugin contrib docs
- [ ] Document: `EnterForeground` engine call
- [ ] Document: `LeaveForeground` engine call
- [ ] Document: `LocalSocket` protocol feature
# Description
This adds a `SharedCow` type as a transparent copy-on-write pointer that
clones to unique on mutate.
As an initial test, the `Record` within `Value::Record` is shared.
There are some pretty big wins for performance. I'll post benchmark
results in a comment. The biggest winner is nested access, as that would
have cloned the records for each cell path follow before and it doesn't
have to anymore.
The reusability of the `SharedCow` type is nice and I think it could be
used to clean up the previous work I did with `Arc` in `EngineState`.
It's meant to be a mostly transparent clone-on-write that just clones on
`.to_mut()` or `.into_owned()` if there are actually multiple
references, but avoids cloning if the reference is unique.
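A minimal Rust sketch of the idea, assuming a simplified `SharedCow` built
directly on `Arc::make_mut` (details of the real type will differ):
```rust
use std::sync::Arc;

/// Minimal sketch of a SharedCow-style wrapper: cloning is just an Arc clone,
/// and mutation clones the inner value only if the reference is shared.
#[derive(Clone)]
struct SharedCow<T: Clone>(Arc<T>);

impl<T: Clone> SharedCow<T> {
    fn new(value: T) -> Self {
        Self(Arc::new(value))
    }

    /// Clone-on-write: cheap if we hold the only reference.
    fn to_mut(&mut self) -> &mut T {
        Arc::make_mut(&mut self.0)
    }

    fn into_owned(self) -> T {
        Arc::try_unwrap(self.0).unwrap_or_else(|arc| (*arc).clone())
    }
}

fn main() {
    let a = SharedCow::new(vec![1, 2, 3]);
    let mut b = a.clone(); // no deep copy here
    b.to_mut().push(4); // inner Vec is cloned only now, because it's shared
    assert_eq!(a.0.len(), 3);
    assert_eq!(b.0.len(), 4);
}
```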
# User-Facing Changes
- `Value::Record` field is a different type (plugin authors)
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] use for `EngineState`
- [ ] use for `Value::List`
# Description
I spent a while trying to come up with a good name for what is currently
`IoStream`. Looking back, this name is not the best, because it:
1. Implies that it is a stream, when all it really does is specify
the output destination for a stream/pipeline.
2. Implies that it handles input and output, when it really only handles
output.
So, this PR renames `IoStream` to `OutDest` instead, which should be
more clear.
# Description
Fixes #7849 and #11465, based on @kubouch's suggestion in
https://github.com/nushell/nushell/issues/11465#issuecomment-1883847806.
# User-Facing Changes
Can source files relative to `env.nu` or `config.nu` like in #6150.
# Tests + Formatting
Adds test that previously failed.
# After Submitting
# Description
This decouples the serialized representation of `Record` from its
internal implementation. It now gets treated as a map type in `serde`.
This has several benefits:
- more efficient representation (not showing inner fields)
- human readable e.g. as a JSON record
- no breaking changes when refactoring the `Record` internals in the
future (see #12326, or potential introduction of `indexmap::IndexMap`
for large N)
- we now deny the creation of invalid records that a non-cooperating plugin
could produce
- guaranteed key-value correspondence
- checking for unique keys
# Breaking change to the plugin protocol:
Now expects a record/map directly as the `Record.val` field instead of a
serialization of it.
# Description
Currently, `Range` is a struct with a `from`, `to`, and `incr` field,
which are all type `Value`. This PR changes `Range` to be an enum over
`IntRange` and `FloatRange` for better type safety / stronger compile
time guarantees.
Fixes #11778, #11777, #11776, #11775, #11774, #11773, and #11769.
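For readers unfamiliar with the shape of such a change, a rough Rust sketch of
the enum split; the field layouts below are assumptions, not the exact
nu-protocol definitions:
```rust
use std::ops::Bound;

/// Sketch only: the actual nu-protocol types likely differ in details.
enum Range {
    IntRange(IntRange),
    FloatRange(FloatRange),
}

struct IntRange {
    start: i64,
    step: i64,
    end: Bound<i64>,
}

struct FloatRange {
    start: f64,
    step: f64,
    end: Bound<f64>,
}

impl Range {
    /// Stronger compile-time guarantees: no need to inspect `Value` variants
    /// at every step of iteration.
    fn is_int(&self) -> bool {
        matches!(self, Range::IntRange(_))
    }
}
```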
# User-Facing Changes
Hopefully none, besides bug fixes.
Although, the `serde` representation might have changed.
# Description
Requested by @ayax79. This makes the custom value behavior more correct,
by calling the methods on the plugin to handle the custom values in
examples rather than the methods on the custom values themselves. This
helps for handle-type custom values (like what he's doing with
dataframes).
- Equality checking in `PluginTest::test_examples()` changed to use
`PluginInterface::custom_value_partial_cmp()`
- Base value rendering for `PluginSignature` changed to use
`Plugin::custom_value_to_base_value()`
- Had to be moved closer to `serve_plugin` for this reason, so the test
for writing signatures containing custom values was removed
- That behavior should still be tested to some degree, since if custom
values are not handled, signatures will fail to parse, so all of the
other tests won't work.
# User-Facing Changes
- `Record::sort_cols()` method added to share functionality required by
`PartialCmp`, and it might also be slightly faster
- Otherwise, everything should mostly be the same but better. Plugins
that don't implement special handling for custom values will still work
the same way, because the default implementation is just a pass-through
to the `CustomValue` methods.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# Description
Fixes #12382, where overlay changes from hooks were not preserved into
the global state. This was due to creating child stacks for hooks, when
the global stack should have been used instead.
# Description
This keeps plugin custom values that have requested drop notification
around during the lifetime of a plugin call / stream by sending them to
a channel that gets persisted during the lifetime of the call.
Before this change, it was very likely that the drop notification would
be sent before the plugin ever had a chance to handle the value it
received.
Tests have been added to make sure this works - see the `custom_values`
plugin.
cc @ayax79
# User-Facing Changes
This is basically just a bugfix, just a slightly big one.
However, I did add an `as_mut_any()` function for custom values, to
avoid having to clone them. This is a breaking change.
- [x] `cargo hack` feature flag compatibility run
- [x] reedline released and pinned
- [x] `nu-plugin-test-support` added to release script
- [x] dependency tree checked
- [x] release notes
# Description
This shrinks `Record`'s size in half and allows you to include it in
`Value` without growing the size.
Changing the `Record` internals may give slightly different performance
characteristics, as the cache locality changes on lookups (if you directly
need the value, it should be closer, but in other cases it may blow the
cache-line budget). Creation is also expected to have different performance
characteristics; `Record::from_raw_cols_vals` is now probably worse.
## Benchmarking
Comparison with the main branch (boxed Record) revealed no significant
change to the creation but an improvement when accessing larger N.
The fact that this was more pronounced for nested access (still cloning
before nushell/nushell#12325) leads to the conclusion that this may
still be dominated by the smaller clone necessary for a 24-byte `Record`
over the previous 48 bytes.
# User-Facing Changes
Reduced memory usage
# Description
This changes the interface for plugins to always represent errors as
`LabeledError`s. This is good for altlang plugins, as it would suck for
them to have to implement and track `ShellError`. We save a lot of
generated code from the `ShellError` serde impl too, so `nu` and plugins
get to have a smaller binary size.
Reduces the release binary size by 1.2 MiB on my build configuration.
# User-Facing Changes
- Changes plugin protocol. `ShellError` no longer serialized.
- `ShellError` serialize output is different
- `ShellError` no longer deserializes to exactly the same value as
serialized
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Document in plugin protocol reference
Those allocations are all small and insignificant in the grand scheme of
things, and the optimizer may be able to resolve some of them, but it's better
to be nice anyway.
Primarily inspired by the new
[`clippy::assigning_clones`](https://rust-lang.github.io/rust-clippy/master/index.html#/assigning_clones)
- **Avoid reallocs with `clone_from` in `nu-parser`**
- **Avoid realloc on assignment in `Stack`**
- **Fix `clippy::assigning_clones` in `nu-cli`**
- **Reuse allocations in `nu-explore` if possible**
# Description
This clone is not necessary and tanks the performance of deep nested
access.
As soon as we found the value, we know we discard the old value, so can
`std::mem::take` the inner (`impl Default for Value` to the rescue)
We may be able to further optimize this but not having to clone the
value is vital.
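A small Rust sketch of the pattern on a simplified value type (not the actual
cell-path code):
```rust
/// Illustrative sketch: when we're about to overwrite a nested value anyway,
/// take it out of its slot instead of cloning it.
#[derive(Default, Debug)]
enum Value {
    #[default]
    Nothing,
    Int(i64),
    List(Vec<Value>),
}

fn replace_at(list: &mut Value, index: usize, new: Value) -> Option<Value> {
    if let Value::List(items) = list {
        if let Some(slot) = items.get_mut(index) {
            // `std::mem::take` leaves `Value::Nothing` behind (via Default)
            // and hands us the old value without a clone.
            let old = std::mem::take(slot);
            *slot = new;
            return Some(old);
        }
    }
    None
}
```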
# Description
Closes #12253.
Exposes the option as "recursion_limit" under config.
# User-Facing Changes
The config file now has a new option!
# After Submitting
Nothing else...? Do let me know if there's something I've missed!
# Description
The second `Value` is redundant and will consume five extra bytes on
each transmission of a custom value to/from a plugin.
# User-Facing Changes
This is a breaking change to the plugin protocol.
The [example in the protocol
reference](https://www.nushell.sh/contributor-book/plugin_protocol_reference.html#value)
becomes
```json
{
"Custom": {
"val": {
"type": "PluginCustomValue",
"name": "database",
"data": [36, 190, 127, 40, 12, 3, 46, 83],
"notify_on_drop": true
},
"span": {
"start": 320,
"end": 340
}
}
}
```
instead of
```json
{
"CustomValue": {
...
}
}
```
# After Submitting
Update plugin protocol reference
# Description
This is something that was discussed in the core team meeting last
Wednesday. @ayax79 is building `nu-plugin-polars` with all of the
dataframe commands into a plugin, and there are a lot of them, so it
would help to make the API more similar. At the same time, I think the
`Command` API is just better anyway. I don't think the difference is
justified, and the types for core commands have the benefit of requiring
less `.into()` because they often don't own their data
- Broke `signature()` up into `name()`, `usage()`, `extra_usage()`,
`search_terms()`, `examples()`
- `signature()` returns `nu_protocol::Signature`
- `examples()` returns `Vec<nu_protocol::Example>`
- `PluginSignature` and `PluginExample` no longer need to be used by
plugin developers
# User-Facing Changes
Breaking API for plugins yet again 😄
# Description
When implementing a `Command`, one must also import all the types
present in the function signatures for `Command`. This makes it so that
we often import the same set of types in each command implementation
file. E.g., something like this:
```rust
use nu_protocol::ast::Call;
use nu_protocol::engine::{Command, EngineState, Stack};
use nu_protocol::{
record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, PipelineData,
ShellError, Signature, Span, Type, Value,
};
```
This PR adds the `nu_engine::command_prelude` module which contains the
necessary and commonly used types to implement a `Command`:
```rust
// command_prelude.rs
pub use crate::CallExt;
pub use nu_protocol::{
ast::{Call, CellPath},
engine::{Command, EngineState, Stack},
record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, IntoSpanned,
PipelineData, Record, ShellError, Signature, Span, Spanned, SyntaxShape, Type, Value,
};
```
This should reduce the boilerplate needed to implement a command and
also gives us a place to track the breadth of the `Command` API. I tried
to be conservative with what went into the prelude modules, since it
might be hard/annoying to remove items from the prelude in the future.
Let me know if something should be included or excluded.
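Downstream, a command implementation file can then replace the long import
list with a single glob import of the prelude shown above:
```rust
// In a command implementation file, one glob import replaces the long list of
// individual nu_protocol imports shown earlier.
use nu_engine::command_prelude::*;
```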
# Description
Boxes `Record` inside `Value` to reduce memory usage, `Value` goes from
`72` -> `56` bytes after this change.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Fixes #11887 and #11626.
This PR unifies the tilde-expansion behavior across several filesystem-related
commands. It follows the same rule as glob expansion:
| command | result |
| ----------- | ------ |
| `ls ~/aaa` | expand tilde |
| `ls "~/aaa"` | don't expand tilde |
| `let f = "~/aaa"; ls $f` | don't expand tilde; if you want to, use `ls ($f \| path expand)` |
| `let f: glob = "~/aaa"; ls $f` | expand tilde (it doesn't expand for the `mkdir` and `touch` commands) |
Actually, I'm not sure about the 4th case; it currently expands only because
it follows the same rule as glob expansion.
### About the change
It changes `expand_path_with` to accept a new argument called
`expand_tilde`, if it's true, expand it, if not, just keep it as `~`
itself.
# User-Facing Changes
After this change, `ls "~/aaa"` won't expand tilde.
# Tests + Formatting
Done
# Description
This commit fills in the completion item kind into the
`textDocument/completion` response so that LSP client can present more
information to the user.
It is an improvement in the context of #10794
# User-Facing Changes
Improved information display in editor's intelli-sense menu
![output](https://github.com/nushell/nushell/assets/16558417/991dc0a9-45d1-4718-8f22-29002d687b93)
# Description
Adds a `nu-plugin-test-support` crate with an interface that supports
testing plugins.
Unlike in reality, these plugins run in the same process on separate
threads. This will allow
testing aspects of the plugin internal state and handling serialized
plugin custom values easily.
We still serialize their custom values and all of the engine to plugin
logic is still in play, so
from a logical perspective this should still expose any bugs that would
have been caused by that.
The only difference is that it doesn't run in a different process, and
doesn't try to serialize
everything to the final wire format for stdin/stdout.
TODO still:
- [x] Clean up warnings about private types exposed in trait definition
- [x] Automatically deserialize plugin custom values in the result so
they can be inspected
- [x] Automatic plugin examples test function
- [x] Write a bit more documentation
- [x] More tests
- [x] Add MIT License file to new crate
# User-Facing Changes
Plugin developers get a nice way to test their plugins.
# Tests + Formatting
Run the tests with `cargo test -p nu-plugin-test-support --
--show-output` to see some examples of what the failing test output for
examples can look like. I used the `difference` crate (MIT licensed) to
make it look nice.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Add a section to the book about testing
- [ ] Test some of the example plugins this way
- [ ] Add example tests to nu_plugin_template so plugin developers have
something to start with
# Description
Just a bunch of miscellaneous fixes to the Rust documentation that I
found recently while doing
a pass on some things.
# User-Facing Changes
None
# Description
With the release of Rust 1.77.0 today we're able to bump the
rust-toolchain for nushell to 1.75.0.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
This makes `LabeledError` much more capable of representing close to
everything a `miette::Diagnostic` can, including `ShellError`, and
allows plugins to generate multiple error spans, codes, help, etc.
`LabeledError` is now embeddable within `ShellError` as a transparent
variant.
This could also be used to improve `error make` and `try/catch` to
reflect `LabeledError` exactly in the future.
Also cleaned up some errors in existing plugins.
# User-Facing Changes
Breaking change for plugins. Nicer errors for users.
# Description
@sholderbach left a very helpful review and this just implements the
suggestions he made.
Didn't notice any difference in performance, but there could potentially
be for a long running Nushell session or one that loads a lot of stuff.
I also caught a bug where nu-protocol won't build without `plugin`
because of the previous conditional import. Oops. Fixed.
# User-Facing Changes
`blocks` and `modules` type in `EngineState` changed again. Shouldn't
affect plugins or anything else though really
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
---------
Co-authored-by: sholderbach <sholderbach@users.noreply.github.com>
# Description
Get rid of two parallel `Vec`s in `StateDelta` and `EngineState`, that
also duplicated span information. Use a struct with documenting fields.
Also use `Arc<str>` and `Arc<[u8]>` for the allocations as they are
never modified and cloned often (see #12229 for the first improvement).
This also makes the representation more compact as no capacity is
necessary.
# User-Facing Changes
API breakage on `EngineState`/`StateWorkingSet`/`StateDelta` that should
not really affect plugin authors.
# Description
This makes many of the larger objects in `EngineState` into `Arc`, and
uses `Arc::make_mut` to do clone-on-write if the reference is not
unique. This is generally very cheap, giving us the best of both worlds
- allowing us to mutate without cloning if we have an exclusive
reference, and cloning if we don't.
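A small, standalone Rust example of the `Arc::make_mut` clone-on-write
behavior this relies on (not Nushell code):
```rust
use std::sync::Arc;

fn main() {
    let mut state = Arc::new(vec!["alias".to_string()]);
    let snapshot = Arc::clone(&state); // e.g. kept around for another thread or REPL loop

    // The reference is shared, so make_mut clones the Vec before mutating;
    // the snapshot is unaffected.
    Arc::make_mut(&mut state).push("def".to_string());
    assert_eq!(snapshot.len(), 1);
    assert_eq!(state.len(), 2);

    drop(snapshot);
    // Now the reference is unique, so this mutation happens in place,
    // without any clone.
    Arc::make_mut(&mut state).push("use".to_string());
    assert_eq!(state.len(), 3);
}
```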
This started as more of a curiosity for me after remembering that
`Arc::make_mut` exists and can make using `Arc` for mostly immutable
data that sometimes needs to be changed very convenient, and also after
hearing someone complain about memory usage on Discord - this is a
somewhat significant win for that.
The exact objects that were wrapped in `Arc`:
- `files`, `file_contents` - the strings and byte buffers
- `decls` - the whole `Vec`, but mostly to avoid lots of individual
`malloc()` calls on Clone rather than for memory usage
- `blocks` - the blocks themselves, rather than the outer Vec
- `modules` - the modules themselves, rather than the outer Vec
- `env_vars`, `previous_env_vars` - the entire maps
- `config`
The changes required were relatively minimal, but this is a breaking API
change. In particular, blocks are added as Arcs, to allow the parser
cache functionality to work.
With my normal nu config, running on Linux, this saves me about 15 MiB
of process memory usage when running interactively (65 MiB → 50 MiB).
This also makes quick command executions cheaper, particularly since
every REPL loop now involves a clone of the engine state so that we can
recover from a panic. It also reduces memory usage where engine state
needs to be cloned and sent to another thread or kept within an
iterator.
# User-Facing Changes
Shouldn't be any, since it's all internal stuff, but it does change some
public interfaces so it's a breaking change
[Context on
Discord](https://discord.com/channels/601130461678272522/855947301380947968/1219425984990806207)
# Description
- Rename `CustomValue::value_string()` to `type_name()` to reflect its
usage better.
- Change print behavior to always call `to_base_value()` first, to give
the custom value better control over the output.
- Change `describe --detailed` to show the type name as the subtype,
rather than trying to describe the base value.
- Change custom `Type` to use `type_name()` rather than `typetag_name()`
to make things like `PluginCustomValue` more transparent
One question: should `describe --detailed` still include a description
of the base value somewhere? I'm torn on it, it seems possibly useful
for some things (maybe sqlite databases?), but having `describe -d` not
include the custom type name anywhere felt weird. Another option would
be to add another method to `CustomValue` for info to be displayed in
`describe`, so that it can be more type-specific?
# User-Facing Changes
Everything above has implications for printing and `describe` on custom
values
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# Description
We do a lot of visiting contained values in the serialization / validity
functions within `PluginCustomValue` utils. This adds
`Value::recurse_mut()` which wraps up most of that logic into something
that can be reused.
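A simplified Rust sketch of what a `recurse_mut`-style helper does, over a toy
value type; the real `Value::recurse_mut()` signature may differ:
```rust
/// Visit every contained value mutably, applying `f` to each value before
/// descending into its children.
#[derive(Debug)]
enum Value {
    Int(i64),
    String(String),
    List(Vec<Value>),
    Record(Vec<(String, Value)>),
}

fn recurse_mut<E>(
    value: &mut Value,
    f: &mut impl FnMut(&mut Value) -> Result<(), E>,
) -> Result<(), E> {
    f(value)?;
    match value {
        Value::List(items) => {
            for item in items {
                recurse_mut(item, &mut *f)?;
            }
        }
        Value::Record(entries) => {
            for (_, val) in entries {
                recurse_mut(val, &mut *f)?;
            }
        }
        _ => {}
    }
    Ok(())
}
```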
# Description
Fixes #12193 where the `$in` value may be null for closures provided to
`insert`.
# User-Facing Changes
The `$in` value will now always be the same as the closure parameter for
`insert`.
[Context on
Discord](https://discord.com/channels/601130461678272522/855947301380947968/1216517833312309419)
# Description
This is a significant breaking change to the plugin API, but one I think
is worthwhile. @ayax79 mentioned on Discord that while trying to start
on a dataframes plugin, he was a little disappointed that more wasn't
provided in terms of code organization for commands, particularly since
there are *a lot* of `dfr` commands.
This change treats plugins more like miniatures of the engine, with
dispatch of the command name being handled inherently, each command
being its own type, and each having their own signature within the trait
impl for the command type rather than having to find a way to centralize
it all into one `Vec`.
For the example plugins that have multiple commands, I definitely like
how this looks a lot better. This encourages doing code organization the
right way and feels very good.
For the plugins that have only one command, it's just a little bit more
boilerplate - but still worth it, in my opinion.
The `Box<dyn PluginCommand<Plugin = Self>>` type in `commands()` is a
little bit hairy, particularly for Rust beginners, but ultimately not so
bad, and it gives the desired flexibility for shared state for a whole
plugin + the individual commands.
# User-Facing Changes
Pretty big breaking change to plugin API, but probably one that's worth
making.
```rust
use nu_plugin::*;
use nu_protocol::{PluginSignature, PipelineData, Type, Value};
struct LowercasePlugin;
struct Lowercase;
// Plugins can now have multiple commands
impl PluginCommand for Lowercase {
type Plugin = LowercasePlugin;
// The signature lives with the command
fn signature(&self) -> PluginSignature {
PluginSignature::build("lowercase")
.usage("Convert each string in a stream to lowercase")
.input_output_type(Type::List(Type::String.into()), Type::List(Type::String.into()))
}
// We also provide SimplePluginCommand which operates on Value like before
fn run(
&self,
plugin: &LowercasePlugin,
engine: &EngineInterface,
call: &EvaluatedCall,
input: PipelineData,
) -> Result<PipelineData, LabeledError> {
let span = call.head;
Ok(input.map(move |value| {
value.as_str()
.map(|string| Value::string(string.to_lowercase(), span))
// Errors in a stream should be returned as values.
.unwrap_or_else(|err| Value::error(err, span))
}, None)?)
}
}
// Plugin now just has a list of commands, and the custom value op stuff still goes here
impl Plugin for LowercasePlugin {
fn commands(&self) -> Vec<Box<dyn PluginCommand<Plugin=Self>>> {
vec![Box::new(Lowercase)]
}
}
fn main() {
serve_plugin(&LowercasePlugin{}, MsgPackSerializer)
}
```
Time this however you like - we're already breaking stuff for 0.92, so
it might be good to do it now, but if it feels like a lot all at once,
it could wait.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Update examples in the book
- [x] Fix #12088 to match - this change would actually simplify it a
lot, because the methods are currently just duplicated between `Plugin`
and `StreamingPlugin`, but they only need to be on `Plugin` with this
change
# Description
The PR overhauls how IO redirection is handled, allowing more explicit
and fine-grain control over `stdout` and `stderr` output as well as more
efficient IO and piping.
To summarize the changes in this PR:
- Added a new `IoStream` type to indicate the intended destination for a
pipeline element's `stdout` and `stderr`.
- The `stdout` and `stderr` `IoStream`s are stored in the `Stack` to avoid
adding six additional arguments to every eval function and `Command::run`.
The `stdout` and `stderr` streams can be temporarily overwritten through
functions on `Stack`, and these functions return a guard that restores the
original `stdout` and `stderr` when dropped (see the sketch after this list).
- In the AST, redirections are now directly part of a `PipelineElement`
as a `Option<Redirection>` field instead of having multiple different
`PipelineElement` enum variants for each kind of redirection. This
required changes to the parser, mainly in `lite_parser.rs`.
- `Command`s can also set a `IoStream` override/redirection which will
apply to the previous command in the pipeline. This is used, for
example, in `ignore` to allow the previous external command to have its
stdout redirected to `Stdio::null()` at spawn time. In contrast, the
current implementation has to create an os pipe and manually consume the
output on nushell's side. File and pipe redirections (`o>`, `e>`, `e>|`,
etc.) have precedence over overrides from commands.
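A hypothetical Rust sketch of the guard pattern mentioned in the list above;
these are illustrative types, not the real `Stack`/`IoStream` definitions:
```rust
/// Illustrative destination for a pipeline element's output.
#[derive(Clone)]
enum IoStream {
    Inherit,
    Null,
    File(String),
}

struct Stack {
    stdout: IoStream,
    stderr: IoStream,
}

/// Restores the previous stdout/stderr destinations when dropped.
struct RedirectGuard<'a> {
    stack: &'a mut Stack,
    saved_stdout: IoStream,
    saved_stderr: IoStream,
}

impl Stack {
    fn push_redirection(&mut self, stdout: IoStream, stderr: IoStream) -> RedirectGuard<'_> {
        let saved_stdout = std::mem::replace(&mut self.stdout, stdout);
        let saved_stderr = std::mem::replace(&mut self.stderr, stderr);
        RedirectGuard { stack: self, saved_stdout, saved_stderr }
    }
}

impl Drop for RedirectGuard<'_> {
    fn drop(&mut self) {
        // Put the original destinations back when the guard goes out of scope.
        self.stack.stdout = self.saved_stdout.clone();
        self.stack.stderr = self.saved_stderr.clone();
    }
}
```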
This PR improves piping and IO speed, partially addressing #10763. Using
the `throughput` command from that issue, this PR gives the following
speedup on my setup for the commands below:
| Command | Before (MB/s) | After (MB/s) | Bash (MB/s) |
| --------------------------- | -------------:| ------------:| -----------:|
| `throughput o> /dev/null` | 1169 | 52938 | 54305 |
| `throughput \| ignore` | 840 | 55438 | N/A |
| `throughput \| null` | Error | 53617 | N/A |
| `throughput \| rg 'x'` | 1165 | 3049 | 3736 |
| `(throughput) \| rg 'x'` | 810 | 3085 | 3815 |
(Numbers above are the median samples for throughput)
This PR also paves the way to refactor our `ExternalStream` handling in
the various commands. For example, this PR already fixes the following
code:
```nushell
^sh -c 'echo -n "hello "; sleep 0; echo "world"' | find "hello world"
```
This returns an empty list on 0.90.1 and returns a highlighted "hello
world" on this PR.
Since the `stdout` and `stderr` `IoStream`s are available to commands
when they are run, then this unlocks the potential for more convenient
behavior. E.g., the `find` command can disable its ansi highlighting if
it detects that the output `IoStream` is not the terminal. Knowing the
output streams will also allow background job output to be redirected
more easily and efficiently.
# User-Facing Changes
- External commands returned from closures will be collected (in most
cases):
```nushell
1..2 | each {|_| nu -c "print a" }
```
This gives `["a", "a"]` on this PR, whereas this used to print "a\na\n"
and then return an empty list.
```nushell
1..2 | each {|_| nu -c "print -e a" }
```
This gives `["", ""]` and prints "a\na\n" to stderr, whereas this used
to return an empty list and print "a\na\n" to stderr.
- Trailing new lines are always trimmed for external commands when
piping into internal commands or collecting it as a value. (Failure to
decode the output as utf-8 will keep the trailing newline for the last
binary value.) In the current nushell version, the following three code
snippets differ only in parenthesis placement, but they all also have
different outputs:
1. `1..2 | each { ^echo a }`
```
a
a
╭────────────╮
│ empty list │
╰────────────╯
```
2. `1..2 | each { (^echo a) }`
```
╭───┬───╮
│ 0 │ a │
│ 1 │ a │
╰───┴───╯
```
3. `1..2 | (each { ^echo a })`
```
╭───┬───╮
│ 0 │ a │
│ │ │
│ 1 │ a │
│ │ │
╰───┴───╯
```
But in this PR, the above snippets will all have the same output:
```
╭───┬───╮
│ 0 │ a │
│ 1 │ a │
╰───┴───╯
```
- All existing flags on `run-external` are now deprecated.
- File redirections now apply to all commands inside a code block:
```nushell
(nu -c "print -e a"; nu -c "print -e b") e> test.out
```
This gives "a\nb\n" in `test.out` and prints nothing. The same result
would happen when printing to stdout and using a `o>` file redirection.
- External command output will (almost) never be ignored, and ignoring
output must be explicit now:
```nushell
(^echo a; ^echo b)
```
This prints "a\nb\n", whereas this used to print only "b\n". This only
applies to external commands; values and internal commands not in return
position will not print anything (e.g., `(echo a; echo b)` still only
prints "b").
- `complete` now always captures stderr (`do` is not necessary).
# After Submitting
The language guide and other documentation will need to be updated.
# Description
Because `std::fs::canonicalize` requires the path to exist, this PR
makes it so that when canonicalizing any config file, the
`$nu.default-config-dir/nushell` part is canonicalized first, then
`$nu.default-config-dir/nushell/foo.nu` is canonicalized.
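A minimal Rust sketch of that ordering, assuming a hypothetical helper (not
the exact code in this PR):
```rust
use std::io;
use std::path::{Path, PathBuf};

/// Sketch of the approach: canonicalize the (existing) config directory
/// first, then append the file name, so a missing config file doesn't make
/// `std::fs::canonicalize` fail on the whole path.
fn canonicalize_config_file(config_dir: &Path, file_name: &str) -> io::Result<PathBuf> {
    // Canonicalize the directory first (it must exist)...
    let canonical_dir = config_dir.canonicalize()?;
    let file = canonical_dir.join(file_name);
    // ...then canonicalize the full path, falling back to the joined path if
    // the file itself doesn't exist yet.
    Ok(file.canonicalize().unwrap_or(file))
}
```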
This should also fix the issue @devyn pointed out
[here](https://github.com/nushell/nushell/pull/12118#issuecomment-1989546708)
where a couple of the tests failed if one's `~/.config/nushell` folder
was actually a symlink to a different folder. The tests previously
didn't canonicalize the expected paths.
I was going to make a PR that caches the config directory on startup (as
suggested by fdncred and Ian in Discord), but I can make that part of
this PR if we want to avoid creating unnecessary PRs. I think it
probably makes more sense to separate them though.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Adds support for the following operations on plugin custom values, in
addition to `to_base_value` which was already present:
- `follow_path_int()`
- `follow_path_string()`
- `partial_cmp()`
- `operation()`
- `Drop` (notification, if opted into with
`CustomValue::notify_plugin_on_drop`)
There are additionally customizable methods within the `Plugin` and
`StreamingPlugin` traits for implementing these functions in a way that
requires access to the plugin state, as a registered handle model such
as might be used in a dataframes plugin would.
`Value::append` was also changed to handle custom values correctly.
# User-Facing Changes
- The signatures of `CustomValue::follow_path_string` and
`CustomValue::follow_path_int` changed to give access to the span of the
custom value itself, which is useful for some errors.
- Plugins using custom values have to be recompiled because the engine
will try to do custom value operations that aren't supported.
- Plugins can do more things 🎉
# Tests + Formatting
Tests were added for all of the new custom values functionality.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Document protocol reference `CustomValueOp` variants:
- [ ] `FollowPathInt`
- [ ] `FollowPathString`
- [ ] `PartialCmp`
- [ ] `Operation`
- [ ] `Dropped`
- [ ] Document `notify_on_drop` optional field in `PluginCustomValue`
# Description
Fixes some ignored clippy lints.
# User-Facing Changes
Changes some signatures and return types to `&dyn Command` instead of
`&Box<dyn Command>`, but I believe this is only an internal change.
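For illustration, a small example of why clippy prefers `&dyn Command`
over `&Box<dyn Command>`; `Command` here is a stand-in trait, not the
real nushell one:
```rust
trait Command {
    fn name(&self) -> &str;
}

struct Ls;

impl Command for Ls {
    fn name(&self) -> &str {
        "ls"
    }
}

// Taking &dyn Command works for any ownership of the trait object;
// a &Box<dyn Command> parameter would only accept the boxed form.
fn describe(cmd: &dyn Command) -> String {
    format!("command: {}", cmd.name())
}

fn main() {
    let boxed: Box<dyn Command> = Box::new(Ls);
    println!("{}", describe(boxed.as_ref())); // boxed trait object
    println!("{}", describe(&Ls));            // plain reference
}
```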
Part of a doc comment was an implementation note about the `im` crate,
which hasn't been used for ages. (If I recall correctly, we even
received a comment about this on Discord.)
<!--
if this PR closes one or more issues, you can automatically link the PR
with
them by using one of the [*linking
keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword),
e.g.
- this PR should close #xxxx
- fixes #xxxx
you can also mention related issues, PRs or discussions!
-->
Closes #12103
# Description
<!--
Thank you for improving Nushell. Please, check our [contributing
guide](../CONTRIBUTING.md) and talk to the core team before making major
changes.
Description of your pull request goes here. **Provide examples and/or
screenshots** if your changes affect the user experience.
-->
As described in #12103, this PR makes Nushell use `XDG_CONFIG_HOME` as
the config directory if it exists. Otherwise, it uses the old behavior,
which was to use `dirs_next::config_dir()`.
Edit: We discussed choosing between `XDG_CONFIG_HOME` and the default
config directory in Discord and decided against it, at least for now.
<s>@kubouch also suggested letting users choose between
`XDG_CONFIG_HOME` and the default config directory if config files
aren't found on startup and `XDG_CONFIG_HOME` is set to a value
different from the default config directory</s>
On Windows and macOS, if the `XDG_CONFIG_HOME` variable is set but the
directory it points to is either empty or doesn't exist, *and* the old
config directory is non-empty, Nushell will issue a warning on startup
saying that it won't move files from the old config directory to the
new one.
To do this, I had to add a `nu_path::config_dir_old()` function. I
assume that at some point, we will remove the warning message and the
function can be removed too. Alternatively, instead of having that
function there, `main.rs` could directly call `dirs_next::config_dir()`.
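Roughly, the selection logic described above looks like the following
sketch (the function name is made up for illustration, and this is not
the exact `nu-path` implementation; Nushell then appends `nushell` to
whichever directory is chosen):
```rust
use std::path::PathBuf;

fn pick_config_root() -> Option<PathBuf> {
    match std::env::var("XDG_CONFIG_HOME") {
        Ok(dir) if !dir.is_empty() => {
            let path = PathBuf::from(dir);
            if path.is_absolute() {
                Some(path)
            } else {
                // The real implementation reports an error on startup
                // here and falls back to the default directory.
                dirs_next::config_dir()
            }
        }
        // Unset or empty: old behavior via dirs_next.
        _ => dirs_next::config_dir(),
    }
}

fn main() {
    match pick_config_root() {
        Some(dir) => println!("config root: {}", dir.display()),
        None => eprintln!("no config directory found"),
    }
}
```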
# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->
When `$env.XDG_CONFIG_HOME` is set to an absolute path, Nushell will use
`$"($env.XDG_CONFIG_HOME)/nushell"` as its config directory (previously,
this only worked on Linux).
To use `AppData\Roaming` (Windows) or `Library/Application Support`
(macOS) instead (the old behavior), one can either leave
`XDG_CONFIG_HOME` unset or set it to an empty string.
If `XDG_CONFIG_HOME` is set, but to a non-absolute/invalid path, Nushell
will report an error on startup and use the default config directory
instead:
![image](https://github.com/nushell/nushell/assets/45539777/a434fe04-b7c8-4e95-b50c-80628008ad08)
On Windows and macOS, if the `XDG_CONFIG_HOME` variable is set but the
directory it points to is either empty or doesn't exist, *and* the old
config directory is non-empty, Nushell will issue a warning on startup
saying that it won't move files from the old config directory to the
new one.
![image](https://github.com/nushell/nushell/assets/45539777/1686cc17-4083-4c12-aecf-1d832460ca57)
# Tests + Formatting
<!--
Don't forget to add tests that cover your changes.
Make sure you've run and fixed any issues with these commands:
- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes)
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style
- `cargo test --workspace` to check that all tests pass (on Windows make
sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library
> **Note**
> from `nushell` you can also use the `toolkit` as follows
> ```bash
> use toolkit.nu # or use an `env_change` hook to activate it
automatically
> toolkit check pr
> ```
-->
The existing config path tests have been modified to use
`XDG_CONFIG_HOME` to change the config directory on all OSes, not just
Linux.
# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
The documentation will have to be updated to note that Nushell uses
`XDG_CONFIG_HOME` now. As @fdncred pointed out, it's possible for people
to set `XDG_CONFIG_HOME` to, say, `~/.config/nushell` rather than
`~/.config`, so the documentation could warn about that mistake.
This is partially "feng-shui programming": moving things to new,
separate places.
The later commits include "`git blame` tollbooths" by moving chunks of
code out into new files, which requires an extra step to track things
with `git blame`. We can negotiate if you want to keep particular
things in their original place.
Where things were egregious, I tried to add a bit of documentation. If
I see something that is unused or unnecessarily `pub`, I will try to
remove it.
- Move `nu_protocol::Exportable` to `nu-parser`
- Guess doccomment for `Exportable`
- Move `Unit` enum from `value` to `AST`
- Move engine state `Variable` def into its folder
- Move error-related files in `nu-protocol` subdir
- Move `pipeline_data` module into its own folder
- Move `stream.rs` over into the `pipeline_data` mod
- Move `PipelineMetadata` into its own file
- Doccomment `PipelineMetadata`
- Remove unused `is_leap_year` in `value/mod`
- Note about criminal `type_compatible` helper
- Move duration fmting into new `value/duration.rs`
- Move filesize fmting logic to new `value/filesize`
- Split reexports from standard imports in `value/mod`
- Doccomment trait `CustomValue`
- Polish doccomments and intradoc links
# Description
This PR uses the new plugin protocol to intelligently keep plugin
processes running in the background for further plugin calls.
Running plugins can be seen by running the new `plugin list` command,
and stopped by running the new `plugin stop` command.
This is an enhancement for the performance of plugins, as starting new
plugin processes has overhead, especially for plugins in languages that
take a significant amount of time on startup. It also enables plugins
that have persistent state between commands, making the migration of
features like dataframes and `stor` to plugins possible.
Plugins are automatically stopped by the new plugin garbage collector,
configurable with `$env.config.plugin_gc`:
```nushell
$env.config.plugin_gc = {
    # Configuration for plugin garbage collection
    default: {
        enabled: true # true to enable stopping of inactive plugins
        stop_after: 10sec # how long to wait after a plugin is inactive to stop it
    }
    plugins: {
        # alternate configuration for specific plugins, by name, for example:
        #
        # gstat: {
        #     enabled: false
        # }
    }
}
```
If garbage collection is enabled, plugins will be stopped once
`stop_after` has passed since they were last active. Plugins count as
inactive if they have no running plugin calls. Reading the stream from
the response of a plugin call still counts as activity; however, if a
plugin holds on to a stream after the call has ended without an active
streaming response, the plugin is not counted as active even if it is
still reading that stream. Plugins can explicitly disable the GC as
appropriate with `engine.set_gc_disabled(true)`.
The `version` command now lists plugin names rather than plugin
commands. The list of plugin commands is accessible via `plugin list`.
I recommend doing this together with #12029, because it will likely
force plugin developers to do the right thing with mutability and lead
to less unexpected behavior when running plugins nested or in parallel.
# User-Facing Changes
- new command: `plugin list`
- new command: `plugin stop`
- changed command: `version` (now lists plugin names, rather than
commands)
- new config: `$env.config.plugin_gc`
- Plugins will keep running and be reused, at least for the configured
GC period
- Plugins that used mutable state in weird ways like `inc` did might
misbehave until fixed
- Plugins can disable GC if they need to
- Had to change plugin signature to accept `&EngineInterface` so that
the GC disable feature works. #12029 does this anyway, and I'm expecting
(resolvable) conflicts with that
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
Because there is some specific OS behavior required for plugins to not
respond to Ctrl-C directly, I've developed against and tested on both
Linux and Windows to ensure that works properly.
# After Submitting
I think this probably needs to be in the book somewhere
# Description
This allows plugins to make calls back to the engine to get config,
evaluate closures, and do other things that must be done within the
engine process.
Engine calls can both produce and consume streams as necessary. Closures
passed to plugins can both accept stream input and produce stream output
sent back to the plugin.
Engine calls referring to a plugin call's context can be processed as
long as either the response hasn't been received, or the response
created streams that haven't ended yet.
This is a breaking API change for plugins. There are some pretty major
changes to the interface that plugins must implement, including:
1. Plugins now run with `&self` and must be `Sync`. Executing multiple
plugin calls in parallel is supported, and there's a chance that a
closure passed to a plugin could invoke the same plugin. Supporting
state across plugin invocations is left up to the plugin author to do in
whichever way they feel best, but the plugin object itself is still
shared. Even though the engine doesn't run multiple plugin calls through
the same process yet, I still considered it important to break the API
in this way at this stage. We might want to consider an optional
threadpool feature for performance.
2. Plugins take a reference to `EngineInterface`, which can be cloned.
This interface allows plugins to make calls back to the engine,
including for getting config and running closures.
3. Plugins no longer take the `config` parameter. This can be accessed
from the interface via the `.get_plugin_config()` engine call. (A rough
sketch of the new shape follows this list.)
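Purely illustrative, using made-up stand-in types rather than the real
`nu-plugin` definitions: the plugin is shared behind `&self` and must be
`Sync`, and configuration comes from the engine handle instead of a
`config` parameter.
```rust
use std::sync::Mutex;

struct EngineHandle; // stand-in for EngineInterface

impl EngineHandle {
    fn get_plugin_config(&self) -> Option<String> {
        None // a real engine call would round-trip to the engine process
    }
}

trait MiniPlugin: Sync {
    fn run(&self, engine: &EngineHandle, input: &str) -> String;
}

struct Inc {
    // Any shared mutable state must now be made thread-safe by the author.
    counter: Mutex<u64>,
}

impl MiniPlugin for Inc {
    fn run(&self, engine: &EngineHandle, input: &str) -> String {
        let _config = engine.get_plugin_config();
        let mut calls = self.counter.lock().unwrap();
        *calls += 1;
        format!("{input} (call #{})", *calls)
    }
}

fn main() {
    let plugin = Inc { counter: Mutex::new(0) };
    println!("{}", plugin.run(&EngineHandle, "hello"));
}
```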
# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->
Not only does this have plugin protocol changes, it will require plugins
to make some code changes before they will work again. But on the plus
side, the engine call feature is extensible, and we can add more things
to it as needed.
Plugin maintainers will have to change the trait signature at the very
least. If they were using `config`, they will have to call
`engine.get_plugin_config()` instead.
If they were using the mutable reference to the plugin, they will have
to come up with some strategy to work around it (for example, for `Inc`
I just cloned it). This shouldn't be such a big deal at the moment as
it's not like plugins have ever run as daemons with persistent state in
the past, and they don't in this PR either. But I thought it was
important to make the change before we support plugins as daemons, as an
exclusive mutable reference is not compatible with parallel plugin
calls.
I suggest this gets merged sometime *after* the current pending release,
so that we have some time to adjust to the previous plugin protocol
changes that don't require code changes before making ones that do.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
I will document the additional protocol features (`EngineCall`,
`EngineCallResponse`), and constraints on plugin call processing if
engine calls are used - basically, to be aware that an engine call could
result in a nested plugin call, so the plugin should be able to handle
that.
This is another attempt on #11288
This allows for a `Stack` to have a parent stack (behind an `Arc`). This
is being added to avoid constant stack copying in REPL code.
Concretely the following changes are included here:
- `Stack` can now have a `parent_stack`, pointing to another stack
- variable lookups can fall back to this parent stack (env vars and
everything else is still copied); a sketch of this fallback follows the
list
- REPL code has been reworked so that we use parenting rather than
cloning. A REPL-code-specific trait helps to ensure that we do not
accidentally trigger a full clone of the main stack
- A property test has been added to make sure that parenting "looks the
same" as cloning for consumers of `Stack` objects
---------
Co-authored-by: Raphael Gaschignard <rtpg@rokkenjima.local>
Co-authored-by: Ian Manske <ian.manske@pm.me>
<!--
if this PR closes one or more issues, you can automatically link the PR
with
them by using one of the [*linking
keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword),
e.g.
- this PR should close #xxxx
- fixes #xxxx
you can also mention related issues, PRs or discussions!
-->
# Description
<!--
Thank you for improving Nushell. Please, check our [contributing
guide](../CONTRIBUTING.md) and talk to the core team before making major
changes.
Description of your pull request goes here. **Provide examples and/or
screenshots** if your changes affect the user experience.
-->
This PR adds a new evaluator path with callbacks to a mutable trait
object implementing a Debugger trait. The trait object can do anything,
e.g., profiling, code coverage, step debugging. Currently,
entering/leaving a block and a pipeline element is marked with
callbacks, but more callbacks can be added as necessary. Not all
callbacks need to be used by all debuggers; unused ones are simply empty
calls. A simple profiler is implemented as a proof of concept.
The debugging support is implemented by making the `eval_xxx()`
functions generic over whether we're debugging or not. This has zero
computational overhead, but makes the binary slightly larger (see
benchmarks below). The `eval_xxx()` variants called from commands (like
`eval_block_with_early_return()` in `each`) are chosen with dynamic
dispatch, for two reasons: it avoids growing the binary size by
duplicating the code of many commands, and making commands generic
isn't possible anyway, because it would make `Command` trait objects
object-unsafe.
In the future, I hope it will be possible to allow plugin callbacks such
that users would be able to implement their profiler plugins instead of
having to recompile Nushell.
[DAP](https://microsoft.github.io/debug-adapter-protocol/) would also be
interesting to explore.
Try `help debug profile`.
## Screenshots
Basic output:
![profiler_new](https://github.com/nushell/nushell/assets/25571562/418b9df0-b659-4dcb-b023-2d5fcef2c865)
To profile with more granularity, increase the profiler depth (you'll
see that repeated `is-windows` calls take a large chunk of total time,
making it a good candidate for optimizing):
![profiler_new_m3](https://github.com/nushell/nushell/assets/25571562/636d756d-5d56-460c-a372-14716f65f37f)
## Benchmarks
### Binary size
Binary size increase vs. main: **+40360 bytes**. _(Both built with
`--release --features=extra,dataframe`.)_
### Time
```nushell
# bench_debug.nu
use std bench
let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}
print 'debug:'
let res2 = bench { debug profile $test } --pretty
print $res2
```
```nushell
# bench_nodebug.nu
use std bench
let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}
print 'no debug:'
let res1 = bench { do $test } --pretty
print $res1
```
`cargo run --release -- bench_debug.nu` is consistently 1-2 ms slower
than `cargo run --release -- bench_nodebug.nu` due to the collection
overhead plus gathering the report. This is expected; when gathering
more data, the overhead is obviously higher.
Comparing `cargo run --release -- bench_nodebug.nu` with
`nu bench_nodebug.nu`, I didn't measure any difference. Both benchmarks
report times between 97 and 103 ms randomly, without one being
consistently higher than the other. This suggests that, at least in
this particular case, there is no runtime overhead when not running any
debugger.
## API changes
This PR adds a generic parameter to all `eval_xxx` functions that forces
you to specify whether you use the debugger. You can resolve it in two
ways:
* Use a provided helper that will figure it out for you. If you wanted
to use `eval_block(&engine_state, ...)`, call `let eval_block =
get_eval_block(&engine_state); eval_block(&engine_state, ...)`
* If you know you're in an evaluation path that doesn't need debugger
support, call `eval_block::<WithoutDebug>(&engine_state, ...)` (this is
the case of hooks, for example).
I tried to add more explanation in the docstring of `debugger_trait.rs`.
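As a condensed sketch of the pattern (simplified names, not the real
`nu-protocol` types), the debug callbacks are resolved at compile time,
so the `WithoutDebug` path costs nothing at runtime:
```rust
trait DebugContext {
    // Unused callbacks default to empty calls.
    fn enter_block(_name: &str) {}
}

struct WithoutDebug;
impl DebugContext for WithoutDebug {}

struct WithProfiler;
impl DebugContext for WithProfiler {
    fn enter_block(name: &str) {
        println!("entering block: {name}");
    }
}

// Generic over the debug context: monomorphized per context, so the
// non-debug variant compiles the callback away entirely.
fn eval_block<D: DebugContext>(name: &str) {
    D::enter_block(name);
    // ... evaluate the block ...
}

fn main() {
    eval_block::<WithoutDebug>("hooks");    // evaluation path without debugging
    eval_block::<WithProfiler>("profiled"); // callbacks active
}
```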
## TODO
- [x] Better profiler output to reduce spam of iterative commands like
`each`
- [x] Resolve `TODO: DEBUG` comments
- [x] Resolve unwraps
- [x] Add doc comments
- [x] Add usage and extra usage for `debug profile`, explaining all
columns
# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->
Hopefully none.
# Tests + Formatting
<!--
Don't forget to add tests that cover your changes.
Make sure you've run and fixed any issues with these commands:
- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes)
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style
- `cargo test --workspace` to check that all tests pass (on Windows make
sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library
> **Note**
> from `nushell` you can also use the `toolkit` as follows
> ```bash
> use toolkit.nu # or use an `env_change` hook to activate it
automatically
> toolkit check pr
> ```
-->
# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
# Description
This PR introduces [workspaces
dependencies](https://doc.rust-lang.org/cargo/reference/workspaces.html#the-dependencies-table).
The advantages are:
- a single place where dependency versions are declared
- reduces the number of files to change when upgrading a dependency
- reduces the risk of accidentally depending on 2 different versions of
the same dependency
I've only done a few so far. If this PR is accepted, I might continue
and progressively do the rest.
# User-Facing Changes
N/A
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
N/A
<!--
if this PR closes one or more issues, you can automatically link the PR
with
them by using one of the [*linking
keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword),
e.g.
- this PR should close #xxxx
- fixes #xxxx
you can also mention related issues, PRs or discussions!
-->
# Description
<!--
Thank you for improving Nushell. Please, check our [contributing
guide](../CONTRIBUTING.md) and talk to the core team before making major
changes.
Description of your pull request goes here. **Provide examples and/or
screenshots** if your changes affect the user experience.
-->
This PR makes sure `$nu.default-config-dir` and `$nu.plugin-path` are
canonicalized.
# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->
`$nu.default-config-dir` (and `$nu.plugin-path`) will now give canonical
paths, with symlinks and whatnot resolved.
# Tests + Formatting
<!--
Don't forget to add tests that cover your changes.
Make sure you've run and fixed any issues with these commands:
- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes)
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style
- `cargo test --workspace` to check that all tests pass (on Windows make
sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library
> **Note**
> from `nushell` you can also use the `toolkit` as follows
> ```bash
> use toolkit.nu # or use an `env_change` hook to activate it
automatically
> toolkit check pr
> ```
-->
I've added a couple of tests to check that even if the config folder
and/or any of the config files within are symlinks, the `$nu.*`
variables are properly canonicalized. These tests unfortunately only
run on Linux and macOS, because I couldn't figure out how to change the
config directory on Windows. Also, given that they involve creating
files, I'm not sure whether they're excessive, so I could remove one or
two of them.
# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
# Description
Replace panics with errors in thread spawning.
Also adds `IntoSpanned` trait for easily constructing `Spanned`, and an
implementation of `From<Spanned<std::io::Error>>` for `ShellError`,
which is used to provide context for the error wherever there was a span
conveniently available. In general this should make it more convenient
to do the right thing with `std::io::Error` and always add a span to it
when it's possible to do so.
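Conceptually, the trait looks something like the following sketch (the
real definitions live in `nu-protocol` and may differ in detail):
```rust
#[derive(Debug, Clone, Copy)]
struct Span {
    start: usize,
    end: usize,
}

#[derive(Debug)]
struct Spanned<T> {
    item: T,
    span: Span,
}

// Pair any value with a span; for io::Error this gives the conversion
// into a shell error enough context to point at the offending code.
trait IntoSpanned: Sized {
    fn into_spanned(self, span: Span) -> Spanned<Self> {
        Spanned { item: self, span }
    }
}

impl<T> IntoSpanned for T {}

fn main() {
    let err = std::io::Error::new(std::io::ErrorKind::NotFound, "missing file");
    let spanned = err.into_spanned(Span { start: 10, end: 20 });
    // From<Spanned<std::io::Error>> for ShellError would consume both parts.
    println!("{:?} at {:?}", spanned.item, spanned.span);
}
```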
# User-Facing Changes
Fewer panics!
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# Description
As the title says, on the latest main Nushell confuses users by
allowing implicit casting between glob and string:
```nushell
let x = "*.txt"
def glob-test [g: glob] { open $g }
glob-test $x
```
It always expands the glob even though `$x` is defined as a string.
This PR implements a solution from @kubouch:
> We could make it really strict and disallow all autocasting between
globs and strings because that's what's causing the "magic" confusion.
Then, modify all builtins that accept globs to accept oneof(glob,
string) and the rules would be that globs always expand and strings
never expand
# User-Facing Changes
After this PR, users need to use `into glob` to invoke `glob-test`
when passing a string variable:
```nushell
let x = "*.txt"
def glob-test [g: glob] { open $g }
glob-test ($x | into glob)
```
Otherwise, Nushell will return an error:
```
 3 │ glob-test $x
   ·           ─┬
   ·            ╰── can't convert string to glob
```
# Tests + Formatting
Done
# After Submitting
None
<!--
if this PR closes one or more issues, you can automatically link the PR
with
them by using one of the [*linking
keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword),
e.g.
- this PR should close #xxxx
- fixes #xxxx
you can also mention related issues, PRs or discussions!
-->
# Description
<!--
Thank you for improving Nushell. Please, check our [contributing
guide](../CONTRIBUTING.md) and talk to the core team before making major
changes.
Description of your pull request goes here. **Provide examples and/or
screenshots** if your changes affect the user experience.
-->
Currently, there are multiple places that look for a config directory,
and each of them has a different error message when it can't be found.
This PR adds a `ConfigDirNotFound` error to standardize the error
message for all of these cases.
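For illustration only (not the real `ShellError` definition), the
benefit is that every call site that fails to locate the config
directory produces the same, consistently worded error:
```rust
use std::path::PathBuf;

#[derive(Debug)]
enum ConfigError {
    ConfigDirNotFound,
}

impl std::fmt::Display for ConfigError {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        match self {
            // One variant, one message, regardless of which file was wanted.
            ConfigError::ConfigDirNotFound => write!(f, "Could not find config directory"),
        }
    }
}

fn config_dir() -> Result<PathBuf, ConfigError> {
    dirs_next::config_dir().ok_or(ConfigError::ConfigDirNotFound)
}

fn main() {
    match config_dir() {
        Ok(dir) => println!("config dir: {}", dir.display()),
        Err(err) => eprintln!("{err}"),
    }
}
```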
# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->
Previously, the errors in `create_nu_constant()` would say which
config file Nushell was trying to get when it couldn't find the config
directory. Now they don't. However, I think that's fine: it doesn't
matter whether the config directory couldn't be found while looking for
`login.nu` or `env.nu`; it only matters that it couldn't be found.
This is what the error looks like:
![image](https://github.com/nushell/nushell/assets/45539777/52298ed4-f9e9-4900-bb94-1154d389efa7)
# Tests + Formatting
<!--
Don't forget to add tests that cover your changes.
Make sure you've run and fixed any issues with these commands:
- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes)
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style
- `cargo test --workspace` to check that all tests pass (on Windows make
sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library
> **Note**
> from `nushell` you can also use the `toolkit` as follows
> ```bash
> use toolkit.nu # or use an `env_change` hook to activate it
automatically
> toolkit check pr
> ```
-->
# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
---------
Co-authored-by: Antoine Stevan <44101798+amtoine@users.noreply.github.com>
Bumps [strum_macros](https://github.com/Peternator7/strum) from 0.25.3
to 0.26.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/Peternator7/strum/releases">strum_macros's
releases</a>.</em></p>
<blockquote>
<h2>v0.26.1</h2>
<h2>0.26.1</h2>
<ul>
<li><a
href="https://redirect.github.com/Peternator7/strum/pull/325">#325</a>:
use <code>core</code> instead of <code>std</code> in VariantArray.</li>
</ul>
<h2>0.26.0</h2>
<h3>Breaking Changes</h3>
<ul>
<li>The <code>EnumVariantNames</code> macro has been renamed
<code>VariantNames</code>. The deprecation warning should steer you in
the right direction for fixing the warning.</li>
<li>The Iterator struct generated by EnumIter now has new bounds on it.
This shouldn't break code unless you manually
added the implementation in your code.</li>
<li><code>Display</code> now supports format strings using named fields
in the enum variant. This should be a no-op for most code.
However, if you were outputting a string like <code>"Hello
{field}"</code>, this will now be interpretted as a format
string.</li>
<li>EnumDiscriminant now inherits the repr and discriminant values from
your main enum. This makes the discriminant type
closer to a mirror of the original and that's always the goal.</li>
</ul>
<h3>New features</h3>
<ul>
<li>
<p>The <code>VariantArray</code> macro has been added. This macro adds
an associated constant <code>VARIANTS</code> to your enum. The constant
is a <code>&'static [Self]</code> slice so that you can access all
the variants of your enum. This only works on enums that only
have unit variants.</p>
<pre lang="rust"><code>use strum::VariantArray;
<p>#[derive(Debug, VariantArray)]
enum Color {
Red,
Blue,
Green,
}</p>
<p>fn main() {
println!("{:?}", Color::VARIANTS); // prints:
["Red", "Blue", "Green"]
}
</code></pre></p>
</li>
<li>
<p>The <code>EnumTable</code> macro has been <em>experimentally</em>
added. This macro adds a new type that stores an item for each variant
of the enum. This is useful for storing a value for each variant of an
enum. This is an experimental feature because
I'm not convinced the current api surface area is correct.</p>
<pre lang="rust"><code>use strum::EnumTable;
<p>#[derive(Copy, Clone, Debug, EnumTable)]
enum Color {
Red,
Blue,
</code></pre></p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/Peternator7/strum/blob/master/CHANGELOG.md">strum_macros's
changelog</a>.</em></p>
<blockquote>
<h2>0.26.1</h2>
<ul>
<li><a
href="https://redirect.github.com/Peternator7/strum/pull/325">#325</a>:
use <code>core</code> instead of <code>std</code> in VariantArray.</li>
</ul>
<h2>0.26.0</h2>
<h3>Breaking Changes</h3>
<ul>
<li>The <code>EnumVariantNames</code> macro has been renamed
<code>VariantNames</code>. The deprecation warning should steer you in
the right direction for fixing the warning.</li>
<li>The Iterator struct generated by EnumIter now has new bounds on it.
This shouldn't break code unless you manually
added the implementation in your code.</li>
<li><code>Display</code> now supports format strings using named fields
in the enum variant. This should be a no-op for most code.
However, if you were outputting a string like <code>"Hello
{field}"</code>, this will now be interpretted as a format
string.</li>
<li>EnumDiscriminant now inherits the repr and discriminant values from
your main enum. This makes the discriminant type
closer to a mirror of the original and that's always the goal.</li>
</ul>
<h3>New features</h3>
<ul>
<li>
<p>The <code>VariantArray</code> macro has been added. This macro adds
an associated constant <code>VARIANTS</code> to your enum. The constant
is a <code>&'static [Self]</code> slice so that you can access all
the variants of your enum. This only works on enums that only
have unit variants.</p>
<pre lang="rust"><code>use strum::VariantArray;
<p>#[derive(Debug, VariantArray)]
enum Color {
Red,
Blue,
Green,
}</p>
<p>fn main() {
println!("{:?}", Color::VARIANTS); // prints:
["Red", "Blue", "Green"]
}
</code></pre></p>
</li>
<li>
<p>The <code>EnumTable</code> macro has been <em>experimentally</em>
added. This macro adds a new type that stores an item for each variant
of the enum. This is useful for storing a value for each variant of an
enum. This is an experimental feature because
I'm not convinced the current api surface area is correct.</p>
<pre lang="rust"><code>use strum::EnumTable;
<p>#[derive(Copy, Clone, Debug, EnumTable)]
enum Color {
Red,
Blue,
Green,
</code></pre></p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/Peternator7/strum/commits/v0.26.1">compare
view</a></li>
</ul>
</details>
<br />
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=strum_macros&package-manager=cargo&previous-version=0.25.3&new-version=0.26.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
<details>
<summary>Dependabot commands and options</summary>
<br />
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
</details>
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
[Discord
context](https://discord.com/channels/601130461678272522/615962413203718156/1211158641793695744)
<!--
if this PR closes one or more issues, you can automatically link the PR
with
them by using one of the [*linking
keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword),
e.g.
- this PR should close #xxxx
- fixes #xxxx
you can also mention related issues, PRs or discussions!
-->
# Description
<!--
Thank you for improving Nushell. Please, check our [contributing
guide](../CONTRIBUTING.md) and talk to the core team before making major
changes.
Description of your pull request goes here. **Provide examples and/or
screenshots** if your changes affect the user experience.
-->
Span fields were previously renamed to `internal_span` to discourage
their use in Rust code, but this change also affected the serde I/O for
Value. I don't believe the Python plugin was ever updated to reflect
this change.
This effectively changes it back, but just for the serialized form.
There are good reasons for doing this:
1. `internal_span` is a much longer name, and would be one of the most
common strings found in serialized Value data, probably bulking up the
plugin I/O
2. This change was never really meant to have implications for plugins,
and was just meant to be a hint that `.span()` should be used instead in
Rust code.
When Span refactoring is complete, the serialized form of Value will
probably change again in some significant way, so I think for now it's
best that it's left like this.
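A minimal sketch of decoupling the serialized name from the Rust field
name, using plain serde; the real `Value` implementation is more
involved, so this is only the idea:
```rust
use serde::Serialize;

#[derive(Serialize)]
struct IntValue {
    val: i64,
    // Rust code keeps seeing `internal_span`; plugin I/O sees `span`.
    #[serde(rename = "span")]
    internal_span: (usize, usize),
}

fn main() {
    let value = IntValue { val: 1, internal_span: (0, 3) };
    // Prints: {"val":1,"span":[0,3]}
    println!("{}", serde_json::to_string(&value).unwrap());
}
```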
This has implications for #11911, particularly for documentation and for
the Python plugin as that was already updated in that PR to reflect
`internal_span`. If this is merged first, I will update that PR.
This would probably be considered a breaking change as it would break
plugin I/O compatibility (but not Rust code). I think it can probably go
in any major release though - all things considered, it's pretty minor,
and users are already expected to recompile plugins for new major
versions. However, it may also be worth holding off to do it together
with #11911 as that PR makes breaking changes in general a little bit
friendlier.
# User-Facing Changes
<!-- List of all changes that impact the user experience here. This
helps us keep track of breaking changes. -->
Requires plugin recompile.
# Tests + Formatting
<!--
Don't forget to add tests that cover your changes.
Make sure you've run and fixed any issues with these commands:
- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes)
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style
- `cargo test --workspace` to check that all tests pass (on Windows make
sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library
> **Note**
> from `nushell` you can also use the `toolkit` as follows
> ```bash
> use toolkit.nu # or use an `env_change` hook to activate it
automatically
> toolkit check pr
> ```
-->
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
Nothing outside of `Value` itself had to be changed to make tests pass.
I did not check whether the Python plugin works now, but it was broken
before. It may work again, as I think the main incompatibility it had
was expecting to use `span`.
# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
Avoid unnecessary allocations or larger iterator structs; a few of
these patterns are sketched below.
- Turn static `Vec`s into arrays when possible
- Use `std::iter::once`/`empty` where applicable
- Use `bool::then_some` in the `.chain` of `detect column`
- Drop in the bucket: de-vec-ing tests
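A few of these patterns in isolation, as a rough sketch rather than the
actual diff:
```rust
fn main() {
    // Static Vec -> array: no heap allocation for a fixed set of items.
    let units: [&str; 3] = ["b", "kb", "mb"];

    // std::iter::once / std::iter::empty instead of allocating a Vec just to iterate it.
    let first = std::iter::once(1);
    let rest: std::iter::Empty<i32> = std::iter::empty();
    let all: Vec<i32> = first.chain(rest).collect();

    // bool::then_some inside a chain: yield a value only when a condition holds.
    let wide = true;
    let extra_col = wide.then_some("extra");

    println!("{units:?} {all:?} {extra_col:?}");
}
```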