Part of the doccomment was an implementation note on the `im` crate that
hasn't been used for ages.
(If I recall correctly, we may even have received a comment about this on Discord.)
This is partially "feng-shui programming" of moving things to new
separate places.
The later commits include "`git blame` tollbooths" by moving chunks of code out into new files, which requires an extra step to track things with `git blame`. We can negotiate if you want to keep particular things in their original place.
Where things were egregious, I tried to add a bit of documentation. If I saw something that was unused or unnecessarily `pub`, I tried to remove it.
- Move `nu_protocol::Exportable` to `nu-parser`
- Guess doccomment for `Exportable`
- Move `Unit` enum from `value` to `AST`
- Move engine state `Variable` def into its folder
- Move error-related files in `nu-protocol` subdir
- Move `pipeline_data` module into its own folder
- Move `stream.rs` over into the `pipeline_data` mod
- Move `PipelineMetadata` into its own file
- Doccomment `PipelineMetadata`
- Remove unused `is_leap_year` in `value/mod`
- Note about criminal `type_compatible` helper
- Move duration fmting into new `value/duration.rs`
- Move filesize fmting logic to new `value/filesize`
- Split reexports from standard imports in `value/mod`
- Doccomment trait `CustomValue`
- Polish doccomments and intradoc links
# Description
This PR uses the new plugin protocol to intelligently keep plugin
processes running in the background for further plugin calls.
Running plugins can be seen by running the new `plugin list` command,
and stopped by running the new `plugin stop` command.
This is an enhancement for the performance of plugins, as starting new
plugin processes has overhead, especially for plugins in languages that
take a significant amount of time on startup. It also enables plugins
that have persistent state between commands, making the migration of
features like dataframes and `stor` to plugins possible.
Plugins are automatically stopped by the new plugin garbage collector,
configurable with `$env.config.plugin_gc`:
```nushell
$env.config.plugin_gc = {
    # Configuration for plugin garbage collection
    default: {
        enabled: true # true to enable stopping of inactive plugins
        stop_after: 10sec # how long to wait after a plugin is inactive to stop it
    }
    plugins: {
        # alternate configuration for specific plugins, by name, for example:
        #
        # gstat: {
        #     enabled: false
        # }
    }
}
```
If garbage collection is enabled, plugins are stopped once `stop_after` has elapsed since they were last active. Plugins count as inactive if they have no running plugin calls. Reading the stream from a plugin call's response still counts as activity, but if a plugin holds on to a stream and the call ends without an active streaming response, the plugin is not counted as active even if it is still reading that stream. Plugins can explicitly disable the GC where appropriate with `engine.set_gc_disabled(true)`.
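As a minimal sketch of that opt-out (only `engine.set_gc_disabled(true)` comes from this PR; the surrounding helper and error handling are illustrative assumptions):

```rust
use nu_plugin::EngineInterface;
use nu_protocol::ShellError;

// Hypothetical helper a plugin might call once it starts holding state that
// must outlive individual plugin calls; only `set_gc_disabled` is from this PR.
fn keep_alive(engine: &EngineInterface) -> Result<(), ShellError> {
    // Ask the plugin GC not to stop this plugin while it looks "inactive".
    engine.set_gc_disabled(true)
}
```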
The `version` command now lists plugin names rather than plugin
commands. The list of plugin commands is accessible via `plugin list`.
I recommend doing this together with #12029, because it will likely force plugin developers to do the right thing with mutability and lead to less unexpected behavior when running plugins nested or in parallel.
# User-Facing Changes
- new command: `plugin list`
- new command: `plugin stop`
- changed command: `version` (now lists plugin names, rather than
commands)
- new config: `$env.config.plugin_gc`
- Plugins will keep running and be reused, at least for the configured
GC period
- Plugins that used mutable state in unusual ways, as `inc` did, might misbehave until fixed
- Plugins can disable GC if they need to
- The plugin signature had to change to accept `&EngineInterface` so that the GC-disable feature works. #12029 does this anyway, and I'm expecting (resolvable) conflicts with it
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
Because there is some specific OS behavior required for plugins to not
respond to Ctrl-C directly, I've developed against and tested on both
Linux and Windows to ensure that works properly.
# After Submitting
This probably needs to be documented in the book somewhere.
# Description
This allows plugins to make calls back to the engine to get config,
evaluate closures, and do other things that must be done within the
engine process.
Engine calls can both produce and consume streams as necessary. Closures
passed to plugins can both accept stream input and produce stream output
sent back to the plugin.
Engine calls referring to a plugin call's context can be processed as long as either the response hasn't been received yet, or the response created streams that haven't ended yet.
This is a breaking API change for plugins. There are some pretty major
changes to the interface that plugins must implement, including:
1. Plugins now run with `&self` and must be `Sync`. Executing multiple
plugin calls in parallel is supported, and there's a chance that a
closure passed to a plugin could invoke the same plugin. Supporting
state across plugin invocations is left up to the plugin author to do in
whichever way they feel best, but the plugin object itself is still
shared. Even though the engine doesn't run multiple plugin calls through
the same process yet, I still considered it important to break the API
in this way at this stage. We might want to consider an optional
threadpool feature for performance.
2. Plugins take a reference to `EngineInterface`, which can be cloned.
This interface allows plugins to make calls back to the engine,
including for getting config and running closures.
3. Plugins no longer take the `config` parameter. This can be accessed from the interface via the `.get_plugin_config()` engine call (see the sketch below).
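To make the new shape concrete, here is a minimal sketch. The `&self` receiver, the `&EngineInterface` parameter, and the `.get_plugin_config()` engine call are taken from the points above; `MyPlugin` and the exact parameter and return types are illustrative assumptions:

```rust
use nu_plugin::{EngineInterface, EvaluatedCall, LabeledError};
use nu_protocol::Value;

struct MyPlugin;

impl MyPlugin {
    // Sketch only: the real trait method and its signature may differ.
    fn run(
        &self, // shared reference: the plugin object must be `Sync`
        engine: &EngineInterface,
        _call: &EvaluatedCall,
        input: &Value,
    ) -> Result<Value, LabeledError> {
        // `config` is no longer a parameter; it is fetched via an engine call.
        let _config = engine.get_plugin_config();
        Ok(input.clone())
    }
}
```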
# User-Facing Changes
Not only does this change the plugin protocol, it also requires plugins to make some code changes before they will work again. But on the plus side, the engine call feature is extensible, and we can add more things to it as needed.
Plugin maintainers will have to change the trait signature at the very
least. If they were using `config`, they will have to call
`engine.get_plugin_config()` instead.
If they were using the mutable reference to the plugin, they will have
to come up with some strategy to work around it (for example, for `Inc`
I just cloned it). This shouldn't be such a big deal at the moment as
it's not like plugins have ever run as daemons with persistent state in
the past, and they don't in this PR either. But I thought it was
important to make the change before we support plugins as daemons, as an
exclusive mutable reference is not compatible with parallel plugin
calls.
I suggest this gets merged sometime *after* the current pending release,
so that we have some time to adjust to the previous plugin protocol
changes that don't require code changes before making ones that do.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
I will document the additional protocol features (`EngineCall`,
`EngineCallResponse`), and constraints on plugin call processing if
engine calls are used - basically, to be aware that an engine call could
result in a nested plugin call, so the plugin should be able to handle
that.
This is another attempt at #11288.
This allows for a `Stack` to have a parent stack (behind an `Arc`). This
is being added to avoid constant stack copying in REPL code.
Concretely the following changes are included here:
- `Stack` can now have a `parent_stack`, pointing to another stack (sketched below)
- variable lookups can fall back to this parent stack (env vars and everything else are still copied)
- REPL code has been reworked so that we use parenting rather than
cloning. A REPL-code-specific trait helps to ensure that we do not
accidentally trigger a full clone of the main stack
- A property test has been added to make sure that parenting "looks the
same" as cloning for consumers of `Stack` objects
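As a conceptual sketch of the parenting idea (simplified stand-in types, not nushell's actual `Stack` definition):

```rust
use std::collections::HashMap;
use std::sync::Arc;

// Stand-in types: nushell maps VarId -> Value rather than usize -> i64.
struct Stack {
    vars: HashMap<usize, i64>,
    parent_stack: Option<Arc<Stack>>,
}

impl Stack {
    fn get_var(&self, var_id: usize) -> Option<i64> {
        self.vars.get(&var_id).copied().or_else(|| {
            // Fall back to the shared, read-only parent instead of having
            // cloned every variable into this stack up front.
            self.parent_stack.as_ref().and_then(|p| p.get_var(var_id))
        })
    }
}
```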
---------
Co-authored-by: Raphael Gaschignard <rtpg@rokkenjima.local>
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
This PR adds a new evaluator path with callbacks to a mutable trait
object implementing a Debugger trait. The trait object can do anything,
e.g., profiling, code coverage, step debugging. Currently,
entering/leaving a block and a pipeline element is marked with
callbacks, but more callbacks can be added as necessary. Not all
callbacks need to be used by all debuggers; unused ones are simply empty
calls. A simple profiler is implemented as a proof of concept.
The debugging support is implemented by making the `eval_xxx()` functions generic over whether we're debugging or not. This has zero computational overhead, but makes the binary slightly larger (see benchmarks below). The `eval_xxx()` variants called from commands (like `eval_block_with_early_return()` in `each`) are chosen with dynamic dispatch for two reasons: to avoid growing the binary by duplicating the code of many commands, and because making them generic isn't possible anyway, as it would make `Command` trait objects object-unsafe.
In the future, I hope it will be possible to allow plugin callbacks such
that users would be able to implement their profiler plugins instead of
having to recompile Nushell.
[DAP](https://microsoft.github.io/debug-adapter-protocol/) would also be
interesting to explore.
Try `help debug profile`.
## Screenshots
Basic output:

To profile with more granularity, increase the profiler depth (you'll
see that repeated `is-windows` calls take a large chunk of total time,
making it a good candidate for optimizing):

## Benchmarks
### Binary size
Binary size increase vs. main: **+40360 bytes**. _(Both built with
`--release --features=extra,dataframe`.)_
### Time
```nushell
# bench_debug.nu
use std bench
let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}
print 'debug:'
let res2 = bench { debug profile $test } --pretty
print $res2
```
```nushell
# bench_nodebug.nu
use std bench
let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}
print 'no debug:'
let res1 = bench { do $test } --pretty
print $res1
```
`cargo run --release -- bench_debug.nu` is consistently 1-2 ms slower
than `cargo run --release -- bench_nodebug.nu` due to the collection
overhead + gathering the report. This is expected. When gathering more
stuff, the overhead is obviously higher.
Comparing `cargo run --release -- bench_nodebug.nu` with `nu bench_nodebug.nu`, I didn't measure any difference. Both benchmarks report times between 97 and 103 ms randomly, without one being consistently higher than the other. This suggests that, at least in this particular case, there is no runtime overhead when no debugger is running.
## API changes
This PR adds a generic parameter to all `eval_xxx` functions that forces
you to specify whether you use the debugger. You can resolve it in two
ways:
* Use a provided helper that will figure it out for you. If you wanted
to use `eval_block(&engine_state, ...)`, call `let eval_block =
get_eval_block(&engine_state); eval_block(&engine_state, ...)`
* If you know you're in an evaluation path that doesn't need debugger support, call `eval_block::<WithoutDebug>(&engine_state, ...)` (this is the case for hooks, for example). See the sketch below.
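A sketch of both options. The helper name, the `WithoutDebug` marker, and the turbofish come from the description above; the exact argument list (including the redirect flags) is an assumption and may not match the current signature:

```rust
use nu_engine::{eval_block, get_eval_block};
use nu_protocol::ast::Block;
use nu_protocol::debugger::WithoutDebug;
use nu_protocol::engine::{EngineState, Stack};
use nu_protocol::{PipelineData, ShellError};

fn run_block(
    engine_state: &EngineState,
    stack: &mut Stack,
    block: &Block,
    input: PipelineData,
) -> Result<PipelineData, ShellError> {
    // Option 1: let the helper pick the instantiation for you.
    let eval_block_fn = get_eval_block(engine_state);
    eval_block_fn(engine_state, stack, block, input, false, false)

    // Option 2 (e.g. for hooks, which never need the debugger):
    // eval_block::<WithoutDebug>(engine_state, stack, block, input, false, false)
}
```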
I tried to add more explanation in the docstring of `debugger_trait.rs`.
## TODO
- [x] Better profiler output to reduce spam of iterative commands like
`each`
- [x] Resolve `TODO: DEBUG` comments
- [x] Resolve unwraps
- [x] Add doc comments
- [x] Add usage and extra usage for `debug profile`, explaining all
columns
# User-Facing Changes
Hopefully none.
# Tests + Formatting
# After Submitting
Provides the ability to cleanly recover from panics, falling back to the last known good state of `EngineState` and `Stack`. This pull request also utilizes miette's panic handler for better formatting of panics.
<img width="642" alt="Screenshot 2024-02-21 at 08 34 35"
src="https://github.com/nushell/nushell/assets/56345/f81efaba-aa45-4e47-991c-1a2cf99e06ff">
---------
Co-authored-by: Jack Wright <jack.wright@disqo.com>
# Description
This PR renames the conversion functions on `Value` to be more consistent.
It follows the Rust [API guidelines](https://rust-lang.github.io/api-guidelines/naming.html#ad-hoc-conversions-follow-as_-to_-into_-conventions-c-conv) for ad-hoc conversions.
The conversion functions on `Value` now come in a few forms:
- `coerce_{type}` takes a `&Value` and attempts to convert the value to
`type` (e.g., `i64` are converted to `f64`). This is the old behavior of
some of the `as_{type}` functions -- these functions have simply been
renamed to better reflect what they do.
- The new `as_{type}` functions take a `&Value` and return an `Ok` result only if the value is of `type` (no conversion is attempted). The returned value is borrowed if `type` is non-`Copy`; otherwise an owned value is returned.
- `into_{type}` exists for non-`Copy` types but, like `as_{type}`, does not attempt conversion. It takes an owned `Value` and always returns an owned result.
- `coerce_into_{type}` has the same relationship with `coerce_{type}` as `into_{type}` does with `as_{type}`.
- `to_{kind}_string`: conversion to different string formats (debug, abbreviated, etc.). Only two of the old string conversion functions were removed; the rest have only been renamed.
- `to_{type}`: other conversion functions. Currently, only `to_path`
exists. (And `to_string` through `Display`.)
This table summarizes the above:
| Form | Cost | Input Ownership | Output Ownership | Converts `Value` case/`type` |
| --- | --- | --- | --- | --- |
| `as_{type}` | Cheap | Borrowed | Borrowed/Owned | No |
| `into_{type}` | Cheap | Owned | Owned | No |
| `coerce_{type}` | Cheap | Borrowed | Borrowed/Owned | Yes |
| `coerce_into_{type}` | Cheap | Owned | Owned | Yes |
| `to_{kind}_string` | Expensive | Borrowed | Owned | Yes |
| `to_{type}` | Expensive | Borrowed | Owned | Yes |
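For illustration, a hedged sketch of the naming scheme in use, assuming concrete method names such as `as_float`, `coerce_float`, and `coerce_into_string` follow the `{type}` pattern described above:

```rust
use nu_protocol::{Span, Value};

// Sketch only: the concrete method names are assumed to follow the pattern above.
fn demo(span: Span) {
    let v = Value::int(1, span);
    let _exact = v.as_float();       // no conversion attempted: Err, since the value is an int
    let _coerced = v.coerce_float(); // coercion applies: Ok(1.0), since i64 converts to f64

    let s = Value::string("text", span);
    let _owned = s.coerce_into_string(); // consumes `s` and returns an owned String
}
```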
# User-Facing Changes
Breaking API change for `Value` in `nu-protocol` which is exposed as
part of the plugin API.
# Description
Adds a CLI flag for nushell that disables reading and writing to the
history file. This will be useful for future testing and possibly our
users as well. To borrow `fish` shell's terminology, this allows users
to start nushell in "private" mode.
# User-Facing Changes
Breaking API change for `nu-protocol` (changed `Config`).
# Description
When nushell calls a plugin it now sends a configuration `Value` from
the nushell config under `$env.config.plugins.PLUGIN_SHORT_NAME`. This
allows plugin authors to read configuration provided by plugin users.
The `PLUGIN_SHORT_NAME` must match the registered filename after `nu_plugin_`. If you register `target/debug/nu_plugin_config`, the `PLUGIN_SHORT_NAME` will be `config` and the nushell config will look like:
```nushell
$env.config = {
    # ...
    plugins: {
        config: [
            some
            values
        ]
    }
}
```
Configuration may also use a closure which allows passing values from
`$env` to a plugin:
```nushell
$env.config = {
    # ...
    plugins: {
        config: {||
            $env.some_value
        }
    }
}
```
This is a breaking change for the plugin API, as the `Plugin::run()` function now accepts a new configuration argument, which is an `&Option<Value>`. If no configuration was supplied, the value is `None`.
Plugins compiled after this change should work with older nushell, and
will behave as if the configuration was not set.
Initially discussed in #10867
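As an illustrative sketch of consuming the new argument (the `&Option<Value>` type comes from this PR; the helper function and the `get_data_by_key` lookup are assumptions about one way to read it):

```rust
use nu_protocol::Value;

// Hypothetical helper, not part of the plugin API: pull one key out of the
// `&Option<Value>` configuration that `Plugin::run()` now receives.
fn plugin_setting(config: &Option<Value>, key: &str) -> Option<Value> {
    config
        .as_ref() // `None` means the user set no `$env.config.plugins.<name>` entry
        .and_then(|cfg| cfg.get_data_by_key(key))
}
```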
# User-Facing Changes
* Plugins can read configuration data stored in `$env.config.plugins`
* The plugin `CallInfo` now includes a `config` entry; existing plugins will require updates
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] Update [Creating a plugin (in
Rust)](https://www.nushell.sh/contributor-book/plugins.html#creating-a-plugin-in-rust)
[source](https://github.com/nushell/nushell.github.io/blob/main/contributor-book/plugins.md)
- [ ] Add "Configuration" section to [Plugins
documentation](https://www.nushell.sh/contributor-book/plugins.html)
# Description
Simplifies `SIGQUIT` protection to a single `signal` ignore system call.
# User-Facing Changes
`SIGQUIT` is no longer blocked if nushell is in non-interactive mode
(signals should not be blocked in non-interactive mode).
Also a breaking API change for `nu_protocol`.
# Tests + Formatting
Should come after #11178 for testing.
# Description
Since #11057 was merged, it's hard to tell the difference between `--flag: bool` and `--flag`, which makes it hard for users to read custom commands' signatures and to use them correctly.
After discussion, I think we can deprecate the `--flag: bool` usage and encourage using `--flag` instead.
# User-Facing Changes
The following code will raise a warning message but will not stop running.
```nushell
❯ def florb [--dry-run: bool, --another-flag] { "aaa" }; florb
Error: × Deprecated: --flag: bool
╭─[entry #7:1:1]
1 │ def florb [--dry-run: bool, --another-flag] { "aaa" }; florb
· ──┬─
· ╰── `--flag: bool` is deprecated. Please use `--flag` instead, more info: https://www.nushell.sh/book/custom_commands.html
╰────
aaa
```
cc @kubouch
# Tests + Formatting
Done
# After Submitting
- [ ] Add more information under https://www.nushell.sh/book/custom_commands.html to indicate that `--dry-run: bool` is not allowed
- [ ] remove `: bool` from custom commands between 0.89 and 0.90
---------
Co-authored-by: Antoine Stevan <44101798+amtoine@users.noreply.github.com>
# Description
Replace `.to_string()` used in `GenericError` with `.into()` as
`.into()` seems more popular
Replace `Vec::new()` used in `GenericError` with `vec![]` as `vec![]`
seems more popular
(There are so, so many)
Trying to call `metadata $env` or `metadata $nu` will throw an error:
```nushell
~> metadata $nu
Error: × Built-in variables `$env` and `$nu` have no metadata
╭─[entry #1:1:1]
1 │ metadata $nu
· ─┬─
· ╰── no metadata available
╰────
```
# Description
Clippy fixes for Rust 1.76.0-nightly.
# User-Facing Changes
N/A
# Tests + Formatting
# After Submitting
# Description
Our config exists both as a `Config` struct for internal consumption and
as a `Value`. The latter is exposed through `$env.config` and can be
both set and read.
Thus we have a complex, bug-prone mechanism that reads a `Value` and then tries to plug anything where the value is unrepresentable in `Config` with the correct state from `Config`.
The parsing therefore involves mutating the `Value` inside a nested `Record` structure. Previously this was done wholly by hand, with indices.
To enable deletion, for example, entries had to be iterated over from the back, and indexing was used in a bunch of places. This was hard to read and an invitation for bugs.
With #10876 we can now use `Record::retain_mut` to traverse the records,
modify anything that needs fixing, and drop invalid fields.
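For illustration, a minimal sketch of that pattern, assuming `Record::retain_mut` hands the closure `(column, &mut value)` pairs and keeps entries for which it returns `true`; the field name and fallback value are placeholders:

```rust
use nu_protocol::{Record, Span, Value};

// Sketch of the `retain_mut` pattern described above; `fallback` stands in
// for the internally held `Config` state.
fn fix_and_prune(record: &mut Record, fallback: bool) {
    record.retain_mut(|key, value| match key {
        "enabled" => {
            if value.as_bool().is_err() {
                // Plug an unrepresentable value with the known-good state.
                *value = Value::bool(fallback, Span::unknown());
            }
            true // keep the (possibly repaired) field
        }
        _ => false, // drop fields that `Config` does not know about
    });
}
```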
# Parts:
- Error messages now consistently use the correct spans pointing to the
problematic value and the paths displayed in some messages are also
aligned with the keys used for lookup.
- Reconstruction of values has been fixed for:
- `table.padding`
- `buffer_editor`
- `hooks.command_not_found`
- `datetime_format` (partial solution)
- Fix validation of `table.padding` input so that invalid values are not set (negative values previously underflowed `usize`, causing `table` to run forever)
- New proper types for settings. Fully validated enums instead of
strings:
- `config.edit_mode` -> `EditMode`
- Don't fall back to vi-mode on invalid string
- `config.table.mode` -> `TableMode`
- there is still a fallback to `rounded` if an invalid `TableMode` is given as an argument to the `nu` binary
- `config.completions.algorithm` -> `CompletionAlgorithm`
- `config.error_style` -> `ErrorStyle`
- don't implicitly fall back to `fancy` when given an invalid value.
- This should also shrink the size of `Config` as instead of 4x24 bytes
those fields now need only 4x1 bytes in `Config`
- Completely removed macros relying on the scope of `Value::into_config`
so we can break it up into smaller parts in the future.
- Factored everything into smaller files with the types and helpers for
particular topics.
- `NuCursorShape` now explicitly expresses the `Inherit` setting; conversion to an option only happens at the interface to `reedline`
# Description
Changes the `captures` field in `Closure` from a `HashMap` to a `Vec`
and makes `Stack::captures_to_stack` take an owned `Vec` instead of a
borrowed `HashMap`.
This eliminates the conversion to a `Vec` inside `captures_to_stack` and
makes it possible to avoid clones altogether when using an owned
`Closure` (which is the case for most commands). Additionally, using a
`Vec` reduces the size of `Value` by 8 bytes (down to 72).
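Conceptually (with simplified stand-in types, not the actual `nu-protocol` definitions), the change looks like this:

```rust
use std::collections::HashMap;

// Before: captures stored in a map, forcing a conversion (and often clones)
// inside `captures_to_stack`. nushell pairs VarId with Value, not usize/String.
struct ClosureBefore {
    block_id: usize,
    captures: HashMap<usize, String>,
}

// After: an ordered Vec that can be moved into `captures_to_stack` as-is.
struct ClosureAfter {
    block_id: usize,
    captures: Vec<(usize, String)>,
}
```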
# User-Facing Changes
Breaking API change for `nu-protocol`.
# Description
Reuses the existing `Closure` type in `Value::Closure`. This will help
with the span refactoring for `Value`. Additionally, this allows us to
more easily box or unbox the `Closure` case should we choose to do so in the future.
# User-Facing Changes
Breaking API change for `nu_protocol`.
# Description
Support pattern matching against the `null` literal. Fixes #10799.
### Before
```nushell
> match null { null => "success", _ => "failure" }
failure
```
### After
```nushell
> match null { null => "success", _ => "failure" }
success
```
# User-Facing Changes
Users can pattern match against a `null` literal.
related to
- https://github.com/nushell/nushell/pull/10478
# Description
this PR is the followup removal to
https://github.com/nushell/nushell/pull/10478.
# User-Facing Changes
`$nothing` is now an undefined variable, unless defined by the user.
```nushell
> $nothing
Error: nu::parser::variable_not_found
× Variable not found.
╭─[entry #1:1:1]
1 │ $nothing
· ────┬───
· ╰── variable not found.
╰────
```
# Tests + Formatting
# After Submitting
mention that in release notes
related to
- https://github.com/nushell/nushell/pull/9973
- https://github.com/nushell/nushell/pull/9918
thanks to @jntrnr and their super useful tips on this PR, i learned
about the parser + evaluation, so 🙏
# Description
because we already have `null` as the value of the type `nothing` and as
a followup to the two other attempts of mine, i propose to remove the
redundant `$nothing` built-in variable 😋
this PR is the first step, deprecating `$nothing`.
a followup PR will remove it altogether and wait for 0.87 👍

⚙️ **details**: a new `NOTHING_VARIABLE_ID = 3` has been added,
parsing `$nothing` will create it, finally a `Value::Nothing` will be
produced and a warning will be reported.
this PR already fixes the `toolkit.nu` module so that it does not throw
a bunch of warnings each time 👌
# User-Facing Changes
`$nothing` is now deprecated and will be removed in 0.87
```nushell
> $nothing
Error: × Deprecated variable
╭─[entry #1:1:1]
1 │ $nothing
· ────┬───
· ╰── `$nothing` is deprecated and will be removed in 0.87.
╰────
help: Use `null` instead
```
# Tests + Formatting
tests have been updated, especially
- `nothing_fails_string`
- `nothing_fails_int`
which use a variable called `nil` now to make sure `nothing` does not
support cell paths 👍
# After Submitting
classic deprecation mention 👍
Factor the big parts into separate files:
- `state_delta.rs`
- `state_working_set.rs`
- smaller `usage.rs`
This required adjusting the visibility of several parts.
Makes `StateDelta` transparent for the module.
Trying to reduce visibility in some other places
Elide the reference for `Copy` type (`usize`)
Use the canonical deref where possible.
* `&Box` -> `&`
* `&String` -> `&str`
* `&PathBuf` -> `&Path`
Skips the ctrl-C handler for now.
# Description
This generally makes for nicer APIs, as you are not forced to use an
existing allocation covering the full `String`.
Some exceptions remain where the underlying type requirements favor it.
# User-Facing Changes
None
# Description
This removes pipeline element profiling. This could be a useful feature, but pipeline elements are the most performance-sensitive thing to instrument, as `eval_block` and how pipelines are built is one of the hot loops inside the eval engine.
# User-Facing Changes
Removes pipeline element profiling.
# Tests + Formatting
# After Submitting
# Description
As part of the refactor to split spans off of `Value`, this moves to using helper functions to create values, and to using `.span()` instead of matching the span out of `Value` directly.
Hoping to get a few more helping hands to finish this, as there are a
lot of commands to update :)
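A small sketch of the style this refactor moves commands toward; only the helper constructors and the `.span()` accessor are taken from the description above:

```rust
use nu_protocol::{Span, Value};

fn answer(span: Span) -> Value {
    let value = Value::int(42, span); // helper instead of building the enum variant by hand
    let _ = value.span();             // accessor instead of matching the span out of `Value`
    value
}
```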
# User-Facing Changes
# Tests + Formatting
# After Submitting
---------
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: WindSoilder <windsoilder@outlook.com>
# Description
This doesn't really do much that the user could see, but it helps get us ready to do the steps of the refactor to split the span off of `Value`, so that values can be spanless. This allows us to have top-level values that can hold both a `Value` and a `Span`, without requiring that all values have them.
We expect to see a significant memory reduction by removing so many unnecessary spans from values. For example, a table of 100,000 rows and 5 columns would save roughly 8 MB in spans alone, which are almost always duplicated.
# User-Facing Changes
Nothing yet
# Tests + Formatting
# After Submitting
# Description
This PR creates a new `Record` type to reduce duplicate code and
possibly bugs as well. (This is an edited version of #9648.)
- `Record` implements `FromIterator` and `IntoIterator` and so can be
iterated over or collected into. For example, this helps with
conversions to and from (hash)maps. (Also, no more
`cols.iter().zip(vals)`!)
- `Record` has a `push(col, val)` function to help ensure that the
number of columns is equal to the number of values. I caught a few
potential bugs thanks to this (e.g. in the `ls` command).
- Finally, this PR also adds a `record!` macro that helps simplify
record creation. It is used like so:
```rust
record! {
    "key1" => some_value,
    "key2" => Value::string("text", span),
    "key3" => Value::int(optional_int.unwrap_or(0), span),
    "key4" => Value::bool(config.setting, span),
}
```
Since macros hinder formatting, etc., the right-hand side values should be relatively short and sweet, like the examples above.
Where possible, prefer `record!` or `.collect()` on an iterator instead
of multiple `Record::push`s, since the first two automatically set the
record capacity and do less work overall.
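A hedged sketch of the `.collect()` and `push` styles mentioned above, assuming `Record` collects from `(String, Value)` pairs and that `push` accepts anything `Into<String>`:

```rust
use nu_protocol::{Record, Span, Value};

fn build_records(span: Span) -> (Record, Record) {
    // Preferred: collect, which sets the record capacity up front.
    let collected: Record = ["a", "b", "c"]
        .into_iter()
        .map(|col| (col.to_string(), Value::int(1, span)))
        .collect();

    // Also possible: push column/value pairs one at a time.
    let mut pushed = Record::new();
    pushed.push("answer", Value::int(42, span));

    (collected, pushed)
}
```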
# User-Facing Changes
Besides the changes in `nu-protocol` the only other breaking changes are
to `nu-table::{ExpandedTable::build_map, JustTable::kv_table}`.
# Description
https://github.com/nushell/nushell/pull/9773 introduced constants to modules and allowed them to be exported, but only within one level. This PR:
* allows recursive exporting of constants from all submodules
* fixes submodule imports in a list import pattern
* makes sure exported constants are actual constants
Should unblock https://github.com/nushell/nushell/pull/9678
### Example:
```nushell
module spam {
    export module eggs {
        export module bacon {
            export const viking = 'eats'
        }
    }
}

use spam
print $spam.eggs.bacon.viking # prints 'eats'

use spam [eggs]
print $eggs.bacon.viking # prints 'eats'

use spam eggs bacon viking
print $viking # prints 'eats'
```
### Limitation 1:
Considering the above `spam` module, attempting to get `eggs bacon` from the `spam` module doesn't work directly:
```nushell
use spam [ eggs bacon ] # attempts to load `eggs`, then `bacon`
use spam [ "eggs bacon" ] # obviously a wrong name for a constant, but this doesn't work for commands either
```
Workaround (for example):
```nushell
use spam eggs
use eggs [ bacon ]
print $bacon.viking # prints 'eats'
```
I'm thinking I'll just leave it in, as you can easily work around this.
It is also a limitation of the import pattern in general, not just
constants.
### Limitation 2:
`overlay use` successfully imports the constants, but `overlay hide`
does not hide them, even though it seems to hide normal variables
successfully. This needs more investigation.
# User-Facing Changes
Allows recursive constant exports from submodules.
# Tests + Formatting
# After Submitting
# Description
In the past we named the process of completely removing a command and
providing a basic error message pointing to the new alternative
"deprecation".
But this doesn't match the expectation of most users that have seen
deprecation _warnings_ that alert to either impending removal or
discouraged use after a stability promise.
# User-Facing Changes
Command category changed from `deprecated` to `removed`
# Description
This PR makes three related changes:
* Keeps the originally declared name in help outputs.
* Updates the name of the commands called `main` in the user script to
the name of the script.
* Fixes the source of signature information in multiple places. This
allows scripts to have more complete help output.
Combined, the above allow the user to see the script name in the help
output of scripts, like so:

NOTE: You still declare and call the definition `main`, so from inside
the script `main` is still the correct name. But multiple folks agreed
that seeing `main` in the script help was confusing, so this PR changes
that.
# User-Facing Changes
One potential minor breaking change is that module renames will be shown
as their originally defined name rather than the renamed name. I believe
this to be a better default.
# Tests + Formatting
# After Submitting
# Description
Related: #8248
After this PR, users can define const variables inside a module.

Users can also export const variables; the following screenshot shows how it works (it follows
https://github.com/nushell/nushell/issues/8248#issuecomment-1637442612):

## About the change
1. To make modules support const, we need to change `parse_module_block` to support the `const` keyword.
2. To support exporting `const`, we need to make the module track variables, so we add a `variables` attribute to `Module`.
3. During eval, the const variable may not exist in the `stack`, because we don't evaluate `const` when we define a module, so we need to find variables that are already registered in `engine_state`.
## One more thing to note about the const value.
Consider the following code
```nushell
module foo { const b = 3; export def bar [] { $b } }
use foo bar
const b = 4;
bar
```
The result will be 3 (the value defined in the module) rather than 4. I think this is the expected behavior.
It's something like [dynamic binding](https://www.gnu.org/software/emacs/manual/html_node/elisp/Dynamic-Binding-Tips.html) vs. [lexical binding](https://www.gnu.org/software/emacs/manual/html_node/elisp/Lexical-Binding.html) in Lisp-like languages; lexical binding should be the right behavior, as it generates more predictable results and doesn't introduce really subtle bugs into nushell code.
What if a user wants dynamic binding (for example, so that the example code returns `4`)? There is no way to do this; the user should consider passing the value as an argument to the custom command rather than using a const.
## TODO
- [X] adding tests for the feature.
- [X] support export const out of module to use.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
`Span` is `Copy`, so we probably should not be passing references of
`Span` around. This PR replaces all instances of `&Span` with `Span`,
copying spans where necessary.
# User-Facing Changes
This alters some public functions to take `Span` instead of `&Span` as
input. Namely, `EngineState::get_span_contents`,
`nu_protocol::extract_value`, a bunch of the math commands, and
`Gstat::gstat`.
# Description
This PR removes the compile-time overload system. Unfortunately, this
system never worked correctly because in a gradual type system where
types can be `Any`, you don't have enough information to correctly
resolve function calls with overloads. These resolutions must be done at
runtime, if they're supported.
That said, there's a bit of work that needs to go into resolving input/output types (here, overloads do not execute separate commands; rather, the same command runs, and each overload explains how each output type corresponds to the input types).
This PR also removes the type scope, which would give incorrect answers
in cases where multiple subexpressions were used in a pipeline.
# User-Facing Changes
Finishes removing compile-time overloads. These were only used in a few
places in the code base, but it's possible it may impact user code. I'll
mark this as breaking change so we can review.
# Tests + Formatting
# After Submitting
# Description
We previously simply searched all commands in the working set. As our
deprecated/removed subcommands are documented by stub commands that
don't do anything apart from providing a message, they were still
included.
With this change we check the `Signature.category` to not be
`Category::Deprecated`.
## Note on performance
Making this change will exercise `Command::signature()` more frequently! As the Rust-implemented commands build their signatures with builders here, this will probably cause a number of extra allocations. There is actually no valid reason why the commands should construct a new `Signature` for each call to `Command::signature()`.
This will introduce some overhead to generate the completions for
commands.
# User-Facing Changes
Example: `str <TAB>`

# Description
This PR does a few things to help improve type hovers and, in the
process, fixes a few outstanding issues in the type system. Here's a
list of the changes:
* `for` now will try to infer the type of the iteration variable based
on the expression it's given. This fixes things like `for x in [1, 2, 3]
{ }` where `x` now properly gets the int type.
* Removed the old input/output type fields from the signature, focusing on the vec of signatures. Updated a bunch of dataframe commands that hadn't moved over. This helps tie things together a bit better.
* Fixed inference of types from subexpressions to use the last
expression in the block
* Fixed handling of explicit types in `let` and `mut` calls, so we now
respect that as the authoritative type
I also tried to add `def` input/output type inference, but unfortunately
we only know the predecl types universally, which means we won't have
enough information to properly know what the types of the custom
commands are.
# User-Facing Changes
Script typechecking will get tighter in some cases.
Hovers should be more accurate in some cases that previously resorted to `any`.
# Tests + Formatting
# After Submitting
---------
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
This PR reverts https://github.com/nushell/nushell/pull/9391
We try not to revert PRs like this, though after discussion with the
Nushell team, we decided to revert this one.
The main reason is that Nushell, as a codebase, isn't ready for these
kinds of optimisations. It's in the part of the development cycle where
our main focus should be on improving the algorithms inside of Nushell
itself. Once we have matured our algorithms, then we can look for
opportunities to switch out technologies we're using for alternate
forms.
Much of Nushell still has lots of opportunities for tuning the codebase,
paying down technical debt, and making the codebase generally cleaner
and more robust. This should be the focus. Performance improvements
should flow out of that work.
Said another way, optimisation that isn't part of tuning the codebase is premature at this stage. We need to focus on doing the hard work of making the engine, parser, etc. better.
# User-Facing Changes
Reverts the HashMap -> ahash change.
cc @FilipAndersson245
# Description
see https://github.com/nushell/nushell/issues/9390
using `ahash` instead of the default hasher. This will not affect compile time, as we were already building `ahash`.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Fixes the clippy warnings we're about to get hit with the next time we upgrade Rust.
The big one was shrinking `ShellError` and related types to under 128 bytes.
# User-Facing Changes
Shouldn't notice much difference. In theory, we could see a tiny perf
improvement, but I didn't notice one.
# Tests + Formatting
# After Submitting