# Description
Allows `Stack` to have a modified local `Config`, which is updated
immediately when `$env.config` is assigned to. This means that even
within a script, commands that come after `$env.config` changes will
always see those changes in `Stack::get_config()`.
Also fixed a lot of cases where `engine_state.get_config()` was used
even when `Stack` was available.
Closes #13324.
# User-Facing Changes
- Config changes apply immediately after the assignment is executed,
rather than whenever config is read by a command that needs it.
- Potentially slower performance when executing a lot of lines that
change `$env.config` one after another. It is recommended to load
`$env.config` into a `mut` variable first, make the modifications there,
and then assign it back.
- Much faster performance when executing a script that made
modifications to `$env.config`, as the changes are only parsed once.
# Tests + Formatting
All passing.
# After Submitting
- [ ] release notes
# Description
This PR introduces a new `Signals` struct to replace our ad-hoc passing
around of `ctrlc: Option<Arc<AtomicBool>>`. Doing so has a few benefits:
- We can better enforce when/where resetting or triggering an interrupt
is allowed.
- Consolidates `nu_utils::ctrl_c::was_pressed` and other ad-hoc
re-implementations into a single place: `Signals::check`.
- This allows us to add other types of signals later if we want. E.g.,
exiting or suspension.
- Similarly, we can more easily change the underlying implementation if
we need to in the future.
- Places that used to have a `ctrlc` of `None` now use
`Signals::empty()`, so we can double check these usages for correctness
in the future.
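As a rough sketch of the shape this takes (the layout and method names below are illustrative assumptions, not the actual nushell definition; the real `Signals::check` presumably returns an error when the flag is set, simplified here to a bool):
```rust
use std::sync::{
    atomic::{AtomicBool, Ordering},
    Arc,
};

/// Minimal stand-in for the idea: one cloneable handle around an interrupt flag.
#[derive(Clone)]
pub struct Signals {
    interrupt: Option<Arc<AtomicBool>>,
}

impl Signals {
    /// A handle that can never be interrupted (what `ctrlc: None` used to mean).
    pub fn empty() -> Self {
        Signals { interrupt: None }
    }

    pub fn new(interrupt: Arc<AtomicBool>) -> Self {
        Signals { interrupt: Some(interrupt) }
    }

    /// Single replacement for the scattered `ctrl_c::was_pressed`-style checks.
    pub fn interrupted(&self) -> bool {
        self.interrupt
            .as_ref()
            .map(|flag| flag.load(Ordering::Relaxed))
            .unwrap_or(false)
    }
}
```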
# Description
Fixes #12699
When bare dates or naive times are specified in toml files, `from toml`
returns invalid dates or times.
This PR fixes the problem to correctly handle toml datetime.
The current version of the command returns the default datetime
(`chrono::DateTime::default()`) if the datetime parse fails. However, I
felt that this behavior was a bit unfriendly, so I changed it to return a
`Value::string` instead.
# User-Facing Changes
The command returns a date with default time and timezone if a bare date
is specified.
```
~/Development/nushell> "dob = 2023-05-27" | from toml
╭─────┬────────────╮
│ dob │ a year ago │
╰─────┴────────────╯
~/Development/nushell> "dob = 2023-05-27" | from toml |
Sat, 27 May 2023 00:00:00 +0000 (a year ago)
~/Development/nushell>
```
If a bare time is given, a time string is returned.
```
~/Development/nushell> "tm = 11:00:00" | from toml
╭────┬──────────╮
│ tm │ 11:00:00 │
╰────┴──────────╯
~/Development/nushell> "tm = 11:00:00" | from toml | get tm
11:00:00
~/Development/nushell>
```
# Tests + Formatting
When I ran tests, `commands::touch::change_file_mtime_to_reference`
failed with the following error.
The error also occurs in the master branch, so it's probably unrelated
to these changes.
(maybe a problem with my dev environment)
```
$ ~/Development/nushell> toolkit check pr
~~~~~~~~
test usage_start_uppercase ... ok
test format_conversions::yaml::convert_dict_to_yaml_with_integer_floats_key ... ok
test format_conversions::yaml::convert_dict_to_yaml_with_boolean_key ... ok
test format_conversions::yaml::table_to_yaml_text_and_from_yaml_text_back_into_table ... ok
test quickcheck_parse ... ok
test format_conversions::yaml::convert_dict_to_yaml_with_integer_key ... ok
failures:
---- commands::touch::change_file_mtime_to_reference stdout ----
=== stderr
thread 'commands::touch::change_file_mtime_to_reference' panicked at crates/nu-command/tests/commands/touch.rs:298:9:
assertion `left == right` failed
left: SystemTime { tv_sec: 1720344745, tv_nsec: 862392750 }
right: SystemTime { tv_sec: 1720344745, tv_nsec: 887670417 }
failures:
commands::touch::change_file_mtime_to_reference
test result: FAILED. 1542 passed; 1 failed; 32 ignored; 0 measured; 0 filtered out; finished in 12.04s
error: test failed, to rerun pass `-p nu-command --test main`
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🔴 `toolkit test`
- ⚫ `toolkit test stdlib`
~/Development/nushell> toolkit test stdlib
Compiling nu v0.95.1 (/Users/hiroki/Development/nushell)
Compiling nu-cmd-lang v0.95.1 (/Users/hiroki/Development/nushell/crates/nu-cmd-lang)
Finished dev [unoptimized + debuginfo] target(s) in 6.64s
Running `target/debug/nu --no-config-file -c '
use crates/nu-std/testing.nu
testing run-tests --path crates/nu-std
'`
2024-07-07T19:00:20.423|INF|Running from_jsonl_invalid_object in module test_formats
2024-07-07T19:00:20.436|INF|Running env_log-prefix in module test_logger_env
~~~~~~~~~~~
2024-07-07T19:00:22.196|INF|Running debug_short in module test_basic_commands
~/Development/nushell>
```
# After Submitting
nothing
This reverts commit 0cfd5fbece.
The original PR messed up syntax highlighting of aliases and caused
completion panics in the presence of aliases.
# Description
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Part of https://github.com/nushell/nushell/issues/12963, step 2.
This PR refactors Call and related argument structures to remove their
dependency on `Expression::span` which will be removed in the future.
# User-Facing Changes
Should be none. If you see any error messages that look broken, please
report them.
# Tests + Formatting
# After Submitting
# Description
Provides the ability to use http commands as part of a pipeline.
Additionally, this pull request extends the pipeline metadata to add a
content_type field. The content_type metadata field allows commands such
as `to json` to set the content type in the pipeline, which the http
commands can then use when making requests.
This pull request also introduces the ability to directly stream http
requests from streaming pipelines.
One other small change is that Content-Type will always be set if it is
passed in to the http commands, either indirectly or through the
content-type flag. Previously it was not preserved for requests that were
not of type json or form data.
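A minimal sketch of the idea, with stand-in types (the field name comes from this PR; its exact type and the `application/json` value are assumptions):
```rust
/// Stand-in for the extended pipeline metadata; other existing fields elided.
#[derive(Clone, Debug, Default)]
pub struct PipelineMetadata {
    pub content_type: Option<String>,
}

/// What a serializer like `to json` might attach, so that `http post` can
/// later pick the header up from the pipeline instead of guessing.
pub fn json_metadata() -> PipelineMetadata {
    PipelineMetadata {
        content_type: Some("application/json".to_string()),
    }
}
```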
# User-Facing Changes
* `http post`, `http put`, `http patch`, `http delete` can be used as
part of a pipeline
* `to text`, `to json`, and `from json` all set the content_type metadata
field, and the http commands use it when making requests.
# Description
Fixes #11678
The `from` subcommands (`from csv`, `from tsv`, `from ssv`) name columns
starting from index 1.
This behaviour is inconsistent with other commands such as `detect
columns`.
This PR makes the subcommands 0-based.
# User-Facing Changes
The subcommands (`from {csv, tsv, ssv}`) return a table with the columns
starting at index 0 if no header data is passed.
```
~/Development/nushell> "foo bar baz" | from ssv -n -m 1
╭───┬─────────┬─────────┬─────────╮
│ # │ column0 │ column1 │ column2 │
├───┼─────────┼─────────┼─────────┤
│ 0 │ foo │ bar │ baz │
╰───┴─────────┴─────────┴─────────╯
~/Development/nushell> "foo,bar,baz" | from csv -n
╭───┬─────────┬─────────┬─────────╮
│ # │ column0 │ column1 │ column2 │
├───┼─────────┼─────────┼─────────┤
│ 0 │ foo │ bar │ baz │
╰───┴─────────┴─────────┴─────────╯
~/Development/nushell> "foo\tbar\tbaz" | from tsv -n
╭───┬─────────┬─────────┬─────────╮
│ # │ column0 │ column1 │ column2 │
├───┼─────────┼─────────┼─────────┤
│ 0 │ foo │ bar │ baz │
╰───┴─────────┴─────────┴─────────╯
```
# Tests + Formatting
When I ran tests, `commands::touch::change_file_mtime_to_reference`
failed with the following error.
The error also occurs in the master branch, so it's probably unrelated
to these changes.
(maybe a problem with my dev environment)
```
$ toolkit check pr
~~~~~~~~
failures:
---- commands::touch::change_file_mtime_to_reference stdout ----
=== stderr
thread 'commands::touch::change_file_mtime_to_reference' panicked at crates/nu-command/tests/commands/touch.rs:298:9:
assertion `left == right` failed
left: SystemTime { tv_sec: 1719149697, tv_nsec: 57576929 }
right: SystemTime { tv_sec: 1719149697, tv_nsec: 78219489 }
failures:
commands::touch::change_file_mtime_to_reference
test result: FAILED. 1533 passed; 1 failed; 32 ignored; 0 measured; 0 filtered out; finished in 10.87s
error: test failed, to rerun pass `-p nu-command --test main`
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🔴 `toolkit test`
- ⚫ `toolkit test stdlib`
```
# After Submitting
nothing
# Description
Makes `to toml` use the `toml::value::Datetime` type, so that `to toml`
serializes dates properly.
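One way to do this conversion (a sketch only, not necessarily this PR's exact implementation) is to round-trip through RFC 3339 text, since that is the datetime syntax TOML uses; this assumes the `chrono` and `toml` crates as dependencies:
```rust
use chrono::{DateTime, FixedOffset};

fn to_toml_datetime(dt: &DateTime<FixedOffset>) -> Option<toml::value::Datetime> {
    // `toml::value::Datetime` implements `FromStr` for RFC 3339-style text.
    dt.to_rfc3339().parse().ok()
}

fn main() {
    let dt: DateTime<FixedOffset> = "2024-05-31T00:00:00+00:00".parse().unwrap();
    println!("{}", to_toml_datetime(&dt).unwrap()); // 2024-05-31T00:00:00+00:00
}
```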
# User-Facing Changes
`to toml` will now encode dates differently, in a native format instead
of a string. This could, in theory, break some workflows:
```Nushell
# Before:
~> {datetime: 2024-05-31} | to toml | from toml | get datetime | into datetime
Fri, 31 May 2024 00:00:00 +0000 (10 hours ago)
# After:
~> {datetime: 2024-05-31} | to toml | from toml | get datetime | into datetime
Error: nu:🐚:only_supports_this_input_type
× Input type not supported.
╭─[entry #13:1:36]
1 │ {datetime: 2024-05-31} | to toml | from toml | get datetime | into datetime
· ────┬──── ──────┬──────
· │ ╰── only string and int input data is supported
· ╰── input type: date
╰────
```
Fixes #11751
# Description
Enable the `preserve_order` feature of the `toml` crate to preserve the
ordering of elements when converting from/to toml.
Additionally, use `to_string_pretty()` instead of `to_string()` in `to
toml`. This displays arrays on multiple lines instead of one big single
line. I'm not sure if this one is a good idea or not... Happy to remove
this from this PR if it's not.
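For reference, the difference comes straight from the `toml` crate's two serializers (a small sketch assuming the `toml` crate as a dependency):
```rust
fn main() {
    let value: toml::Value = "a = ['a', 'b', 'c', 'd']".parse().unwrap();
    // Single-line arrays:
    println!("{}", toml::to_string(&value).unwrap());
    // One array element per line:
    println!("{}", toml::to_string_pretty(&value).unwrap());
}
```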
# User-Facing Changes
The order of elements will be different when using `from toml`. The
formatting of arrays will also be different when using `to toml`. For
example:
- before
```
❯ "foo=1\nbar=2\ndoo=3" | from toml
╭─────┬───╮
│ bar │ 2 │
│ doo │ 3 │
│ foo │ 1 │
╰─────┴───╯
❯ {a: [a b c d]} | to toml
a = ["a", "b", "c", "d"]
```
- after
```
❯ "foo=1\nbar=2\ndoo=3" | from toml
╭─────┬───╮
│ foo │ 1 │
│ bar │ 2 │
│ doo │ 3 │
╰─────┴───╯
❯ {a: [a b c d]} | to toml
a = [
"a",
"b",
"c",
"d",
]
```
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🔴 `toolkit test`
- ⚫ `toolkit test stdlib`
# After Submitting
<!-- If your PR had any user-facing changes, update [the
documentation](https://github.com/nushell/nushell.github.io) after the
PR is merged, if necessary. This will help us keep the docs up to date.
-->
# Description
Makes the `from json --objects` command produce a stream, and read
lazily from an input stream to produce its output.
Also added a helper, `PipelineData::get_type()`, to make it easier to
construct a wrong type error message when matching on `PipelineData`. I
expect checking `PipelineData` for either a string value or an `Unknown`
or `String` typed `ByteStream` will be very, very common. I would have
liked to have a helper that just returns a readable stream from either,
but that would either be a bespoke enum or a `Box<dyn BufRead>`, which
feels like it wouldn't be so great for performance. So instead, taking
the approach I did here is probably better - having a function that
accepts the `impl BufRead` and matching to use it.
# User-Facing Changes
- `from json --objects` no longer collects its input, and can be used
for large datasets or streams that produce values over time.
# Tests + Formatting
All passing.
# After Submitting
- [ ] release notes
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
Implements streaming for:
- `from csv`
- `from tsv`
- `to csv`
- `to tsv`
via the new string-typed ByteStream support.
# User-Facing Changes
Commands above. Also:
- `to csv` and `to tsv` now have `--columns <List(String)>`, to provide
the exact columns desired in the output. This is required for them to
have streaming output, because otherwise collecting the entire list is
necessary to determine the output columns. If we introduce
`TableStream`, this may become less necessary.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] release notes
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
This PR allows byte streams to optionally be colored as being
specifically binary or string data, which guarantees that they'll be
converted to `Binary` or `String` appropriately on `into_value()`,
making them compatible with `Type` guarantees. This makes them
significantly more broadly usable for command input and output.
There is still an `Unknown` type for byte streams coming from external
commands, which uses the same behavior as we previously did where it's a
string if it's UTF-8.
A small number of commands were updated to take advantage of this, just
to prove the point. I will be adding more after this merges.
# User-Facing Changes
- New types in `describe`: `string (stream)`, `binary (stream)`
- These commands now return a stream if their input was a stream:
- `into binary`
- `into string`
- `bytes collect`
- `str join`
- `first` (binary)
- `last` (binary)
- `take` (binary)
- `skip` (binary)
- Streams that are explicitly binary colored will print as a streaming
hexdump
- example:
```nushell
1.. | each { into binary } | bytes collect
```
# Tests + Formatting
I've added some tests to cover it at a basic level, and it doesn't break
anything existing, but I do think more would be nice. Some of those will
come when I modify more commands to stream.
# After Submitting
There are a few things I'm not quite satisfied with:
- **String trimming behavior.** We automatically trim newlines from
streams from external commands, but I don't think we should do this with
internal commands. If I call a command that happens to turn my string
into a stream, I don't want the newline to suddenly disappear. I changed
this to specifically do it only on `Child` and `File`, but I don't know
if this is quite right, and maybe we should bring back the old flag for
`trim_end_newline`
- **Known binary always resulting in a hexdump.** It would be nice to
have a `print --raw`, so that we can put binary data on stdout
explicitly if we want to. This PR doesn't change how external commands
work though - they still dump straight to stdout.
Otherwise, here's the normal checklist:
- [ ] release notes
- [ ] docs update for plugin protocol changes (added `type` field)
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
Changes `get_full_help` to take a `&dyn Command` instead of multiple
arguments (`&Signature`, `&Examples`, `is_parser_keyword`). All of these
arguments can be gathered from a `Command`, so there is no need to pass
the pieces to `get_full_help`.
This PR also fixes an issue where the search terms are not shown if
`--help` is used on a command.
# Description
Kind of a vague title, but this PR does two main things:
1. Rather than overriding functions like `Command::is_parser_keyword`,
this PR instead changes commands to override `Command::command_type`.
The `CommandType` returned by `Command::command_type` is then used to
automatically determine whether `Command::is_parser_keyword` and the
other `is_{type}` functions should return true. These changes allow us
to remove the `CommandType::Other` case and should also guarantee that
only one of the `is_{type}` functions on `Command` will return true.
2. Uses the new, reworked `Command::command_type` function in the `scope
commands` and `which` commands.
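A simplified stand-in for the pattern in point 1 (variant and trait names here are illustrative, not the exact nushell definitions): each command reports a single `CommandType`, and the boolean helpers are derived from it, so at most one of them can be true.
```rust
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub enum CommandType {
    Builtin,
    Custom,
    Keyword,
    External,
    Plugin,
}

pub trait Command {
    fn command_type(&self) -> CommandType {
        CommandType::Builtin
    }

    // Derived from `command_type` rather than overridden per command:
    fn is_parser_keyword(&self) -> bool {
        self.command_type() == CommandType::Keyword
    }

    fn is_plugin(&self) -> bool {
        self.command_type() == CommandType::Plugin
    }
}
```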
# User-Facing Changes
- Breaking change for `scope commands`: multiple columns (`is_builtin`,
`is_keyword`, `is_plugin`, etc.) have been merged into the `type`
column.
- Breaking change: the `which` command can now report `plugin` or
`keyword` instead of `built-in` in the `type` column. It may also now
report `external` instead of `custom` in the `type` column for known
`extern`s.
# Description
This PR introduces a `ByteStream` type which is a `Read`-able stream of
bytes. Internally, it has an enum over three different byte stream
sources:
```rust
pub enum ByteStreamSource {
Read(Box<dyn Read + Send + 'static>),
File(File),
Child(ChildProcess),
}
```
This is in comparison to the current `RawStream` type, which is an
`Iterator<Item = Vec<u8>>` and has to allocate for each read chunk.
Currently, `PipelineData::ExternalStream` serves a weird dual role where
it is either external command output or a wrapper around `RawStream`.
`ByteStream` makes this distinction more clear (via `ByteStreamSource`)
and replaces `PipelineData::ExternalStream` in this PR:
```rust
pub enum PipelineData {
Empty,
Value(Value, Option<PipelineMetadata>),
ListStream(ListStream, Option<PipelineMetadata>),
ByteStream(ByteStream, Option<PipelineMetadata>),
}
```
The PR is relatively large, but a decent amount of it is just repetitive
changes.
This PR fixes #7017, fixes #10763, and fixes #12369.
This PR also improves performance when piping external commands. Nushell
should, in most cases, have competitive pipeline throughput compared to,
e.g., bash.
| Command                                         | Before (MB/s) | After (MB/s) | Bash (MB/s) |
| ----------------------------------------------- | -------------:| ------------:| -----------:|
| `throughput \| rg 'x'`                          | 3059          | 3744         | 3739        |
| `throughput \| nu --testbin relay o> /dev/null` | 3508          | 8087         | 8136        |
# User-Facing Changes
- This is a breaking change for the plugin communication protocol,
because the `ExternalStreamInfo` was replaced with `ByteStreamInfo`.
Plugins now only have to deal with a single input stream, as opposed to
the previous three streams: stdout, stderr, and exit code.
- The output of `describe` has been changed for external/byte streams.
- Temporary breaking change: `bytes starts-with` no longer works with
byte streams. This is to keep the PR smaller, and `bytes ends-with`
already does not work on byte streams.
- If a process core dumped, then instead of having a `Value::Error` in
the `exit_code` column of the output returned from `complete`, it now is
a `Value::Int` with the negation of the signal number.
# After Submitting
- Update docs and book as necessary
- Release notes (e.g., plugin protocol changes)
- Adapt/convert commands to work with byte streams (high priority is
`str length`, `bytes starts-with`, and maybe `bytes ends-with`).
- Refactor the `tee` code, Devyn has already done some work on this.
---------
Co-authored-by: Devyn Cairns <devyn.cairns@gmail.com>
# Description
Does some misc changes to `ListStream`:
- Moves it into its own module/file separate from `RawStream`.
- `ListStream`s now have an associated `Span`.
- This required changes to `ListStreamInfo` in `nu-plugin`. Not sure if
this is a breaking change for the plugin protocol.
- Hides the internals of `ListStream` but also adds a few more methods.
- This includes two functions to more easily alter a stream (these take
a `ListStream` and return a `ListStream` instead of having to go through
the whole `into_pipeline_data(..)` route).
- `map`: takes a `FnMut(Value) -> Value`
- `modify`: takes a function to modify the inner stream.
# Description
Judiciously try to avoid allocations/clones by changing the signatures of
functions:
- **Don't pass str by value unnecessarily if only read**
- **Don't require a vec in `Sandbox::with_files`**
- **Remove unnecessary string clone**
- **Fixup unnecessary borrow**
- **Use `&str` in shape color instead**
- **Vec -> Slice**
- **Elide string clone**
- **Elide `Path` clone**
- **Take &str to elide clone in tests**
# User-Facing Changes
None
# Tests + Formatting
This touches many tests, purely by changing from owned to borrowed/static
data.
# Description
Removes lazy records from the language, following from the reasons
outlined in #12622. Namely, this should make semantics more clear and
will eliminate concerns regarding maintainability.
# User-Facing Changes
- Breaking change: `lazy make` is removed.
- Breaking change: `describe --collect-lazyrecords` flag is removed.
- `sys` and `debug info` now return regular records.
# After Submitting
- Update nushell book if necessary.
- Explore new `sys` and `debug info` APIs to prevent them from taking
too long (e.g., subcommands or taking an optional column/cell-path
argument).
# Description
I thought about bringing `nu_plugin_msgpack` in, but that is MPL with a
clause that prevents other licenses, so rather than adapt that code I
decided to take a crack at just doing it straight from `rmp` to `Value`
without any `rmpv` in the middle. It seems like it's probably faster,
though I can't say for sure how much with the plugin overhead.
@IanManske I started on a `Read` implementation for `RawStream` but just
specialized to `from msgpack` here, but I'm thinking after release maybe
we can polish it up and make it a real one. It works!
# User-Facing Changes
New commands:
- `from msgpack`
- `from msgpackz`
- `to msgpack`
- `to msgpackz`
# Tests + Formatting
Pretty thorough tests added for the format deserialization, with a
roundtrip for serialization. Some example tests too for both `from
msgpack` and `to msgpack`.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] update release notes
# Description
Continuing from #12568, this PR further reduces the size of `Expr` from
64 to 40 bytes. It also reduces `Expression` from 128 to 96 bytes and
`Type` from 32 to 24 bytes.
This was accomplished by:
- for `Expr` with multiple fields (e.g., `Expr::Thing(A, B, C)`),
merging the fields into new AST struct types and then boxing this struct
(e.g. `Expr::Thing(Box<ABC>)`).
- replacing `Vec<T>` with `Box<[T]>` in multiple places. `Expr`s and
`Expression`s should rarely be mutated, if at all, so this optimization
makes sense.
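To make the first point concrete, here is a self-contained toy demonstration of the boxing trick (toy types, not nushell's actual AST; sizes are for a typical 64-bit target): a large variant payload inflates every value of the enum, while boxing that payload keeps the enum as small as a pointer plus a tag.
```rust
struct MatchBlock {
    subject: [u64; 4],
    arms: Vec<(u64, u64)>,
}

enum ExprInline {
    Int(i64),
    Match(MatchBlock), // payload stored inline in every `ExprInline`
}

enum ExprBoxed {
    Int(i64),
    Match(Box<MatchBlock>), // payload behind a single pointer
}

fn main() {
    println!("inline: {} bytes", std::mem::size_of::<ExprInline>()); // 64
    println!("boxed:  {} bytes", std::mem::size_of::<ExprBoxed>());  // 16
}
```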
By reducing the size of these types, I didn't notice a large performance
improvement (at least compared to #12568). But this PR does reduce the
memory usage of nushell. My config is somewhat light so I only noticed a
difference of 1.4MiB (38.9MiB vs 37.5MiB).
---------
Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
# Description
`Value` describes the types of first-class values that users and scripts
can create, manipulate, pass around, and store. However, `Block`s are
not first-class values in the language, so this PR removes it from
`Value`. This removes some unnecessary code, and this change should be
invisible to the user except for the change to `scope modules` described
below.
# User-Facing Changes
Breaking change: the output of `scope modules` was changed so that
`env_block` is now `has_env_block` which is a boolean value instead of a
`Block`.
# After Submitting
Update the language guide possibly.
# Description
in order to change the style of the _serialized_ NUON data,
`nuon::to_nuon` takes three mutually exclusive arguments, `raw: bool`,
`tabs: Option<usize>` and `indent: Option<usize>` 🤔
this begs for an enumeration of all the possible alternatives, right?
this PR changes the signature of `nuon::to_nuon` to use `nuon::ToStyle`
which has three variants
- `Raw`: no newlines
- `Tabs(n: usize)`: newlines and `n` tabs as indent
- `Spaces(n: usize)`: newlines and `n` spaces as indent
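in other words, something along these lines (a sketch; the real `nuon::ToStyle` may differ in details such as derives or docs):
```rust
pub enum ToStyle {
    /// No newlines.
    Raw,
    /// Newlines, with `n` tabs as indentation.
    Tabs(usize),
    /// Newlines, with `n` spaces as indentation.
    Spaces(usize),
}
```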
# User-Facing Changes
the signature of `nuon::to_nuon` changes from
```rust
to_nuon(
input: &Value,
raw: bool,
tabs: Option<usize>,
indent: Option<usize>,
span: Option<Span>,
) -> Result<String, ShellError>
```
to
```rust
to_nuon(
input: &Value,
style: ToStyle,
span: Option<Span>
) -> Result<String, ShellError>
```
# Tests + Formatting
# After Submitting
# Description
playing with the NUON format in Rust code in some plugins, we agreed
with the team it was a great time to create a standalone NUON crate to
allow Rust devs to use this Nushell file format.
> **Note**
> this PR almost copy-pastes the code from
`nu_commands/src/formats/from/nuon.rs` and
`nu_commands/src/formats/to/nuon.rs` to `nuon/src/from.rs` and
`nuon/src/to.rs`, with minor tweaks to make them standalone functions,
e.g. remove the rest of the command implementations
### TODO
- [x] add tests
- [x] add documentation
# User-Facing Changes
devs will have access to a new crate, `nuon`, and two functions,
`from_nuon` and `to_nuon`
```rust
from_nuon(
input: &str,
span: Option<Span>,
) -> Result<Value, ShellError>
```
```rust
to_nuon(
input: &Value,
raw: bool,
tabs: Option<usize>,
indent: Option<usize>,
span: Option<Span>,
) -> Result<String, ShellError>
```
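a quick usage sketch based on the signatures above (assuming the `nuon` crate is added as a dependency):
```rust
use nuon::{from_nuon, to_nuon};

fn main() {
    let value = from_nuon("{ a: 1, b: [1, 2, 3] }", None).unwrap();
    // raw = false, no tabs, 4-space indent, no span
    let text = to_nuon(&value, false, None, Some(4), None).unwrap();
    println!("{text}");
}
```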
# Tests + Formatting
i've basically taken all the tests from
`crates/nu-command/tests/format_conversions/nuon.rs` and converted them
to use `from_nuon` and `to_nuon` instead of Nushell commands
- i've created a `nuon_end_to_end` to run both conversions with an
optional middle value to check that all is fine
> **Note**
> the `nuon::tests::read_code_should_fail_rather_than_panic` test does
give different results locally and in the CI...
> i've left it ignored with comments to help future us :)
# After Submitting
mention that in the release notes for sure!!
# Description
This PR adds a `ListItem` enum to our set of AST types. It encodes the
two possible expressions inside of a list expression: a singular item or
a spread. This is similar to the existing `RecordItem` enum. Adding
`ListItem` allows us to remove the existing `Expr::Spread` case which
was previously used for list spreads. As a consequence, this guarantees
(via the type system) that spreads can only ever occur inside lists,
records, or as command args.
This PR also does a little bit of cleanup in relevant parser code.
# Description
This adds a `SharedCow` type as a transparent copy-on-write pointer that
clones to unique on mutate.
As an initial test, the `Record` within `Value::Record` is shared.
There are some pretty big wins for performance. I'll post benchmark
results in a comment. The biggest winner is nested access, as that would
have cloned the records for each cell path follow before and it doesn't
have to anymore.
The reusability of the `SharedCow` type is nice and I think it could be
used to clean up the previous work I did with `Arc` in `EngineState`.
It's meant to be a mostly transparent clone-on-write that just clones on
`.to_mut()` or `.into_owned()` if there are actually multiple
references, but avoids cloning if the reference is unique.
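The core of the idea fits in a few lines on top of `Arc::make_mut` (a simplified sketch; the real `SharedCow` is more involved than this):
```rust
use std::sync::Arc;

#[derive(Clone)]
struct SharedCow<T: Clone>(Arc<T>);

impl<T: Clone> SharedCow<T> {
    fn new(value: T) -> Self {
        SharedCow(Arc::new(value))
    }

    /// Cheap if this is the only reference; clones the inner value otherwise.
    fn to_mut(&mut self) -> &mut T {
        Arc::make_mut(&mut self.0)
    }

    fn into_owned(self) -> T {
        Arc::try_unwrap(self.0).unwrap_or_else(|arc| (*arc).clone())
    }
}

fn main() {
    let record = SharedCow::new(vec![("a", 1), ("b", 2)]);
    let mut copy = record.clone(); // refcount bump only, no deep clone
    copy.to_mut().push(("c", 3));  // the vec is actually cloned here, once
    assert_eq!(record.into_owned().len(), 2);
    assert_eq!(copy.into_owned().len(), 3);
}
```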
# User-Facing Changes
- `Value::Record` field is a different type (plugin authors)
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] use for `EngineState`
- [ ] use for `Value::List`
# Description
Currently, `Range` is a struct with a `from`, `to`, and `incr` field,
which are all type `Value`. This PR changes `Range` to be an enum over
`IntRange` and `FloatRange` for better type safety / stronger compile
time guarantees.
Fixes: #11778 Fixes: #11777 Fixes: #11776 Fixes: #11775 Fixes: #11774 Fixes: #11773 Fixes: #11769.
# User-Facing Changes
Hopefully none, besides bug fixes.
Although, the `serde` representation might have changed.
# Description
Again avoid uses of the `Record` internals, so we are free to change the
data layout
- **Don't use internals of `Record` in `into sqlite`**
- **Don't use internals of `Record` in `to xml`**
Remaining: `rename`
# User-Facing Changes
None
# Description
The second `Value` is redundant and will consume five extra bytes on
each transmission of a custom value to/from a plugin.
# User-Facing Changes
This is a breaking change to the plugin protocol.
The [example in the protocol
reference](https://www.nushell.sh/contributor-book/plugin_protocol_reference.html#value)
becomes
```json
{
"Custom": {
"val": {
"type": "PluginCustomValue",
"name": "database",
"data": [36, 190, 127, 40, 12, 3, 46, 83],
"notify_on_drop": true
},
"span": {
"start": 320,
"end": 340
}
}
}
```
instead of
```json
{
"CustomValue": {
...
}
}
```
# After Submitting
Update plugin protocol reference
# Description
When implementing a `Command`, one must also import all the types
present in the function signatures for `Command`. This makes it so that
we often import the same set of types in each command implementation
file. E.g., something like this:
```rust
use nu_protocol::ast::Call;
use nu_protocol::engine::{Command, EngineState, Stack};
use nu_protocol::{
record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, PipelineData,
ShellError, Signature, Span, Type, Value,
};
```
This PR adds the `nu_engine::command_prelude` module which contains the
necessary and commonly used types to implement a `Command`:
```rust
// command_prelude.rs
pub use crate::CallExt;
pub use nu_protocol::{
ast::{Call, CellPath},
engine::{Command, EngineState, Stack},
record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, IntoSpanned,
PipelineData, Record, ShellError, Signature, Span, Spanned, SyntaxShape, Type, Value,
};
```
This should reduce the boilerplate needed to implement a command and
also gives us a place to track the breadth of the `Command` API. I tried
to be conservative with what went into the prelude modules, since it
might be hard/annoying to remove items from the prelude in the future.
Let me know if something should be included or excluded.
# Description
Boxes `Record` inside `Value` to reduce memory usage, `Value` goes from
`72` -> `56` bytes after this change.
# User-Facing Changes
# Tests + Formatting
# After Submitting
Closes #12115
# Description
This fix addresses a bug where the `--tabs` flag couldn't be used due to
improper handling of the tab quantity provided by the user. Previously,
the code mistakenly attempted to convert the tab quantity to a boolean
value, leading to a conversion error. The fix adjusts the condition
clauses to properly validate the presence of the flag's value: the code
now checks whether `get_flag()` returns a value or `None` for the
`--tabs` flag. This allows `--tabs` to function correctly, triggering
the appropriate condition and letting the conversion proceed as
expected. The same fix applies to the `--indent` flag. Additionally, a
default case was added, so the conversion now works properly without
flags. Two tests were added to validate the corrected behavior of these
flags.
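The corrected logic, reduced to a stand-in function (the real command reads these values via `get_flag()`; the default branch here is illustrative, not necessarily the command's actual default):
```rust
/// Decide the indentation from the parsed flag values instead of coercing
/// them to booleans, which is what caused the original conversion error.
fn pick_indent(tabs: Option<usize>, indent: Option<usize>) -> String {
    match (tabs, indent) {
        (Some(n), _) => "\t".repeat(n),   // --tabs <n>
        (None, Some(n)) => " ".repeat(n), // --indent <n>
        (None, None) => String::new(),    // no flag given (stand-in default)
    }
}

fn main() {
    assert_eq!(pick_indent(Some(2), None), "\t\t");
    assert_eq!(pick_indent(None, Some(4)), "    ");
    assert_eq!(pick_indent(None, None), "");
}
```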
# User-Facing Changes
Now the conversion should work properly instead of displaying an error.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
To run added tests:
- `cargo test --package nu-command --test main -- format_conversions::json::test_tabs_indent_flag`
- `cargo test --package nu-command --test main -- format_conversions::json::test_indent_flag`
# Description
This makes many of the larger objects in `EngineState` into `Arc`, and
uses `Arc::make_mut` to do clone-on-write if the reference is not
unique. This is generally very cheap, giving us the best of both worlds
- allowing us to mutate without cloning if we have an exclusive
reference, and cloning if we don't.
This started as more of a curiosity for me after remembering that
`Arc::make_mut` exists and can make using `Arc` for mostly immutable
data that sometimes needs to be changed very convenient, and also after
hearing someone complain about memory usage on Discord - this is a
somewhat significant win for that.
The exact objects that were wrapped in `Arc`:
- `files`, `file_contents` - the strings and byte buffers
- `decls` - the whole `Vec`, but mostly to avoid lots of individual
`malloc()` calls on Clone rather than for memory usage
- `blocks` - the blocks themselves, rather than the outer Vec
- `modules` - the modules themselves, rather than the outer Vec
- `env_vars`, `previous_env_vars` - the entire maps
- `config`
The changes required were relatively minimal, but this is a breaking API
change. In particular, blocks are added as Arcs, to allow the parser
cache functionality to work.
With my normal nu config, running on Linux, this saves me about 15 MiB
of process memory usage when running interactively (65 MiB → 50 MiB).
This also makes quick command executions cheaper, particularly since
every REPL loop now involves a clone of the engine state so that we can
recover from a panic. It also reduces memory usage where engine state
needs to be cloned and sent to another thread or kept within an
iterator.
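The std mechanism this leans on, shown on a toy state (illustrative field names, not the real `EngineState`):
```rust
use std::sync::Arc;

#[derive(Clone)]
struct State {
    env_vars: Arc<Vec<(String, String)>>,
}

fn main() {
    let base = State {
        env_vars: Arc::new(vec![("PATH".to_string(), "/usr/bin".to_string())]),
    };

    // Cloning the state only bumps the refcount; nothing is deep-copied.
    let mut working = base.clone();

    // Mutation clones the underlying Vec only because `base` still shares it;
    // with a unique reference this would mutate in place for free.
    Arc::make_mut(&mut working.env_vars).push(("FOO".to_string(), "bar".to_string()));

    assert_eq!(base.env_vars.len(), 1);
    assert_eq!(working.env_vars.len(), 2);
}
```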
# User-Facing Changes
Shouldn't be any, since it's all internal stuff, but it does change some
public interfaces so it's a breaking change
[Context on
Discord](https://discord.com/channels/601130461678272522/855947301380947968/1219425984990806207)
# Description
- Rename `CustomValue::value_string()` to `type_name()` to reflect its
usage better.
- Change print behavior to always call `to_base_value()` first, to give
the custom value better control over the output.
- Change `describe --detailed` to show the type name as the subtype,
rather than trying to describe the base value.
- Change custom `Type` to use `type_name()` rather than `typetag_name()`
to make things like `PluginCustomValue` more transparent
One question: should `describe --detailed` still include a description
of the base value somewhere? I'm torn on it, it seems possibly useful
for some things (maybe sqlite databases?), but having `describe -d` not
include the custom type name anywhere felt weird. Another option would
be to add another method to `CustomValue` for info to be displayed in
`describe`, so that it can be more type-specific?
# User-Facing Changes
Everything above has implications for printing and `describe` on custom
values
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# Description
The PR overhauls how IO redirection is handled, allowing more explicit
and fine-grain control over `stdout` and `stderr` output as well as more
efficient IO and piping.
To summarize the changes in this PR:
- Added a new `IoStream` type to indicate the intended destination for a
pipeline element's `stdout` and `stderr`.
- The `stdout` and `stderr` `IoStream`s are stored in the `Stack` to
avoid adding 6 additional arguments to every eval function and
`Command::run`. The `stdout` and `stderr` streams can be temporarily
overwritten through functions on `Stack` and these functions will return
a guard that restores the original `stdout` and `stderr` when dropped.
- In the AST, redirections are now directly part of a `PipelineElement`
as a `Option<Redirection>` field instead of having multiple different
`PipelineElement` enum variants for each kind of redirection. This
required changes to the parser, mainly in `lite_parser.rs`.
- `Command`s can also set a `IoStream` override/redirection which will
apply to the previous command in the pipeline. This is used, for
example, in `ignore` to allow the previous external command to have its
stdout redirected to `Stdio::null()` at spawn time. In contrast, the
current implementation has to create an os pipe and manually consume the
output on nushell's side. File and pipe redirections (`o>`, `e>`, `e>|`,
etc.) have precedence over overrides from commands.
This PR improves piping and IO speed, partially addressing #10763. Using
the `throughput` command from that issue, this PR gives the following
speedup on my setup for the commands below:
| Command                    | Before (MB/s) | After (MB/s) | Bash (MB/s) |
| -------------------------- | -------------:| ------------:| -----------:|
| `throughput o> /dev/null`  | 1169          | 52938        | 54305       |
| `throughput \| ignore`     | 840           | 55438        | N/A         |
| `throughput \| null`       | Error         | 53617        | N/A         |
| `throughput \| rg 'x'`     | 1165          | 3049         | 3736        |
| `(throughput) \| rg 'x'`   | 810           | 3085         | 3815        |
(Numbers above are the median samples for throughput)
This PR also paves the way to refactor our `ExternalStream` handling in
the various commands. For example, this PR already fixes the following
code:
```nushell
^sh -c 'echo -n "hello "; sleep 0; echo "world"' | find "hello world"
```
This returns an empty list on 0.90.1 and returns a highlighted "hello
world" on this PR.
Since the `stdout` and `stderr` `IoStream`s are available to commands
when they are run, then this unlocks the potential for more convenient
behavior. E.g., the `find` command can disable its ansi highlighting if
it detects that the output `IoStream` is not the terminal. Knowing the
output streams will also allow background job output to be redirected
more easily and efficiently.
# User-Facing Changes
- External commands returned from closures will be collected (in most
cases):
```nushell
1..2 | each {|_| nu -c "print a" }
```
This gives `["a", "a"]` on this PR, whereas this used to print "a\na\n"
and then return an empty list.
```nushell
1..2 | each {|_| nu -c "print -e a" }
```
This gives `["", ""]` and prints "a\na\n" to stderr, whereas this used
to return an empty list and print "a\na\n" to stderr.
- Trailing new lines are always trimmed for external commands when
piping into internal commands or collecting it as a value. (Failure to
decode the output as utf-8 will keep the trailing newline for the last
binary value.) In the current nushell version, the following three code
snippets differ only in parenthesis placement, but they all also have
different outputs:
1. `1..2 | each { ^echo a }`
```
a
a
╭────────────╮
│ empty list │
╰────────────╯
```
2. `1..2 | each { (^echo a) }`
```
╭───┬───╮
│ 0 │ a │
│ 1 │ a │
╰───┴───╯
```
3. `1..2 | (each { ^echo a })`
```
╭───┬───╮
│ 0 │ a │
│ │ │
│ 1 │ a │
│ │ │
╰───┴───╯
```
But in this PR, the above snippets will all have the same output:
```
╭───┬───╮
│ 0 │ a │
│ 1 │ a │
╰───┴───╯
```
- All existing flags on `run-external` are now deprecated.
- File redirections now apply to all commands inside a code block:
```nushell
(nu -c "print -e a"; nu -c "print -e b") e> test.out
```
This gives "a\nb\n" in `test.out` and prints nothing. The same result
would happen when printing to stdout and using a `o>` file redirection.
- External command output will (almost) never be ignored, and ignoring
output must be explicit now:
```nushell
(^echo a; ^echo b)
```
This prints "a\nb\n", whereas this used to print only "b\n". This only
applies to external commands; values and internal commands not in return
position will not print anything (e.g., `(echo a; echo b)` still only
prints "b").
- `complete` now always captures stderr (`do` is not necessary).
# After Submitting
The language guide and other documentation will need to be updated.
- Fixes #11997
# Description
Fixes the issue that comments are not ignored in SSV formatted data.

# User-Facing Changes
If there is a comment at the beginning of SSV-formatted data, it is now
not included in the SSV table.
# Tests + Formatting
The PR adds one test in the ssv.rs file. All previous test cases are
still passing. Clippy and fmt have been run.
Fixes #12006
# Description
Process empty headers as well in the `to md` command.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
This PR adds date support when using the `open` command on an xlsx file,
and when using `from xlsx` on an xlsx file.
# User-Facing Changes
Currently, dates in xlsx files are read as nulls; after this PR they are
read as regular dates.
# Tests + Formatting
# After Submitting
# Description
This is a follow up to
https://github.com/nushell/nushell/pull/11621#issuecomment-1937484322
Also Fixes: #11838
## About the code change
It applies the same logic we use when passing variables to external commands:
0487e9ffcb/crates/nu-command/src/system/run_external.rs (L162-L170)
That is: if the user inputs something dynamic (like a variable,
sub-expression, or string interpolation), it returns a quoted `NuPath`,
so the user input won't be globbed.
# User-Facing Changes
Given two input files: `a*c.txt`, `abc.txt`
* `let f = "a*c.txt"; rm $f` will remove one file: `a*c.txt`.
~* `let f = "a*c.txt"; rm --glob $f` will remove `a*c.txt` and
`abc.txt`~
* `let f: glob = "a*c.txt"; rm $f` will remove `a*c.txt` and `abc.txt`
## Rules about globbing with *variable*
Given two files: `a*c.txt`, `abc.txt`
| Cmd Type | example | Result |
| -------- | ------- | ------ |
| builtin | let f = "a*c.txt"; rm $f | remove `a*c.txt` |
| builtin | let f: glob = "a*c.txt"; rm $f | remove `a*c.txt` and `abc.txt` |
| builtin | let f = "a*c.txt"; rm ($f \| into glob) | remove `a*c.txt` and `abc.txt` |
| custom | def crm [f: glob] { rm $f }; let f = "a*c.txt"; crm $f | remove `a*c.txt` and `abc.txt` |
| custom | def crm [f: glob] { rm ($f \| into string) }; let f = "a*c.txt"; crm $f | remove `a*c.txt` |
| custom | def crm [f: string] { rm $f }; let f = "a*c.txt"; crm $f | remove `a*c.txt` |
| custom | def crm [f: string] { rm $f }; let f = "a*c.txt"; crm ($f \| into glob) | remove `a*c.txt` and `abc.txt` |
In general, if a variable is annotated with the `glob` type, nushell will
expand the glob pattern. Otherwise, we need to use `into glob` to expand
the glob pattern.
# Tests + Formatting
Done
# After Submitting
I think the `str glob-escape` command will no longer be required. We can
remove it.
# Description
This PR renames the conversion functions on `Value` to be more consistent.
It follows the Rust [API guidelines](https://rust-lang.github.io/api-guidelines/naming.html#ad-hoc-conversions-follow-as_-to_-into_-conventions-c-conv) for ad-hoc conversions.
The conversion functions on `Value` now come in a few forms:
- `coerce_{type}` takes a `&Value` and attempts to convert the value to
`type` (e.g., `i64` are converted to `f64`). This is the old behavior of
some of the `as_{type}` functions -- these functions have simply been
renamed to better reflect what they do.
- The new `as_{type}` functions take a `&Value` and returns an `Ok`
result only if the value is of `type` (no conversion is attempted). The
returned value will be borrowed if `type` is non-`Copy`, otherwise an
owned value is returned.
- `into_{type}` exists for non-`Copy` types, but otherwise does not
attempt conversion, just like `as_{type}`. It takes an owned `Value` and
always returns an owned result.
- `coerce_into_{type}` has the same relationship with `coerce_{type}` as
`into_{type}` does with `as_{type}`.
- `to_{kind}_string`: conversion to different string formats (debug,
abbreviated, etc.). Only two of the old string conversion functions were
removed, the rest have been renamed only.
- `to_{type}`: other conversion functions. Currently, only `to_path`
exists. (And `to_string` through `Display`.)
This table summaries the above:
| Form                 | Cost      | Input Ownership | Output Ownership | Converts `Value` case/`type` |
| -------------------- | --------- | --------------- | ---------------- | ---------------------------- |
| `as_{type}`          | Cheap     | Borrowed        | Borrowed/Owned   | No  |
| `into_{type}`        | Cheap     | Owned           | Owned            | No  |
| `coerce_{type}`      | Cheap     | Borrowed        | Borrowed/Owned   | Yes |
| `coerce_into_{type}` | Cheap     | Owned           | Owned            | Yes |
| `to_{kind}_string`   | Expensive | Borrowed        | Owned            | Yes |
| `to_{type}`          | Expensive | Borrowed        | Owned            | Yes |
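The convention is easiest to see on a toy two-variant value (a stand-in sketch, not the real `Value` or its error type):
```rust
#[derive(Clone, Debug)]
enum Value {
    Int(i64),
    String(String),
}

impl Value {
    /// `as_*`: borrow, no conversion attempted.
    fn as_str(&self) -> Result<&str, String> {
        match self {
            Value::String(s) => Ok(s),
            other => Err(format!("expected string, got {other:?}")),
        }
    }

    /// `into_*`: consume, no conversion attempted.
    fn into_string(self) -> Result<String, String> {
        match self {
            Value::String(s) => Ok(s),
            other => Err(format!("expected string, got {other:?}")),
        }
    }

    /// `coerce_*`: conversion is attempted (here, int -> string).
    fn coerce_string(&self) -> Result<String, String> {
        match self {
            Value::String(s) => Ok(s.clone()),
            Value::Int(i) => Ok(i.to_string()),
        }
    }
}

fn main() {
    let v = Value::Int(3);
    assert!(v.as_str().is_err());                // no conversion
    assert_eq!(v.coerce_string().unwrap(), "3"); // converted
    assert_eq!(Value::String("x".into()).into_string().unwrap(), "x");
}
```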
# User-Facing Changes
Breaking API change for `Value` in `nu-protocol` which is exposed as
part of the plugin API.
With this PR I try to resolve #11751.
# Description
I am rather new to Rust, so if anything is not the way it should be
please let me know.
As described in the title, I just fixed the date conversion in the `to
toml` and `from toml` commands, as I thought it would be a good first
issue. The example from the original issue will now work as follows:
```
~> {date: 2024-02-02} | to toml
date = "2024-02-02T00:00:00+00:00"
~> "dob = 1979-05-27T07:32:00-08:00" | from toml
╭─────┬───────────────────────────╮
│ dob │ 44 years ago │
╰─────┴───────────────────────────╯
```
The `from toml` command now returns a nushell date which is displayed as
`44 years ago` in this case.
# User-Facing Changes
none
# Tests + Formatting
all tests pass and formatting has been applied
---------
Co-authored-by: dannou812 <dannou281@gmail.com>
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
# Description
Close: #9673 Close: #8277 Close: #10944
This pr introduces the following syntax:
1. `e>|`, pipe stderr to next command. Example: `$env.FOO=bar nu
--testbin echo_env_stderr FOO e>| str length`
2. `o+e>|` and `e+o>|`, pipe both stdout and stderr to next command,
example: `$env.FOO=bar nu --testbin echo_env_mixed out-err FOO FOO e+o>|
str length`
Note: it only works for external commands. ~There is no difference for
internal commands, that is, the following three commands do the same
things:~ Edit: it raises errors if we try to use these pipes with
internal commands:
```
❯ ls e>| str length
Error: × `e>|` only works with external streams
╭─[entry #1:1:1]
1 │ ls e>| str length
· ─┬─
· ╰── `e>|` only works on external streams
╰────
❯ ls e+o>| str length
Error: × `o+e>|` only works with external streams
╭─[entry #2:1:1]
1 │ ls e+o>| str length
· ──┬──
· ╰── `o+e>|` only works on external streams
╰────
```
This can help us to avoid some strange issues like the following:
`$env.FOO=bar (nu --testbin echo_env_stderr FOO) e>| str length`
Which is hard to understand and hard to explain to users.
# User-Facing Changes
None
# Tests + Formatting
To be done
# After Submitting
Maybe update documentation about these syntax.
# Description
Fixes https://github.com/nushell/nushell/issues/11716
The problem is in our [record creation
API](0d518bf813/crates/nu-protocol/src/value/record.rs (L33))
which panics if the numbers of columns and values are different. I added
a safe variant that returns a `Result` and used it in the `rotate`
command.
## TODO in another PR:
Go through all `from_raw_cols_vals_unchecked()` (this includes the
`record!` macro which uses the unchecked version) and make sure that
either
a) it is guaranteed the number of cols and vals is the same, or
b) convert the call to `from_raw_cols_vals()`
Reason: Nushell should never panic.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Adds the `--strict` flag for `from json` which will try to parse text
while following the exact JSON specification (e.g., no comments or
trailing commas allowed). Fixes issue #11548.
# Description
This pr is a follow up to
[#11569](https://github.com/nushell/nushell/pull/11569#issuecomment-1902279587)
> Revert the logic in https://github.com/nushell/nushell/pull/10694 and
apply the logic in this pr to mv, cp, rv will require a larger change, I
need to think how to achieve the bahavior
And sorry @bobhy for reverting some of your changes.
This pr is going to unify glob behavior on the given commands:
* open
* rm
* cp-old
* mv
* umv
* cp
* du
So they have the same behavior to `ls`, which is:
If given parameter is quoted by single quote(`'`) or double quote(`"`),
don't auto-expand the glob pattern. If not quoted, auto-expand the glob
pattern.
Fixes: #9558 Fixes: #10211 Fixes: #9310 Fixes: #10364
# TODO
But one thing remains: if we give a variable to the command, it
will always auto-expand the glob pattern, e.g:
```nushell
let path = "a[123]b"
rm $path
```
I don't think it's expected. But I also think user might want to
auto-expand the glob pattern in variables.
So I'll introduce a new command called `glob escape`, then if user
doesn't want to auto-expand the glob pattern, he can just do this: `rm
($path | glob escape)`
# User-Facing Changes
# Tests + Formatting
Done
# After Submitting
## NOTE
This PR changes the semantics of `GlobPattern`. Before this PR, it would
`expand path` after being evaluated, which left `nu_engine::glob_from`
no chance to glob things correctly if a path contained a glob pattern.
e.g.: [#9310](https://github.com/nushell/nushell/issues/9310#issuecomment-1886824030),
#10211
I think changing the semantics is fine, because it makes globbing work
if a path contains something like '*'.
It may be a breaking change if a custom command's argument is annotated
with `: glob`.
# Description
Fixes: #11455
### For arguments annotated with `:path/:directory/:glob`
To fix the issue, we need a way to know at runtime whether a path was
originally quoted. So the information needs to be added at several levels:
* parse time (from user input to expression)
We need to add quoted information into `Expr::Filepath`,
`Expr::Directory`, `Expr::GlobPattern`
* eval time
When converting from `Expr::Filepath`, `Expr::Directory`,
`Expr::GlobPattern` to `Value::String` during runtime, we won't
auto-expand the path if it's quoted
### For `ls`
It's really special, because it accepts a `String` as a pattern, and it
generates the `glob` expression inside the command itself.
So the idea behind the change is to introduce a special SyntaxShape for
ls: `SyntaxShape::LsGlobPattern`. That way we can more easily track
whether the pattern was originally quoted, and we don't auto-expand the
path either.
Then, when constructing a glob pattern inside ls, we check if the input
pattern is quoted; if so, we escape the input pattern, so we can run `ls
a[123]b` because it's already escaped.
Finally, to accomplish the checking process, we also need to introduce a
new value type called `Value::QuotedString` to distinguish it from
`Value::String`. It's used to generate an enum called `NuPath`, which is
finally used in the `ls` function. `ls` learns from `NuPath` whether the
user input was quoted.
# User-Facing Changes
Actually, it contains several changes:
### For arguments annotated with `:path/:directory/:glob`
#### Before
```nushell
> def foo [p: path] { echo $p }; print (foo "~/a"); print (foo '~/a')
/home/windsoilder/a
/home/windsoilder/a
> def foo [p: directory] { echo $p }; print (foo "~/a"); print (foo '~/a')
/home/windsoilder/a
/home/windsoilder/a
> def foo [p: glob] { echo $p }; print (foo "~/a"); print (foo '~/a')
/home/windsoilder/a
/home/windsoilder/a
```
#### After
```nushell
> def foo [p: path] { echo $p }; print (foo "~/a"); print (foo '~/a')
~/a
~/a
> def foo [p: directory] { echo $p }; print (foo "~/a"); print (foo '~/a')
~/a
~/a
> def foo [p: glob] { echo $p }; print (foo "~/a"); print (foo '~/a')
~/a
~/a
```
### For ls command
`touch '[uwu]'`
#### Before
```
❯ ls -D "[uwu]"
Error: × No matches found for [uwu]
╭─[entry #6:1:1]
1 │ ls -D "[uwu]"
· ───┬───
· ╰── Pattern, file or folder not found
╰────
help: no matches found
```
#### After
```
❯ ls -D "[uwu]"
╭───┬───────┬──────┬──────┬──────────╮
│ # │ name │ type │ size │ modified │
├───┼───────┼──────┼──────┼──────────┤
│ 0 │ [uwu] │ file │ 0 B │ now │
╰───┴───────┴──────┴──────┴──────────╯
```
# Tests + Formatting
Done
# After Submitting
None