# Description
`from ...` conversions pass along all metadata except `content_type`,
which they set to `None`.
## Rationale
`open`ing a file produces no `content_type` metadata if the file can be
parsed into a nu data structure, while `open --raw` does produce
`content_type` metadata.
`from ...` commands should preserve metadata ***except*** for
`content_type`: after parsing, the data is no longer of that `content_type`
but simply structured nu data.
These two commands should return identical data *and* identical metadata:
```nushell
open foo.csv
```
```nushell
open foo.csv --raw | from csv
```
# User-Facing Changes
N/A
# Tests + Formatting
- 🟢 toolkit fmt
- 🟢 toolkit clippy
- 🟢 toolkit test
- 🟢 toolkit test stdlib
# After Submitting
N/A
- fixes #14398

I will properly fill out this PR and fix any tests that might break when
I have the time; this was a quick fix.
# Description
This PR makes `from csv` and `from tsv`, with the `--flexible` flag,
stop dropping extra/unexpected columns.
# User-Facing Changes
`$text`'s contents
```csv
value
1,aaa
2,bbb
3
4,ddd
5,eee,extra
```
Old behavior
```nushell
> $text | from csv --flexible --noheaders
╭─#─┬─column0─╮
│ 0 │ value   │
│ 1 │ 1       │
│ 2 │ 2       │
│ 3 │ 3       │
│ 4 │ 4       │
│ 5 │ 5       │
╰─#─┴─column0─╯
```
New behavior
```nushell
> $text | from csv --flexible --noheaders
╭─#─┬─column0─┬─column1─┬─column2─╮
│ 0 │ value   │ ❎      │ ❎      │
│ 1 │ 1       │ aaa     │ ❎      │
│ 2 │ 2       │ bbb     │ ❎      │
│ 3 │ 3       │ ❎      │ ❎      │
│ 4 │ 4       │ ddd     │ ❎      │
│ 5 │ 5       │ eee     │ extra   │
╰─#─┴─column0─┴─column1─┴─column2─╯
```
- The first line in a csv (or tsv) document no longer limits the number
of columns
- Missing values in columns are no longer automatically filled with `null`
with this change, as a later row can introduce new columns. **BREAKING
CHANGE**
Because missing columns are different from empty columns, operations on
possibly missing columns will have to use optional access syntax, e.g.
`get foo` => `get foo?`
# Tests + Formatting
Added examples that run as tests and adjusted existing tests to confirm
the new behavior.
# After Submitting
Update the workaround with fish completer mentioned
[here](https://www.nushell.sh/cookbook/external_completers.html#fish-completer)
# Description
The meaning of the word usage is specific to describing how a command
function is *used* and not a synonym for general description. Usage can
be used to describe the SYNOPSIS or EXAMPLES sections of a man page
where the permitted argument combinations are shown or example *uses*
are given.
Let's not confuse people and call it what it is: a description.
Our `help` command already creates its own *Usage* section based on the
available arguments and doesn't refer to the description as *usage*.
# User-Facing Changes
`help commands` and `scope commands` will now use `description` or
`extra_description`:
`usage` -> `description`
`extra_usage` -> `extra_description`
Breaking change in the plugin protocol, in the signature record
communicated with the engine:
`usage` -> `description`
`extra_usage` -> `extra_description`
The same rename also takes place for the methods on
`SimplePluginCommand` and `PluginCommand`.
# Tests + Formatting
- Updated plugin protocol specific changes
# After Submitting
- [ ] update plugin protocol doc
# Description
This PR introduces a new `Signals` struct to replace our adhoc passing
around of `ctrlc: Option<Arc<AtomicBool>>`. Doing so has a few benefits:
- We can better enforce when/where resetting or triggering an interrupt
is allowed.
- Consolidates `nu_utils::ctrl_c::was_pressed` and other ad-hoc
re-implementations into a single place: `Signals::check`.
- This allows us to add other types of signals later if we want. E.g.,
exiting or suspension.
- Similarly, we can more easily change the underlying implementation if
we need to in the future.
- Places that used to have a `ctrlc` of `None` now use
`Signals::empty()`, so we can double check these usages for correctness
in the future.
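For illustration, a minimal sketch of what such a wrapper over the old `Arc<AtomicBool>` could look like (names follow the bullet points above; the real `Signals` API may differ):
```rust
use std::sync::{
    atomic::{AtomicBool, Ordering},
    Arc,
};

/// Hypothetical sketch of a `Signals`-style wrapper around the old `ctrlc` flag.
#[derive(Clone, Debug)]
pub struct Signals {
    signals: Option<Arc<AtomicBool>>,
}

impl Signals {
    /// Replaces the old `ctrlc: None` case.
    pub fn empty() -> Self {
        Signals { signals: None }
    }

    pub fn new(flag: Arc<AtomicBool>) -> Self {
        Signals { signals: Some(flag) }
    }

    /// Consolidates the scattered `was_pressed`-style checks into one place.
    pub fn interrupted(&self) -> bool {
        self.signals
            .as_ref()
            .map(|flag| flag.load(Ordering::Relaxed))
            .unwrap_or(false)
    }
}
```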
# Description
fixed #12699
When bare dates or naive times are specified in toml files, `from toml`
returns invalid dates or times.
This PR fixes the problem by correctly handling toml datetimes.
The current version of the command returns the default datetime
(`chrono::DateTime::default()`) if the datetime parse fails. However, I
felt that this behavior was a bit unfriendly, so I changed it to return a
`Value::string` instead.
# User-Facing Changes
The command returns a date with default time and timezone if a bare date
is specified.
```
~/Development/nushell> "dob = 2023-05-27" | from toml
╭─────┬────────────╮
│ dob │ a year ago │
╰─────┴────────────╯
~/Development/nushell> "dob = 2023-05-27" | from toml |
Sat, 27 May 2023 00:00:00 +0000 (a year ago)
~/Development/nushell>
```
If a bare time is given, a time string is returned.
```
~/Development/nushell> "tm = 11:00:00" | from toml
╭────┬──────────╮
│ tm │ 11:00:00 │
╰────┴──────────╯
~/Development/nushell> "tm = 11:00:00" | from toml | get tm
11:00:00
~/Development/nushell>
```
# Tests + Formatting
When I ran tests, `commands::touch::change_file_mtime_to_reference`
failed with the following error.
The error also occurs in the master branch, so it's probably unrelated
to these changes.
(maybe a problem with my dev environment)
```
$ ~/Development/nushell> toolkit check pr
~~~~~~~~
test usage_start_uppercase ... ok
test format_conversions::yaml::convert_dict_to_yaml_with_integer_floats_key ... ok
test format_conversions::yaml::convert_dict_to_yaml_with_boolean_key ... ok
test format_conversions::yaml::table_to_yaml_text_and_from_yaml_text_back_into_table ... ok
test quickcheck_parse ... ok
test format_conversions::yaml::convert_dict_to_yaml_with_integer_key ... ok
failures:
---- commands::touch::change_file_mtime_to_reference stdout ----
=== stderr
thread 'commands::touch::change_file_mtime_to_reference' panicked at crates/nu-command/tests/commands/touch.rs:298:9:
assertion `left == right` failed
left: SystemTime { tv_sec: 1720344745, tv_nsec: 862392750 }
right: SystemTime { tv_sec: 1720344745, tv_nsec: 887670417 }
failures:
commands::touch::change_file_mtime_to_reference
test result: FAILED. 1542 passed; 1 failed; 32 ignored; 0 measured; 0 filtered out; finished in 12.04s
error: test failed, to rerun pass `-p nu-command --test main`
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🔴 `toolkit test`
- ⚫ `toolkit test stdlib`
~/Development/nushell> toolkit test stdlib
Compiling nu v0.95.1 (/Users/hiroki/Development/nushell)
Compiling nu-cmd-lang v0.95.1 (/Users/hiroki/Development/nushell/crates/nu-cmd-lang)
Finished dev [unoptimized + debuginfo] target(s) in 6.64s
Running `target/debug/nu --no-config-file -c '
use crates/nu-std/testing.nu
testing run-tests --path crates/nu-std
'`
2024-07-07T19:00:20.423|INF|Running from_jsonl_invalid_object in module test_formats
2024-07-07T19:00:20.436|INF|Running env_log-prefix in module test_logger_env
~~~~~~~~~~~
2024-07-07T19:00:22.196|INF|Running debug_short in module test_basic_commands
~/Development/nushell>
```
# After Submitting
nothing
This reverts commit 0cfd5fbece.
The original PR messed up syntax highlighting of aliases and caused
panics in completion in the presence of aliases.
# Description
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Part of https://github.com/nushell/nushell/issues/12963, step 2.
This PR refactors Call and related argument structures to remove their
dependency on `Expression::span` which will be removed in the future.
# User-Facing Changes
Should be none. If you see any error messages that look broken, please
report them.
# Tests + Formatting
# After Submitting
# Description
Provides the ability to use http commands as part of a pipeline.
Additionally, this pull request extends the pipeline metadata to add a
`content_type` field. The `content_type` metadata field allows commands
such as `to json` to set the metadata in the pipeline so that the http
commands can use it when making requests.
This pull request also introduces the ability to directly stream http
requests from streaming pipelines.
One other small change is that Content-Type will always be set if it is
passed in to the http commands, either indirectly or through the content
type flag. Previously it was not preserved for requests that were not
of type json or form data.
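As a rough sketch, the metadata extension amounts to something like the following (only the `content_type` field is taken from this description; the surrounding struct layout is an assumption):
```rust
/// Hedged sketch: pipeline metadata carrying an optional MIME type.
#[derive(Clone, Debug, Default)]
pub struct PipelineMetadata {
    // ...existing metadata fields are elided here...
    /// Set by producers such as `to json` and read by the `http` commands
    /// when choosing the request's Content-Type header.
    pub content_type: Option<String>,
}
```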
# User-Facing Changes
* `http post`, `http put`, `http patch`, `http delete` can be used as
part of a pipeline
* `to text`, `to json`, `from json` all set the `content_type` metadata
field, and the http commands will utilize it when making requests.
# Description
Fixes #11678
The subcommands of the `from` command (`from {csv, tsv, ssv}`) name
columns starting from index 1.
This behaviour is inconsistent with other commands such as `detect
columns`.
This PR makes the subcommands' column indices 0-based.
# User-Facing Changes
The subcommands (`from {csv, tsv, ssv}`) return a table with the columns
starting at index 0 if no header data is passed.
```
~/Development/nushell> "foo bar baz" | from ssv -n -m 1
╭───┬─────────┬─────────┬─────────╮
│ # │ column0 │ column1 │ column2 │
├───┼─────────┼─────────┼─────────┤
│ 0 │ foo     │ bar     │ baz     │
╰───┴─────────┴─────────┴─────────╯
~/Development/nushell> "foo,bar,baz" | from csv -n
╭───┬─────────┬─────────┬─────────╮
│ # │ column0 │ column1 │ column2 │
├───┼─────────┼─────────┼─────────┤
│ 0 │ foo     │ bar     │ baz     │
╰───┴─────────┴─────────┴─────────╯
~/Development/nushell> "foo\tbar\tbaz" | from tsv -n
╭───┬─────────┬─────────┬─────────╮
│ # │ column0 │ column1 │ column2 │
├───┼─────────┼─────────┼─────────┤
│ 0 │ foo     │ bar     │ baz     │
╰───┴─────────┴─────────┴─────────╯
```
# Tests + Formatting
When I ran tests, `commands::touch::change_file_mtime_to_reference`
failed with the following error.
The error also occurs in the master branch, so it's probably unrelated
to these changes.
(maybe a problem with my dev environment)
```
$ toolkit check pr
~~~~~~~~
failures:
---- commands::touch::change_file_mtime_to_reference stdout ----
=== stderr
thread 'commands::touch::change_file_mtime_to_reference' panicked at crates/nu-command/tests/commands/touch.rs:298:9:
assertion `left == right` failed
left: SystemTime { tv_sec: 1719149697, tv_nsec: 57576929 }
right: SystemTime { tv_sec: 1719149697, tv_nsec: 78219489 }
failures:
commands::touch::change_file_mtime_to_reference
test result: FAILED. 1533 passed; 1 failed; 32 ignored; 0 measured; 0 filtered out; finished in 10.87s
error: test failed, to rerun pass `-p nu-command --test main`
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🔴 `toolkit test`
- ⚫ `toolkit test stdlib`
```
# After Submitting
nothing
# Description
Makes the `from json --objects` command produce a stream, and read
lazily from an input stream to produce its output.
Also added a helper, `PipelineData::get_type()`, to make it easier to
construct a wrong type error message when matching on `PipelineData`. I
expect checking `PipelineData` for either a string value or an `Unknown`
or `String` typed `ByteStream` will be very, very common. I would have
liked to have a helper that just returns a readable stream from either,
but that would either be a bespoke enum or a `Box<dyn BufRead>`, which
feels like it wouldn't be so great for performance. So instead, taking
the approach I did here is probably better - having a function that
accepts the `impl BufRead` and matching to use it.
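A rough sketch of that approach, with the real work written once against `impl BufRead` so callers only need to produce a reader from whichever `PipelineData` case they matched (names here are illustrative, not the actual nushell internals):
```rust
use std::io::{BufRead, BufReader, Cursor};

/// Illustrative only: lazily yield one (unparsed) JSON object per line.
fn objects_from_reader(reader: impl BufRead) -> impl Iterator<Item = String> {
    reader
        .lines()
        .filter_map(Result::ok)
        .filter(|line| !line.trim().is_empty())
}

fn main() {
    // A string value and a byte stream can both be turned into `impl BufRead`.
    let from_string = objects_from_reader(Cursor::new("{\"a\":1}\n{\"a\":2}\n"));
    let from_stream = objects_from_reader(BufReader::new(std::io::empty()));
    assert_eq!(from_string.count(), 2);
    assert_eq!(from_stream.count(), 0);
}
```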
# User-Facing Changes
- `from json --objects` no longer collects its input, and can be used
for large datasets or streams that produce values over time.
# Tests + Formatting
All passing.
# After Submitting
- [ ] release notes
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
Implements streaming for:
- `from csv`
- `from tsv`
- `to csv`
- `to tsv`
via the new string-typed ByteStream support.
# User-Facing Changes
Commands above. Also:
- `to csv` and `to tsv` now have `--columns <List(String)>`, to provide
the exact columns desired in the output. This is required for them to
have streaming output, because otherwise collecting the entire list is
necessary to determine the output columns. If we introduce
`TableStream`, this may become less necessary.
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] release notes
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
Changes `get_full_help` to take a `&dyn Command` instead of multiple
arguments (`&Signature`, `&Examples`, `is_parser_keyword`). All of these
arguments can be gathered from a `Command`, so there is no need to pass
the pieces to `get_full_help`.
This PR also fixes an issue where the search terms are not shown if
`--help` is used on a command.
# Description
Kind of a vague title, but this PR does two main things:
1. Rather than overriding functions like `Command::is_parser_keyword`,
this PR instead changes commands to override `Command::command_type`.
The `CommandType` returned by `Command::command_type` is then used to
automatically determine whether `Command::is_parser_keyword` and the
other `is_{type}` functions should return true. These changes allow us
to remove the `CommandType::Other` case and should also guarantee that
only one of the `is_{type}` functions on `Command` will return true
(see the sketch after this list).
2. Uses the new, reworked `Command::command_type` function in the `scope
commands` and `which` commands.
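For illustration, a minimal sketch of the idea; the variant set and defaults below are assumptions, not the exact nushell definitions:
```rust
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum CommandType {
    Builtin,
    Custom,
    Keyword,
    External,
    Alias,
    Plugin,
}

pub trait Command {
    /// Commands override only this...
    fn command_type(&self) -> CommandType {
        CommandType::Builtin
    }

    /// ...and the `is_{type}` helpers are derived from it, so at most one of
    /// them can return true for any given command.
    fn is_parser_keyword(&self) -> bool {
        self.command_type() == CommandType::Keyword
    }

    fn is_plugin(&self) -> bool {
        self.command_type() == CommandType::Plugin
    }
}
```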
# User-Facing Changes
- Breaking change for `scope commands`: multiple columns (`is_builtin`,
`is_keyword`, `is_plugin`, etc.) have been merged into the `type`
column.
- Breaking change: the `which` command can now report `plugin` or
`keyword` instead of `built-in` in the `type` column. It may also now
report `external` instead of `custom` in the `type` column for known
`extern`s.
# Description
This PR introduces a `ByteStream` type which is a `Read`-able stream of
bytes. Internally, it has an enum over three different byte stream
sources:
```rust
pub enum ByteStreamSource {
Read(Box<dyn Read + Send + 'static>),
File(File),
Child(ChildProcess),
}
```
This is in comparison to the current `RawStream` type, which is an
`Iterator<Item = Vec<u8>>` and has to allocate for each read chunk.
Currently, `PipelineData::ExternalStream` serves a weird dual role where
it is either external command output or a wrapper around `RawStream`.
`ByteStream` makes this distinction more clear (via `ByteStreamSource`)
and replaces `PipelineData::ExternalStream` in this PR:
```rust
pub enum PipelineData {
Empty,
Value(Value, Option<PipelineMetadata>),
ListStream(ListStream, Option<PipelineMetadata>),
ByteStream(ByteStream, Option<PipelineMetadata>),
}
```
The PR is relatively large, but a decent amount of it is just repetitive
changes.
This PR fixes #7017, fixes #10763, and fixes #12369.
This PR also improves performance when piping external commands. Nushell
should, in most cases, have competitive pipeline throughput compared to,
e.g., bash.
| Command | Before (MB/s) | After (MB/s) | Bash (MB/s) |
| -------------------------------------------------- | -------------:| ------------:| -----------:|
| `throughput \| rg 'x'` | 3059 | 3744 | 3739 |
| `throughput \| nu --testbin relay o> /dev/null` | 3508 | 8087 | 8136 |
# User-Facing Changes
- This is a breaking change for the plugin communication protocol,
because the `ExternalStreamInfo` was replaced with `ByteStreamInfo`.
Plugins now only have to deal with a single input stream, as opposed to
the previous three streams: stdout, stderr, and exit code.
- The output of `describe` has been changed for external/byte streams.
- Temporary breaking change: `bytes starts-with` no longer works with
byte streams. This is to keep the PR smaller, and `bytes ends-with`
already does not work on byte streams.
- If a process core dumped, then instead of having a `Value::Error` in
the `exit_code` column of the output returned from `complete`, it now is
a `Value::Int` with the negation of the signal number.
# After Submitting
- Update docs and book as necessary
- Release notes (e.g., plugin protocol changes)
- Adapt/convert commands to work with byte streams (high priority is
`str length`, `bytes starts-with`, and maybe `bytes ends-with`).
- Refactor the `tee` code, Devyn has already done some work on this.
---------
Co-authored-by: Devyn Cairns <devyn.cairns@gmail.com>
# Description
Makes some miscellaneous changes to `ListStream`:
- Moves it into its own module/file separate from `RawStream`.
- `ListStream`s now have an associated `Span`.
- This required changes to `ListStreamInfo` in `nu-plugin`. Not sure if
this is a breaking change for the plugin protocol.
- Hides the internals of `ListStream` but also adds a few more methods.
- This includes two functions to more easily alter a stream (these take
a `ListStream` and return a `ListStream` instead of having to go through
the whole `into_pipeline_data(..)` route).
- `map`: takes a `FnMut(Value) -> Value`
- `modify`: takes a function to modify the inner stream.
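A hedged sketch of how the two helpers might be used, assuming `map` takes an `FnMut(Value) -> Value` and `modify` takes a closure over the inner iterator as described above (exact signatures may differ):
```rust
use nu_protocol::{ListStream, Span, Value};

/// Illustrative: double every integer in the stream, then drop non-positive ones.
fn double_positives(stream: ListStream, span: Span) -> ListStream {
    stream
        .map(move |value| Value::int(value.as_int().unwrap_or(0) * 2, span))
        .modify(|iter| iter.filter(|value| value.as_int().map_or(false, |i| i > 0)))
}
```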
# Description
Judiciously try to avoid allocations/clone by changing the signature of
functions
- **Don't pass str by value unnecessarily if only read**
- **Don't require a vec in `Sandbox::with_files`**
- **Remove unnecessary string clone**
- **Fixup unnecessary borrow**
- **Use `&str` in shape color instead**
- **Vec -> Slice**
- **Elide string clone**
- **Elide `Path` clone**
- **Take &str to elide clone in tests**
# User-Facing Changes
None
# Tests + Formatting
This touches many tests, purely by changing from owned to borrowed/static
data.
# Description
I thought about bringing `nu_plugin_msgpack` in, but that is MPL with a
clause that prevents other licenses, so rather than adapt that code I
decided to take a crack at just doing it straight from `rmp` to `Value`
without any `rmpv` in the middle. It seems like it's probably faster,
though I can't say for sure how much with the plugin overhead.
@IanManske I started on a `Read` implementation for `RawStream` but just
specialized to `from msgpack` here, but I'm thinking after release maybe
we can polish it up and make it a real one. It works!
# User-Facing Changes
New commands:
- `from msgpack`
- `from msgpackz`
- `to msgpack`
- `to msgpackz`
# Tests + Formatting
Pretty thorough tests added for the format deserialization, with a
roundtrip for serialization. Some example tests too for both `from
msgpack` and `to msgpack`.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] update release notes
# Description
Continuing from #12568, this PR further reduces the size of `Expr` from
64 to 40 bytes. It also reduces `Expression` from 128 to 96 bytes and
`Type` from 32 to 24 bytes.
This was accomplished by:
- for `Expr` with multiple fields (e.g., `Expr::Thing(A, B, C)`),
merging the fields into new AST struct types and then boxing this struct
(e.g. `Expr::Thing(Box<ABC>)`).
- replacing `Vec<T>` with `Box<[T]>` in multiple places. `Expr`s and
`Expression`s should rarely be mutated, if at all, so this optimization
makes sense.
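The boxing pattern in isolation (this is a generic illustration, not the actual nushell AST types):
```rust
use std::mem::size_of;

// A variant with several inline fields makes the whole enum at least as large
// as the sum of those fields...
#[allow(dead_code)]
enum Before {
    Thing(String, Vec<u8>, i64),
    Nothing,
}

// ...while merging them into one struct and boxing it shrinks the variant to a
// single pointer.
#[allow(dead_code)]
struct Abc {
    a: String,
    b: Vec<u8>,
    c: i64,
}

#[allow(dead_code)]
enum After {
    Thing(Box<Abc>),
    Nothing,
}

fn main() {
    println!("Before:    {} bytes", size_of::<Before>());
    println!("After:     {} bytes", size_of::<After>());
    // Box<[T]> also drops Vec's capacity word: 16 vs 24 bytes on 64-bit targets.
    println!("Vec<u8>:   {} bytes", size_of::<Vec<u8>>());
    println!("Box<[u8]>: {} bytes", size_of::<Box<[u8]>>());
}
```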
By reducing the size of these types, I didn't notice a large performance
improvement (at least compared to #12568). But this PR does reduce the
memory usage of nushell. My config is somewhat light so I only noticed a
difference of 1.4MiB (38.9MiB vs 37.5MiB).
---------
Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
# Description
Playing with the NUON format in Rust code in some plugins, we agreed
with the team it was a great time to create a standalone NUON crate to
allow Rust devs to use this Nushell file format.
> **Note**
> this PR almost copy-pastes the code from
> `nu_commands/src/formats/from/nuon.rs` and
> `nu_commands/src/formats/to/nuon.rs` to `nuon/src/from.rs` and
> `nuon/src/to.rs`, with minor tweaks to make them standalone functions,
> e.g. removing the rest of the command implementations
### TODO
- [x] add tests
- [x] add documentation
# User-Facing Changes
devs will have access to a new crate, `nuon`, and two functions,
`from_nuon` and `to_nuon`
```rust
from_nuon(
input: &str,
span: Option<Span>,
) -> Result<Value, ShellError>
```
```rust
to_nuon(
input: &Value,
raw: bool,
tabs: Option<usize>,
indent: Option<usize>,
span: Option<Span>,
) -> Result<String, ShellError>
```
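A round trip through the new crate might then look roughly like this (assuming `nuon` and `nu-protocol` as dependencies):
```rust
use nu_protocol::Value;
use nuon::{from_nuon, to_nuon};

fn main() -> Result<(), nu_protocol::ShellError> {
    let value: Value = from_nuon("{ a: 1, b: [2, 3] }", None)?;
    // raw: false, no tabs/indentation settings, no span
    let text: String = to_nuon(&value, false, None, None, None)?;
    println!("{text}");
    Ok(())
}
```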
# Tests + Formatting
I've basically taken all the tests from
`crates/nu-command/tests/format_conversions/nuon.rs` and converted them
to use `from_nuon` and `to_nuon` instead of Nushell commands.
- I've created a `nuon_end_to_end` test to run both conversions with an
optional middle value to check that all is fine
> **Note**
> the `nuon::tests::read_code_should_fail_rather_than_panic` test does
give different results locally and in the CI...
> I've left it ignored with comments to help future us :)
# After Submitting
mention that in the release notes for sure!!
# Description
This PR adds a `ListItem` enum to our set of AST types. It encodes the
two possible expressions inside of a list expression: a singular item or a
spread. This is similar to the existing `RecordItem` enum. Adding
`ListItem` allows us to remove the existing `Expr::Spread` case which
was previously used for list spreads. As a consequence, this guarantees
(via the type system) that spreads can only ever occur inside lists,
records, or as command args.
This PR also does a little bit of cleanup in relevant parser code.
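A hedged sketch of what the new list AST item could look like, mirroring the existing `RecordItem` (the exact variant shapes are assumptions):
```rust
use nu_protocol::{ast::Expression, Span};

pub enum ListItem {
    /// A single element, e.g. `1` in `[1 2 3]`
    Item(Expression),
    /// A spread, e.g. `...$xs` in `[1 ...$xs]`, with the span of the `...`
    Spread(Span, Expression),
}
```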
# Description
Currently, `Range` is a struct with a `from`, `to`, and `incr` field,
which are all type `Value`. This PR changes `Range` to be an enum over
`IntRange` and `FloatRange` for better type safety / stronger compile
time guarantees.
Fixes: #11778
Fixes: #11777
Fixes: #11776
Fixes: #11775
Fixes: #11774
Fixes: #11773
Fixes: #11769
# User-Facing Changes
Hopefully none, besides bug fixes.
Although, the `serde` representation might have changed.
# Description
When implementing a `Command`, one must also import all the types
present in the function signatures for `Command`. This makes it so that
we often import the same set of types in each command implementation
file. E.g., something like this:
```rust
use nu_protocol::ast::Call;
use nu_protocol::engine::{Command, EngineState, Stack};
use nu_protocol::{
record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, PipelineData,
ShellError, Signature, Span, Type, Value,
};
```
This PR adds the `nu_engine::command_prelude` module which contains the
necessary and commonly used types to implement a `Command`:
```rust
// command_prelude.rs
pub use crate::CallExt;
pub use nu_protocol::{
ast::{Call, CellPath},
engine::{Command, EngineState, Stack},
record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, IntoSpanned,
PipelineData, Record, ShellError, Signature, Span, Spanned, SyntaxShape, Type, Value,
};
```
This should reduce the boilerplate needed to implement a command and
also gives us a place to track the breadth of the `Command` API. I tried
to be conservative with what went into the prelude modules, since it
might be hard/annoying to remove items from the prelude in the future.
Let me know if something should be included or excluded.
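In practice, a command implementation file would then replace the long import list with a single glob import, roughly:
```rust
use nu_engine::command_prelude::*;
```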
# Description
Boxes `Record` inside `Value` to reduce memory usage; `Value` goes from
`72` -> `56` bytes after this change.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
This makes many of the larger objects in `EngineState` into `Arc`, and
uses `Arc::make_mut` to do clone-on-write if the reference is not
unique. This is generally very cheap, giving us the best of both worlds
- allowing us to mutate without cloning if we have an exclusive
reference, and cloning if we don't.
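The clone-on-write behavior of `Arc::make_mut` in a nutshell (a plain standard-library example, not the nushell types themselves):
```rust
use std::sync::Arc;

fn main() {
    let mut state = Arc::new(vec![1, 2, 3]);
    let snapshot = Arc::clone(&state); // e.g. an engine state cloned for the REPL loop

    // The Arc is shared, so make_mut clones the Vec before mutating it...
    Arc::make_mut(&mut state).push(4);
    assert_eq!(*snapshot, vec![1, 2, 3]);

    // ...but once we hold the only reference, it mutates in place with no copy.
    drop(snapshot);
    Arc::make_mut(&mut state).push(5);
    assert_eq!(*state, vec![1, 2, 3, 4, 5]);
}
```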
This started as more of a curiosity for me after remembering that
`Arc::make_mut` exists and can make using `Arc` for mostly immutable
data that sometimes needs to be changed very convenient, and also after
hearing someone complain about memory usage on Discord - this is a
somewhat significant win for that.
The exact objects that were wrapped in `Arc`:
- `files`, `file_contents` - the strings and byte buffers
- `decls` - the whole `Vec`, but mostly to avoid lots of individual
`malloc()` calls on Clone rather than for memory usage
- `blocks` - the blocks themselves, rather than the outer Vec
- `modules` - the modules themselves, rather than the outer Vec
- `env_vars`, `previous_env_vars` - the entire maps
- `config`
The changes required were relatively minimal, but this is a breaking API
change. In particular, blocks are added as Arcs, to allow the parser
cache functionality to work.
With my normal nu config, running on Linux, this saves me about 15 MiB
of process memory usage when running interactively (65 MiB → 50 MiB).
This also makes quick command executions cheaper, particularly since
every REPL loop now involves a clone of the engine state so that we can
recover from a panic. It also reduces memory usage where engine state
needs to be cloned and sent to another thread or kept within an
iterator.
# User-Facing Changes
Shouldn't be any, since it's all internal stuff, but it does change some
public interfaces so it's a breaking change
# Description
The PR overhauls how IO redirection is handled, allowing more explicit
and fine-grained control over `stdout` and `stderr` output as well as more
efficient IO and piping.
To summarize the changes in this PR:
- Added a new `IoStream` type to indicate the intended destination for a
pipeline element's `stdout` and `stderr`.
- The `stdout` and `stderr` `IoStream`s are stored in the `Stack` to
avoid adding 6 additional arguments to every eval function and
`Command::run`. The `stdout` and `stderr` streams can be temporarily
overwritten through functions on `Stack`, and these functions return
a guard that restores the original `stdout` and `stderr` when dropped.
- In the AST, redirections are now directly part of a `PipelineElement`
as a `Option<Redirection>` field instead of having multiple different
`PipelineElement` enum variants for each kind of redirection. This
required changes to the parser, mainly in `lite_parser.rs`.
- `Command`s can also set an `IoStream` override/redirection which will
apply to the previous command in the pipeline. This is used, for
example, in `ignore` to allow the previous external command to have its
stdout redirected to `Stdio::null()` at spawn time. In contrast, the
current implementation has to create an os pipe and manually consume the
output on nushell's side. File and pipe redirections (`o>`, `e>`, `e>|`,
etc.) have precedence over overrides from commands.
This PR improves piping and IO speed, partially addressing #10763. Using
the `throughput` command from that issue, this PR gives the following
speedup on my setup for the commands below:
| Command | Before (MB/s) | After (MB/s) | Bash (MB/s) |
| --------------------------- | -------------:| ------------:| -----------:|
| `throughput o> /dev/null` | 1169 | 52938 | 54305 |
| `throughput \| ignore` | 840 | 55438 | N/A |
| `throughput \| null` | Error | 53617 | N/A |
| `throughput \| rg 'x'` | 1165 | 3049 | 3736 |
| `(throughput) \| rg 'x'` | 810 | 3085 | 3815 |
(Numbers above are the median samples for throughput)
This PR also paves the way to refactor our `ExternalStream` handling in
the various commands. For example, this PR already fixes the following
code:
```nushell
^sh -c 'echo -n "hello "; sleep 0; echo "world"' | find "hello world"
```
This returns an empty list on 0.90.1 and returns a highlighted "hello
world" on this PR.
Since the `stdout` and `stderr` `IoStream`s are available to commands
when they are run, then this unlocks the potential for more convenient
behavior. E.g., the `find` command can disable its ansi highlighting if
it detects that the output `IoStream` is not the terminal. Knowing the
output streams will also allow background job output to be redirected
more easily and efficiently.
# User-Facing Changes
- External commands returned from closures will be collected (in most
cases):
```nushell
1..2 | each {|_| nu -c "print a" }
```
This gives `["a", "a"]` on this PR, whereas this used to print "a\na\n"
and then return an empty list.
```nushell
1..2 | each {|_| nu -c "print -e a" }
```
This gives `["", ""]` and prints "a\na\n" to stderr, whereas this used
to return an empty list and print "a\na\n" to stderr.
- Trailing new lines are always trimmed for external commands when
piping into internal commands or collecting it as a value. (Failure to
decode the output as utf-8 will keep the trailing newline for the last
binary value.) In the current nushell version, the following three code
snippets differ only in parenthesis placement, but they all also have
different outputs:
1. `1..2 | each { ^echo a }`
```
a
a
╭────────────╮
│ empty list │
╰────────────╯
```
2. `1..2 | each { (^echo a) }`
```
╭───┬───╮
│ 0 │ a │
│ 1 │ a │
╰───┴───╯
```
3. `1..2 | (each { ^echo a })`
```
╭───┬───╮
│ 0 │ a │
│ │ │
│ 1 │ a │
│ │ │
╰───┴───╯
```
But in this PR, the above snippets will all have the same output:
```
╭───┬───╮
│ 0 │ a │
│ 1 │ a │
╰───┴───╯
```
- All existing flags on `run-external` are now deprecated.
- File redirections now apply to all commands inside a code block:
```nushell
(nu -c "print -e a"; nu -c "print -e b") e> test.out
```
This gives "a\nb\n" in `test.out` and prints nothing. The same result
would happen when printing to stdout and using a `o>` file redirection.
- External command output will (almost) never be ignored, and ignoring
output must be explicit now:
```nushell
(^echo a; ^echo b)
```
This prints "a\nb\n", whereas this used to print only "b\n". This only
applies to external commands; values and internal commands not in return
position will not print anything (e.g., `(echo a; echo b)` still only
prints "b").
- `complete` now always captures stderr (`do` is not necessary).
# After Submitting
The language guide and other documentation will need to be updated.
- Fixes #11997
# Description
Fixes the issue that comments are not ignored in SSV-formatted data.
# User-Facing Changes
If SSV-formatted data begins with a comment, the comment is no longer
included in the resulting table.
# Tests + Formatting
The PR adds one test in the ssv.rs file. All previous test cases are
still passing. Clippy and fmt have been run.
# Description
This PR adds date support when using the `open` command on an xlsx file,
and when using `from xlsx` on an xlsx file.
# User-Facing Changes
Currently, dates in xlsx files are read as nulls; after this PR they
are read as regular dates.
# Tests + Formatting
# After Submitting
# Description
This PR renames the conversion functions on `Value` to be more consistent.
It follows the Rust [API guidelines](https://rust-lang.github.io/api-guidelines/naming.html#ad-hoc-conversions-follow-as_-to_-into_-conventions-c-conv) for ad-hoc conversions.
The conversion functions on `Value` now come in a few forms:
- `coerce_{type}` takes a `&Value` and attempts to convert the value to
`type` (e.g., `i64` are converted to `f64`). This is the old behavior of
some of the `as_{type}` functions -- these functions have simply been
renamed to better reflect what they do.
- The new `as_{type}` functions take a `&Value` and return an `Ok`
result only if the value is of `type` (no conversion is attempted). The
returned value will be borrowed if `type` is non-`Copy`, otherwise an
owned value is returned.
- `into_{type}` exists for non-`Copy` types, but otherwise does not
attempt conversion just like `as_type`. It takes an owned `Value` and
always returns an owned result.
- `coerce_into_{type}` has the same relationship with `coerce_{type}` as
`into_{type}` does with `as_{type}`.
- `to_{kind}_string`: conversion to different string formats (debug,
abbreviated, etc.). Only two of the old string conversion functions were
removed, the rest have been renamed only.
- `to_{type}`: other conversion functions. Currently, only `to_path`
exists. (And `to_string` through `Display`.)
This table summarizes the above:
| Form | Cost | Input Ownership | Output Ownership | Converts `Value` case/`type` |
| ---------------------------- | ----- | --------------- | ---------------- | -------- |
| `as_{type}` | Cheap | Borrowed | Borrowed/Owned | No |
| `into_{type}` | Cheap | Owned | Owned | No |
| `coerce_{type}` | Cheap | Borrowed | Borrowed/Owned | Yes |
| `coerce_into_{type}` | Cheap | Owned | Owned | Yes |
| `to_{kind}_string` | Expensive | Borrowed | Owned | Yes |
| `to_{type}` | Expensive | Borrowed | Owned | Yes |
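A hedged illustration of the distinction, assuming helpers such as `Value::test_int`, `as_int`, `as_float`, and `coerce_float` with these rough shapes:
```rust
use nu_protocol::Value;

fn main() {
    let v = Value::test_int(42);

    // as_{type}: Ok only if the value is already of that type, no conversion.
    assert_eq!(v.as_int().unwrap(), 42);
    assert!(v.as_float().is_err());

    // coerce_{type}: conversion is attempted, so an Int widens to a float.
    assert_eq!(v.coerce_float().unwrap(), 42.0);
}
```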
# User-Facing Changes
Breaking API change for `Value` in `nu-protocol` which is exposed as
part of the plugin API.
With this PR I try to resolve #11751.
# Description
I am rather new to Rust, so if anything is not the way it should be,
please let me know.
As described in the title, I just fixed the date conversion in the
`to toml` and `from toml` commands, as I thought it would be a good first
issue. The example from the original issue will now work as follows:
```
~> {date: 2024-02-02} | to toml
date = "2024-02-02T00:00:00+00:00"
~> "dob = 1979-05-27T07:32:00-08:00" | from toml
╭─────┬───────────────────────────╮
│ dob │ 44 years ago              │
╰─────┴───────────────────────────╯
```
The `from toml` command now returns a nushell date which is displayed as
`44 years ago` in this case.
# User-Facing Changes
none
# Tests + Formatting
all tests pass and formatting has been applied
---------
Co-authored-by: dannou812 <dannou281@gmail.com>
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
# Description
Close: #9673
Close: #8277
Close: #10944
This pr introduces the following syntax:
1. `e>|`, pipe stderr to next command. Example: `$env.FOO=bar nu
--testbin echo_env_stderr FOO e>| str length`
2. `o+e>|` and `e+o>|`, pipe both stdout and stderr to next command,
example: `$env.FOO=bar nu --testbin echo_env_mixed out-err FOO FOO e+o>|
str length`
Note: it only works for external commands. ~There is no difference for
internal commands, that is, the following three commands do the same
things:~ Edit: it raises errors if we try to use these pipes with
internal commands
```
❯ ls e>| str length
Error: × `e>|` only works with external streams
╭─[entry #1:1:1]
1 │ ls e>| str length
· ─┬─
· ╰── `e>|` only works on external streams
╰────
❯ ls e+o>| str length
Error: × `o+e>|` only works with external streams
╭─[entry #2:1:1]
1 │ ls e+o>| str length
· ──┬──
· ╰── `o+e>|` only works on external streams
╰────
```
This can help us to avoid some strange issues like the following:
`$env.FOO=bar (nu --testbin echo_env_stderr FOO) e>| str length`
Which is hard to understand and hard to explain to users.
# User-Facing Changes
Nan
# Tests + Formatting
To be done
# After Submitting
Maybe update documentation about these syntax.
# Description
Fixes https://github.com/nushell/nushell/issues/11716
The problem is in our [record creation
API](0d518bf813/crates/nu-protocol/src/value/record.rs (L33))
which panics if the numbers of columns and values are different. I added
a safe variant that returns a `Result` and used it in the `rotate`
command.
## TODO in another PR:
Go through all `from_raw_cols_vals_unchecked()` calls (this includes the
`record!` macro, which uses the unchecked version) and make sure that
either
a) it is guaranteed that the number of cols and vals is the same, or
b) the call is converted to `from_raw_cols_vals()`
Reason: Nushell should never panic.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Adds the `--strict` flag for `from json` which will try to parse text
while following the exact JSON specification (e.g., no comments or
trailing commas allowed). Fixes issue #11548.
# Description
This pr is a follow up to
[#11569](https://github.com/nushell/nushell/pull/11569#issuecomment-1902279587)
> Revert the logic in https://github.com/nushell/nushell/pull/10694 and
> apply the logic in this pr to mv, cp, rv will require a larger change, I
> need to think how to achieve the behavior
And sorry @bobhy for reverting some of your changes.
This pr is going to unify glob behavior on the given commands:
* open
* rm
* cp-old
* mv
* umv
* cp
* du
So they have the same behavior as `ls`, which is:
If the given parameter is quoted with a single quote (`'`) or double
quote (`"`), don't auto-expand the glob pattern. If not quoted,
auto-expand the glob pattern.
Fixes: #9558
Fixes: #10211
Fixes: #9310
Fixes: #10364
# TODO
But one thing remains: if we give a variable to the command, it
will always auto-expand the glob pattern, e.g.:
```nushell
let path = "a[123]b"
rm $path
```
I don't think that's expected. But I also think users might want to
auto-expand the glob pattern in variables.
So I'll introduce a new command called `glob escape`; then, if a user
doesn't want to auto-expand the glob pattern, they can just do this: `rm
($path | glob escape)`
# User-Facing Changes
# Tests + Formatting
Done
# After Submitting
## NOTE
This PR changes the semantics of `GlobPattern`: before this PR, it would
`expand path` after being evaluated, which left `nu_engine::glob_from` no
chance to glob things right if a path contains a glob pattern.
e.g.:
[#9310](https://github.com/nushell/nushell/issues/9310#issuecomment-1886824030),
#10211
I think changing the semantics is fine, because it makes glob work if a
path contains something like '*'.
It may be a breaking change if a custom command's argument is annotated
with `: glob`.
# Description
Fixes: #11455
### For arguments annotated with `:path`/`:directory`/`:glob`
To fix the issue, we need a way to know at runtime whether a path was
originally quoted. So this information needs to be added at several
levels:
* parse time (from user input to expression)
We need to add quoted information into `Expr::Filepath`,
`Expr::Directory`, `Expr::GlobPattern`
* eval time
When converting from `Expr::Filepath`, `Expr::Directory`,
`Expr::GlobPattern` to `Value::String` during runtime, we don't
auto-expand the path if it's quoted
### For `ls`
It's really special, because it accepts a `String` as a pattern and
generates the `glob` expression inside the command itself.
So the idea behind the change is to introduce a special SyntaxShape for
`ls`: `SyntaxShape::LsGlobPattern`. That way we can more easily track
whether the pattern was originally quoted, and we don't auto-expand the
path either.
Then, when constructing a glob pattern inside `ls`, we check if the input
pattern is quoted; if so, we escape the input pattern, so we can run `ls
a[123]b` because it's already escaped.
Finally, to accomplish the checking process, we also introduce a new
value type called `Value::QuotedString` to distinguish it from
`Value::String`. It's used to generate an enum called `NuPath`, which is
finally used in the `ls` function. `ls` learns from `NuPath` whether the
user input was quoted.
# User-Facing Changes
This actually contains several changes:
### For arguments annotated with `:path`/`:directory`/`:glob`
#### Before
```nushell
> def foo [p: path] { echo $p }; print (foo "~/a"); print (foo '~/a')
/home/windsoilder/a
/home/windsoilder/a
> def foo [p: directory] { echo $p }; print (foo "~/a"); print (foo '~/a')
/home/windsoilder/a
/home/windsoilder/a
> def foo [p: glob] { echo $p }; print (foo "~/a"); print (foo '~/a')
/home/windsoilder/a
/home/windsoilder/a
```
#### After
```nushell
> def foo [p: path] { echo $p }; print (foo "~/a"); print (foo '~/a')
~/a
~/a
> def foo [p: directory] { echo $p }; print (foo "~/a"); print (foo '~/a')
~/a
~/a
> def foo [p: glob] { echo $p }; print (foo "~/a"); print (foo '~/a')
~/a
~/a
```
### For ls command
`touch '[uwu]'`
#### Before
```
❯ ls -D "[uwu]"
Error: × No matches found for [uwu]
╭─[entry #6:1:1]
1 │ ls -D "[uwu]"
· ───┬───
· ╰── Pattern, file or folder not found
╰────
help: no matches found
```
#### After
```
❯ ls -D "[uwu]"
╭───┬───────┬──────┬──────┬──────────╮
│ # │ name  │ type │ size │ modified │
├───┼───────┼──────┼──────┼──────────┤
│ 0 │ [uwu] │ file │  0 B │ now      │
╰───┴───────┴──────┴──────┴──────────╯
```
# Tests + Formatting
Done
# After Submitting
NaN
# Description
This PR updates a few outdated dependencies.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Constructing the internals of `Record` without checking the lengths is
bad. (also incompatible with changes to how we store records)
- Use `Record::from_raw_cols_vals` in dataframe code
- Use `record!` macro in dataframe test
- Use `record!` in `nu-color-config` tests
- Stop direct record construction in `nu-command`
- Refactor table construction in `from nuon`
# User-Facing Changes
None
# Tests + Formatting
No new tests, updated tests in equal fashion
# Description
Following from #11356, it looks like `Expr::MatchPattern` is no longer
used in any way. This PR removes `Expr::MatchPattern` alongside
`Type::MatchPattern` and `SyntaxShape::MatchPattern`.
# User-Facing Changes
Breaking API change for `nu_protocol`.
# Description
Replace `.to_string()` used in `GenericError` with `.into()` as
`.into()` seems more popular
Replace `Vec::new()` used in `GenericError` with `vec![]` as `vec![]`
seems more popular
(There are so, so many)
Goes towards implementing #10598, which asks for a spread operator in
lists, in records, and when calling commands (continuation of #11006,
which only implements it in lists)
# Description
This PR is for adding a spread operator that can be used when building
records. Additional functionality can be added later.
Changes:
- Previously, the `Expr::Record` variant held `(Expression, Expression)`
pairs. It now holds instances of an enum `RecordItem` (the name isn't
amazing) that allows either a key-value mapping or a spread operator.
- `...` will be treated as the spread operator when it appears before
`$`, `{`, or `(` inside records (no whitespace allowed in between) (not
implemented yet)
- The error message for duplicate columns now includes the column name
itself, because if two spread records are involved in such an error, you
can't tell which field was duplicated from the spans alone
`...` will still be treated as a normal string outside records, and even
in records, it is not treated as a spread operator when not followed
immediately by a `$`, `{`, or `(`.
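For illustration, a hedged sketch of what the new record AST item could look like (variant names are assumptions based on the description above):
```rust
use nu_protocol::{ast::Expression, Span};

pub enum RecordItem {
    /// A `key: value` pair
    Pair(Expression, Expression),
    /// A spread such as `...$rec`, with the span of the `...`
    Spread(Span, Expression),
}
```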
# User-Facing Changes
Users will be able to use `...` when building records.
```
> let rec = { x: 1, ...{ a: 2 } }
> $rec
╭───┬───╮
│ x │ 1 │
│ a │ 2 │
╰───┴───╯
> { foo: bar, ...$rec, baz: blah }
╭─────┬──────╮
│ foo │ bar │
│ x │ 1 │
│ a │ 2 │
│ baz │ blah │
╰─────┴──────╯
```
If you want to update a field of a record, you'll have to use `merge`
instead:
```
> { ...$rec, x: 5 }
Error: nu::shell::column_defined_twice
× Record field or table column used twice: x
╭─[entry #2:1:1]
1 │ { ...$rec, x: 5 }
· ──┬─ ┬
· │ ╰── field redefined here
· ╰── field first defined here
╰────
> $rec | merge { x: 5 }
╭───┬───╮
│ x │ 5 │
│ a │ 2 │
╰───┴───╯
```
# Tests + Formatting
# After Submitting
# Description
Close: #10278
This pr introduces `o>>`, `e>>`, `o+e>>` to allow redirection to append
to a file.
Examples:
```nushell
echo abc o>> a.txt
echo abc o>> a.txt
cat asdf e>> a.txt
cat asdf e>> a.txt
cat asdf o+e>> a.txt
```
~~TODO:~~
~~1. currently internal commands with `o+e>` redirect to a variable is
broken: `let x = "a.txt"; echo abc o+e> $x`, not sure when it was
introduced...~~
~~2. redirect stdout and stderr with append mode doesn't supported yet:
`cat asdf o>>a.txt e>>b.ext`~~
~~For these 2 items, I'd like to fix them in different prs.~~
Already done in this pr
# Description
Added a statement that catches a List passed to `from csv` early and
prints a more helpful error message. This fixes #10081. A similar message
might be useful for other `from_*` calls, but I'm not sure whether any of
those converters accept a List as input.
# User-Facing Changes
# Tests + Formatting
# After Submitting
# Description
Further work towards the goal that we can make `Record`'s fields private
and experiment with different internal representations.
## Details
- Use inplace record iter in `nu-command/math/utils`
- Guarantee that existing allocation can be reused
- Use proper record iterators in `path join`
- Remove unnecessary hashmap in `path join`
- Should minimally reduce the overhead
- Unzip records in `nu-command`
- Refactor `query web` plugin to use record APIs
- Use `Record::into_values` for `values` command
- Use `Record::columns()` in `join` instead.
- Potential minor pessimisation
- Not the hot value path
- Use sane `Record` iters in example `Debug` impl
- Avoid layout assumption in `nu-cmd-extra/roll/mod`
- Potential minor pessimisation
- relegated to `extra`, changing the representation may otherwise break
this op.
- Use record api in `rotate`
- Minor risk that this surfaces some existing invalid behavior as panics
as we now validate column/value lengths
- `extra` so things are unstable
- Remove unnecessary references in `rotate`
- Bonus cleanup
# User-Facing Changes
None functional, minor potential differences in runtime. You win some,
you lose some.
# Tests + Formatting
Relying on existing tests