# Description
Fixes #13021
This changes the `expand_glob()` function to use
`nu_engine::glob_from()` so that absolute paths are actually preserved,
rather than being made relative to the provided parent. This preserves
the intent of whoever wrote the original path/glob, and also makes it so
that tilde always produces absolute paths.
I also made `expand_glob()` handle Ctrl-C so that it can be interrupted.
cc @YizhePKU
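Roughly, the idea looks something like this (a simplified sketch with a stub in place of `nu_engine::glob_from()`, not the actual code in this PR):

```rust
use std::path::{Path, PathBuf};
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;

// Hypothetical stand-in for `nu_engine::glob_from()`; assumed to yield matched
// paths without stripping the parent, so absolute patterns stay absolute.
fn glob_from_stub(pattern: &str, _cwd: &Path) -> Result<Vec<PathBuf>, String> {
    Ok(vec![PathBuf::from(pattern)])
}

// Sketch: expand a glob and bail out if the Ctrl-C flag has been set.
fn expand_glob_sketch(
    pattern: &str,
    cwd: &Path,
    interrupt: Option<Arc<AtomicBool>>,
) -> Result<Vec<PathBuf>, String> {
    let mut out = Vec::new();
    for path in glob_from_stub(pattern, cwd)? {
        // Check between matches so a huge expansion can be interrupted.
        if interrupt
            .as_ref()
            .is_some_and(|flag| flag.load(Ordering::Relaxed))
        {
            return Err("interrupted by ctrl-c".into());
        }
        out.push(path);
    }
    Ok(out)
}
```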
# Tests + Formatting
No additional tests here... but that might be a good idea.
# Description
Fixes #13016 and adds tests for many variations of external call
parsing.
I just realized @kubouch took a crack at this too (#13022), so really
whichever is better can be taken, but I think the tests are a good
addition either way.
# Description
Makes `run-external` error if arguments to `cmd.exe` internal commands
contain newlines or a percent sign. This is because the percent sign can
expand environment variables, potentially allowing command injection.
Newlines, I think, will truncate the rest of the arguments, so they
should probably be disallowed to be safe.
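The check itself is simple; something along these lines (a sketch of the rule, not the exact code added here):

```rust
// Sketch: reject argument text that cmd.exe could reinterpret when invoking
// one of its internal commands.
fn check_cmd_internal_arg(arg: &str) -> Result<(), String> {
    if arg.contains('%') {
        // `%VAR%` is expanded by cmd.exe, which can smuggle in extra commands.
        return Err(format!("percent signs are not allowed in cmd arguments: {arg}"));
    }
    if arg.contains('\r') || arg.contains('\n') {
        // A newline cuts the command line short, dropping later arguments.
        return Err(format!("newlines are not allowed in cmd arguments: {arg:?}"));
    }
    Ok(())
}
```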
# After Submitting
- If the user calls `cmd.exe` directly, then this bypasses our
handling/checking for internal `cmd` commands. Instead, we use the
handling from the Rust std lib which, in this case, does not do special
handling and is potentially unsafe. Then again, it could be the user's
specific intention to run `cmd` with whatever trusted input. The problem
is that since we use the std lib handling, it assumes the exe uses the C
runtime escaping rules and will perform some unwanted escaping. E.g., it
will add backslashes to the quotes in `cmd echo /c '""'`.
- If `cmd` is called indirectly via a `.bat` or `.cmd` file, then we use
the Rust std lib which has separate handling for bat files that should
be safe, but will reject some inputs.
- ~~I'm not sure how we handle `PATHEXT`, that can also cause a file
without an extension to be run as a bat file. If so, I don't know where
the handling, if any, is done for that.~~ It looks like we use the
`which` crate to do the lookup using `PATHEXT`. Then, we pass the exe
path from that to the Rust std lib `Command`, which should be safe
(except for the first `cmd.exe` note).
So, in the future we need to unify and/or fix these different
implementations, including our own special handling for internal `cmd`
commands that this PR tries to fix.
# Description
Fix a regression introduced by #12921, where tilde expansion was no
longer done on the external command name, breaking things like
```nushell
> ~/.cargo/bin/exa
```
This properly handles quoted strings, so they don't expand:
```nushell
> ^"~/.cargo/bin/exa"
Error: nu:🐚:external_command
× External command failed
╭─[entry #1:1:2]
1 │ ^"~/.cargo/bin/exa"
· ─────────┬────────
· ╰── Command `~/.cargo/bin/exa` not found
╰────
help: `~/.cargo/bin/exa` is neither a Nushell built-in or a known external command
```
This required a change to the parser, so the command name is also parsed
in the same way the arguments are - i.e. the quotes on the outside
remain in the expression. Hopefully that doesn't break anything else. 🤞
Fixes #13000. This should be included in the 0.94.1 patch release.
cc @YizhePKU
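To make the behavior concrete, here's a rough sketch of the quoting rule (illustrative only, not the parser or `run-external` code itself): quotes stay on the expression, and tilde expansion only applies when the command name was unquoted.

```rust
// Sketch: strip quotes from a quoted command name but keep it literal;
// expand `~` only for unquoted names. (`HOME` is used here for brevity.)
fn expand_command_name(raw: &str) -> String {
    let is_quoted = raw.len() >= 2
        && ((raw.starts_with('"') && raw.ends_with('"'))
            || (raw.starts_with('\'') && raw.ends_with('\'')));
    if is_quoted {
        // `^"~/.cargo/bin/exa"` stays literally `~/.cargo/bin/exa`.
        raw[1..raw.len() - 1].to_string()
    } else if let Some(rest) = raw.strip_prefix("~/") {
        // `^~/.cargo/bin/exa` gets the home directory substituted in.
        let home = std::env::var("HOME").unwrap_or_default();
        format!("{home}/{rest}")
    } else {
        raw.to_string()
    }
}
```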
# User-Facing Changes
- Tilde expansion now works again for external commands
- The `command` of `run-external` will now have its quotes removed like
the other arguments if it is a literal string
- The parser is changed to include quotes in the command expression of
`ExternalCall` if they were present
# Tests + Formatting
I would like to add a regression test for this, but it's complicated
because we need a well-known binary within the home directory, which
just isn't a thing. We could drop one there, but that's kind of a bad
behavior for a test to do. I also considered changing the home directory
for the test, but that's platform-specific - it could potentially be made
to work on specific platforms, though. Changing the `HOME` env var on
Linux definitely works as far as tilde expansion is concerned.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# Description
This fixes a bug in the `OSC 9;9` functionality where the path wasn't
being constructed properly and therefore wasn't getting set right for
things like "Duplicate Tab" in Windows Terminal. Thanks to @Araxeus for
finding it.
Related to https://github.com/nushell/nushell/issues/10166
This PR fixes the `path type` command so that it resolves relative paths
using PWD from the engine state.
As a bonus, it also fixes the issue of `path type` returning an empty
string instead of an error when it fails.
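Conceptually the lookup now works something like this (a simplified sketch with made-up names, not the actual command code):

```rust
use std::path::{Path, PathBuf};

// Sketch: resolve the argument against the engine's $env.PWD rather than the
// process working directory, and propagate errors instead of returning "".
fn path_type_sketch(arg: &str, pwd: &Path) -> std::io::Result<String> {
    let resolved: PathBuf = if Path::new(arg).is_absolute() {
        PathBuf::from(arg)
    } else {
        pwd.join(arg)
    };
    let meta = std::fs::symlink_metadata(&resolved)?; // error surfaces to the user
    let ty = if meta.file_type().is_symlink() {
        "symlink"
    } else if meta.is_dir() {
        "dir"
    } else {
        "file"
    };
    Ok(ty.to_string())
}
```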
This PR fixes a bug where `.` is expanded into an empty string when used
as an argument to external commands. Fixes
https://github.com/nushell/nushell/issues/12948.
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
@maxim-uvarov did a ton of research and work with the dply-rs author and
ritchie from polars and found out that the allocator matters on macos
and it seems to be what was messing up the performance of polars plugin.
ritchie suggested to use jemalloc but i switched it to mimalloc to match
nushell and it seems to run better.
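For reference, switching the global allocator in a Rust crate is a one-liner with the `mimalloc` crate; the plugin presumably does something equivalent to this minimal example:

```rust
use mimalloc::MiMalloc;

// Route every heap allocation in this binary through mimalloc.
#[global_allocator]
static GLOBAL: MiMalloc = MiMalloc;

fn main() {
    let v: Vec<u64> = (0..1_000_000).collect();
    println!("allocated {} elements with mimalloc", v.len());
}
```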
## Before (default allocator)
Note: using 1..10 vs 1..100 since it takes so long. Also notice how
high the `max` timings are compared to mimalloc below.
```nushell
❯ 1..10 | each {timeit {polars open Data7602DescendingYearOrder.csv | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null}} | {mean: ($in | math avg), min: ($in | math min), max: ($in | math max), stddev: ($in | into int | into float | math stddev | into int | $'($in)ns' | into duration)}
╭────────┬─────────────────────────╮
│ mean │ 4sec 999ms 605µs 995ns │
│ min │ 983ms 627µs 42ns │
│ max │ 13sec 398ms 135µs 791ns │
│ stddev │ 3sec 476ms 479µs 939ns │
╰────────┴─────────────────────────╯
❯ use std bench
❯ bench { polars open Data7602DescendingYearOrder.csv | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null } -n 10
╭───────┬────────────────────────╮
│ mean │ 6sec 220ms 783µs 983ns │
│ min │ 1sec 184ms 997µs 708ns │
│ max │ 18sec 882ms 81µs 708ns │
│ std │ 5sec 350ms 375µs 697ns │
│ times │ [list 10 items] │
╰───────┴────────────────────────╯
```
## After (using mimalloc)
```nushell
❯ 1..100 | each {timeit {polars open Data7602DescendingYearOrder.csv | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null}} | {mean: ($in | math avg), min: ($in | math min), max: ($in | math max), stddev: ($in | into int | into float | math stddev | into int | $'($in)ns' | into duration)}
╭────────┬───────────────────╮
│ mean │ 103ms 728µs 902ns │
│ min │ 97ms 107µs 42ns │
│ max │ 149ms 430µs 84ns │
│ stddev │ 5ms 690µs 664ns │
╰────────┴───────────────────╯
❯ use std bench
❯ bench { polars open Data7602DescendingYearOrder.csv | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null } -n 100
╭───────┬───────────────────╮
│ mean │ 103ms 620µs 195ns │
│ min │ 97ms 541µs 166ns │
│ max │ 130ms 262µs 166ns │
│ std │ 4ms 948µs 654ns │
│ times │ [list 100 items] │
╰───────┴───────────────────╯
```
## After (using jemalloc - just for comparison)
```nushell
❯ 1..100 | each {timeit {polars open Data7602DescendingYearOrder.csv | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null}} | {mean: ($in | math avg), min: ($in | math min), max: ($in | math max), stddev: ($in | into int | into float | math stddev | into int | $'($in)ns' | into duration)}
╭────────┬───────────────────╮
│ mean │ 113ms 939µs 777ns │
│ min │ 108ms 337µs 333ns │
│ max │ 166ms 467µs 458ns │
│ stddev │ 6ms 175µs 618ns │
╰────────┴───────────────────╯
❯ use std bench
❯ bench { polars open Data7602DescendingYearOrder.csv | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null } -n 100
╭───────┬───────────────────╮
│ mean │ 114ms 363µs 530ns │
│ min │ 108ms 804µs 833ns │
│ max │ 143ms 521µs 459ns │
│ std │ 5ms 88µs 56ns │
│ times │ [list 100 items] │
╰───────┴───────────────────╯
```
## After (using parquet + mimalloc)
```nushell
❯ 1..100 | each {timeit {polars open data.parquet | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null}} | {mean: ($in | math avg), min: ($in | math min), max: ($in | math max), stddev: ($in | into int | into float | math stddev | into int | $'($in)ns' | into duration)}
╭────────┬──────────────────╮
│ mean │ 34ms 255µs 492ns │
│ min │ 31ms 787µs 250ns │
│ max │ 76ms 408µs 416ns │
│ stddev │ 4ms 472µs 916ns │
╰────────┴──────────────────╯
❯ use std bench
❯ bench { polars open data.parquet | polars group-by year | polars agg (polars col geo_count | polars sum) | polars collect | null } -n 100
╭───────┬──────────────────╮
│ mean │ 34ms 897µs 562ns │
│ min │ 31ms 518µs 542ns │
│ max │ 65ms 943µs 625ns │
│ std │ 3ms 450µs 741ns │
│ times │ [list 100 items] │
╰───────┴──────────────────╯
```
# Description
Instead of returning an error, this PR changes `expand_glob` in
`run_external.rs` to return the original string arg if glob creation
failed. This makes it so that, e.g.,
```nushell
^echo `[`
^echo `***`
```
no longer fail with a shell error. (This follows from #12921.)
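The fallback is essentially this (a sketch using the public `glob` crate for illustration; the real change lives in `run_external.rs`):

```rust
// Sketch: if the argument doesn't compile as a glob pattern (`[`, `***`, ...),
// hand the literal string to the external command instead of erroring.
fn expand_glob_or_passthrough(arg: &str) -> Vec<String> {
    match glob::Pattern::new(arg) {
        Ok(_pattern) => {
            // The real code would walk the filesystem and return the matches here.
            vec![arg.to_string()]
        }
        Err(_) => vec![arg.to_string()],
    }
}
```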
# Description
Currently, the pipeline `open --raw file | take 100` doesn't work,
since the type of the byte stream is `Unknown`, but `take` expects
`Binary` streams. This PR changes commands that expect
`ByteStreamType::Binary` to also work with `ByteStreamType::Unknown`.
This was done by adding two new methods to `ByteStreamType`:
`is_binary_coercible` and `is_string_coercible`. These return true if
the type is `Unknown` or matches the type in the method name.
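In other words (a sketch of the idea; the real enum lives in `nu-protocol`):

```rust
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum ByteStreamType {
    Binary,
    String,
    Unknown,
}

impl ByteStreamType {
    // An `Unknown` stream is allowed anywhere a binary stream is expected...
    fn is_binary_coercible(self) -> bool {
        matches!(self, ByteStreamType::Binary | ByteStreamType::Unknown)
    }

    // ...and anywhere a string stream is expected.
    fn is_string_coercible(self) -> bool {
        matches!(self, ByteStreamType::String | ByteStreamType::Unknown)
    }
}
```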
# Description
Makes the `from json --objects` command produce a stream, and read
lazily from an input stream to produce its output.
Also added a helper, `PipelineData::get_type()`, to make it easier to
construct a wrong type error message when matching on `PipelineData`. I
expect checking `PipelineData` for either a string value or an `Unknown`
or `String` typed `ByteStream` will be very, very common. I would have
liked to have a helper that just returns a readable stream from either,
but that would either be a bespoke enum or a `Box<dyn BufRead>`, which
feels like it wouldn't be so great for performance. So the approach I
took here is probably better: having a function that accepts an
`impl BufRead`, and matching on the `PipelineData` to call it.
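As a rough illustration of that shape (simplified, not the actual command code; the real version reports I/O errors rather than dropping them):

```rust
use std::io::BufRead;

// Sketch: lazily parse one JSON document per non-empty line from any reader.
fn json_objects_lines(
    reader: impl BufRead,
) -> impl Iterator<Item = Result<serde_json::Value, serde_json::Error>> {
    reader
        .lines()
        .filter_map(Result::ok) // simplification: I/O errors are dropped here
        .filter(|line| !line.trim().is_empty())
        .map(|line| serde_json::from_str(&line))
}

fn main() {
    let input: &[u8] = b"{\"a\": 1}\n\n{\"a\": 2}\n";
    for value in json_objects_lines(input) {
        println!("{:?}", value.unwrap());
    }
}
```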
# User-Facing Changes
- `from json --objects` no longer collects its input, and can be used
for large datasets or streams that produce values over time.
# Tests + Formatting
All passing.
# After Submitting
- [ ] release notes
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
This reverts commit 68adc4657f.
# Description
Reverts the lazyframe refactor (#12669) for the next release, since
there are still a few lingering issues. This temporarily solves #12863
and #12828. After the release, the lazyframes can be added back and
cleaned up.
Another very boring PR cleaning up and documenting some of `explore`'s
innards. Mostly renaming things that I found confusing or vague when
reading through the code, also adding some comments.
# Description
Fixes: #12941
~~The issue is caused by some columns (is_builtin, is_plugin, is_custom,
is_keyword) being removed in #10023~~
Edit: I'm wrong
# Tests + Formatting
Added one test for `std help`
# Description
Following from #12523, this PR removes support for lists of environment
variables in the `with-env` command. Now, only records are supported.
# After Submitting
Update examples using the list form in the docs and book.
Small change, removing 4 more configuration options from `explore`'s
binary viewer:
1. `show_index`
2. `show_data`
3. `show_ascii`
4. `show_split`
These controlled whether the 3 columns in the binary viewer (index, hex
data, ASCII) and the pipe separator (`|`) in between them are shown. I
don't think we need this level of configurability until the `explore`
command is more mature, and maybe even not then; we can just show them
all.
I think it's very unlikely that anyone is using these configuration
points.
Also, the row offset (i.e. how many rows we have scrolled down) was
being stored in config/settings when it's arguably not config; it's more
like internal state of the binary viewer. I moved it to a more
appropriate location and renamed it.
# Description
```nushell
❯ ls
╭───┬───────┬──────┬──────┬──────────╮
│ # │ name │ type │ size │ modified │
├───┼───────┼──────┼──────┼──────────┤
│ 0 │ a.txt │ file │ 0 B │ now │
╰───┴───────┴──────┴──────┴──────────╯
❯ ls a.
NO RECORDS FOUND
```
There was a completion issue on a previous version; I think @amtoine has
reproduced it before. But currently I can't reproduce it on the latest
main. To avoid such a regression, I added some tests for completion.
---------
Co-authored-by: Antoine Stevan <44101798+amtoine@users.noreply.github.com>
# Description
Fixes: https://github.com/nushell/nushell/issues/7761
It's still unclear whether we want to change the range semantics
themselves, but it's good to keep range semantics consistent between
nushell commands.
# User-Facing Changes
### Before
```nushell
❯ "abc" | str substring 1..=2
b
```
### After
```nushell
❯ "abc" | str substring 1..=2
bc
```
# Tests + Formatting
Adjusted tests to fit the new behavior.
As discussed in https://github.com/nushell/nushell/pull/12749, we no
longer need to call `std::env::set_current_dir()` to sync `$env.PWD`
with the actual working directory. This PR removes the call from
`EngineState::merge_env()`.
# Description
1. With the `-l` flag, `debug profile` now collects the files and line
numbers of profiled pipeline elements
![profiler_lines](https://github.com/nushell/nushell/assets/25571562/b400a956-d958-4aff-aa4c-7e65da3f78fa)
2. Errors from the profiled closure are now reported instead of silently
ignored.
![profiler_lines_error](https://github.com/nushell/nushell/assets/25571562/54f7ad7a-06a3-4d56-92c2-c3466917bee8)
# User-Facing Changes
New `--lines(-l)` flag to `debug profile`. The command will also fail if
the profiled closure fails, so technically it is a breaking change.
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
Implements streaming for:
- `from csv`
- `from tsv`
- `to csv`
- `to tsv`
via the new string-typed ByteStream support.
# User-Facing Changes
Commands above. Also:
- `to csv` and `to tsv` now have `--columns <List(String)>`, to provide
the exact columns desired in the output. This is required for them to
have streaming output, because otherwise collecting the entire list is
necessary to determine the output columns. If we introduce
`TableStream`, this may become less necessary.
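To illustrate why knowing the columns up front matters for streaming, here's a toy sketch (not the actual `to csv` implementation; CSV quoting/escaping is omitted):

```rust
use std::collections::HashMap;
use std::io::Write;

// Sketch: with the column order fixed in advance, the header and each row can
// be written as records arrive, without collecting the whole input first.
fn write_csv_stream<W: Write>(
    mut out: W,
    columns: &[&str],
    rows: impl Iterator<Item = HashMap<String, String>>,
) -> std::io::Result<()> {
    writeln!(out, "{}", columns.join(","))?;
    for row in rows {
        let cells: Vec<&str> = columns
            .iter()
            .map(|col| row.get(*col).map(String::as_str).unwrap_or(""))
            .collect();
        writeln!(out, "{}", cells.join(","))?;
    }
    Ok(())
}
```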
# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
# After Submitting
- [ ] release notes
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
I feel like it's a little sad that BSDs get to enjoy almost everything
other than the `ps` command, and there are some tests that rely on this
command, so I figured it would be fun to patch that and make it work.
The different BSDs have diverged from each other somewhat, but generally
have a similar enough API for reading process information via
`sysctl()`, with some slightly different args.
This supports FreeBSD with the `freebsd` module, and NetBSD and OpenBSD
with the `netbsd` module. OpenBSD is a fork of NetBSD and the interface
has some minor differences but many things are the same.
I had wanted to try to support DragonFlyBSD too, but their Rust version
in the latest release is only 1.72.0, which is too old for me to want to
try to compile rustc up to 1.77.2... but I will revisit this whenever
they do update it. Dragonfly is a fork of FreeBSD, so it's likely to be
more or less the same - I just don't want to enable it without testing
it.
Fixes #6862 (partially; we probably won't be adding `zfs list`)
# User-Facing Changes
`ps` added for FreeBSD, NetBSD, and OpenBSD.
# Tests + Formatting
The CI doesn't run tests for BSDs, so I'm not entirely sure if
everything was already passing before. (Frankly, it's unlikely.) But
nothing appears to be broken.
# After Submitting
- [ ] release notes?
- [ ] DragonflyBSD, whenever they do update Rust to something close
enough for me to try it
Bumps [shadow-rs](https://github.com/baoyachi/shadow-rs) from 0.27.1 to
0.28.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/baoyachi/shadow-rs/releases">shadow-rs's
releases</a>.</em></p>
<blockquote>
<h2>fix cargo clippy</h2>
<p><a
href="https://redirect.github.com/baoyachi/shadow-rs/issues/160">#160</a></p>
<p>Thx <a href="https://github.com/qartik"><code>@qartik</code></a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ba9f8b0c2b"><code>ba9f8b0</code></a>
Update Cargo.toml</li>
<li><a
href="d1b724c1e7"><code>d1b724c</code></a>
Merge pull request <a
href="https://redirect.github.com/baoyachi/shadow-rs/issues/160">#160</a>
from qartik/patch-1</li>
<li><a
href="505108d5d6"><code>505108d</code></a>
Allow missing_docs for deprecated CLAP_VERSION constant</li>
<li>See full diff in <a
href="https://github.com/baoyachi/shadow-rs/compare/v0.27.1...v0.28.0">compare
view</a></li>
</ul>
</details>
<br />
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=shadow-rs&package-manager=cargo&previous-version=0.27.1&new-version=0.28.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.5
to 4.1.6.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/checkout/releases">actions/checkout's
releases</a>.</em></p>
<blockquote>
<h2>v4.1.6</h2>
<h2>What's Changed</h2>
<ul>
<li>Check platform to set archive extension appropriately by <a
href="https://github.com/cory-miller"><code>@cory-miller</code></a> in
<a
href="https://redirect.github.com/actions/checkout/pull/1732">actions/checkout#1732</a></li>
<li>Update for 4.1.6 release by <a
href="https://github.com/cory-miller"><code>@cory-miller</code></a> in
<a
href="https://redirect.github.com/actions/checkout/pull/1733">actions/checkout#1733</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/checkout/compare/v4.1.5...v4.1.6">https://github.com/actions/checkout/compare/v4.1.5...v4.1.6</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/actions/checkout/blob/main/CHANGELOG.md">actions/checkout's
changelog</a>.</em></p>
<blockquote>
<h2>v4.1.6</h2>
<ul>
<li>Check platform to set archive extension appropriately by <a
href="https://github.com/cory-miller"><code>@cory-miller</code></a> in
<a
href="https://redirect.github.com/actions/checkout/pull/1732">actions/checkout#1732</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a5ac7e51b4"><code>a5ac7e5</code></a>
Update for 4.1.6 release (<a
href="https://redirect.github.com/actions/checkout/issues/1733">#1733</a>)</li>
<li><a
href="24ed1a3528"><code>24ed1a3</code></a>
Check platform for extension (<a
href="https://redirect.github.com/actions/checkout/issues/1732">#1732</a>)</li>
<li>See full diff in <a
href="https://github.com/actions/checkout/compare/v4.1.5...v4.1.6">compare
view</a></li>
</ul>
</details>
<br />
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/checkout&package-manager=github_actions&previous-version=4.1.5&new-version=4.1.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
# Description
While each of the `help <subcommands>` in `std` had completers, there
wasn't one for the main `help` command.
This adds all internals and custom commands (as with `help commands`) as
possible completions.
# User-Facing Changes
`help ` + <kbd>Tab</kbd> will now suggest completions for both the `help
<subcommands>` as well as all internal and custom commands.
# Tests + Formatting
Note: Cannot add tests for the completion functions since they are
module-internal and not visible to test cases, as far as I can see.
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
- **Remove unused `pathdiff` dep in `nu-cli`**
- **Remove unused `serde_json` dep on `nu-protocol`**
- Unnecessary after moving the plugin file to msgpack (still a
dev-dependency)
Some minor changes to `explore`, continuing on my mission to simplify
the command in preparation for a larger UX overhaul:
1. Consolidate padding configuration. I don't think we need separate
config points for the (optional) index column and regular data columns
in the normal pager, they can share padding configuration. Likewise, in
the binary viewer all 3 columns (index, data, ASCII) had their
left+right padding configured independently.
2. Update `explore` so we use the binary viewer for the new `ByteStream`
type. `cat foo.txt | into binary | explore` was not using the binary
viewer after the `ByteStream` changes.
3. Tweak the naming of a few helper functions, add a comment
I've put the changes in separate commits to make them easier to review.
---------
Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
# Description
Removes the old `nu-cmd-dataframe` crate in favor of the polars plugin.
As such, this PR also removes the `dataframe` feature, related CI, and
full releases of nushell.
# Description
This PR adds min and max to the bench command.
```nushell
❯ use std bench
❯ bench { dply -c 'parquet("./data.parquet") | group_by(year) | summarize(count = n(), sum = sum(geo_count)) | show()' | complete | null } --rounds 100 --verbose
100 / 100
╭───────┬───────────────────╮
│ mean │ 71ms 358µs 850ns │
│ min │ 66ms 457µs 583ns │
│ max │ 120ms 338µs 167ns │
│ std │ 6ms 553µs 949ns │
│ times │ [list 100 items] │
╰───────┴───────────────────╯
```
# Description
This PR allows byte streams to optionally be colored as being
specifically binary or string data, which guarantees that they'll be
converted to `Binary` or `String` appropriately on `into_value()`,
making them compatible with `Type` guarantees. This makes them
significantly more broadly usable for command input and output.
There is still an `Unknown` type for byte streams coming from external
commands, which keeps the previous behavior: the stream becomes a string
if it's valid UTF-8.
A small number of commands were updated to take advantage of this, just
to prove the point. I will be adding more after this merges.
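Conceptually, the conversion rule looks like this (illustrative types only, not the real `nu-protocol` API):

```rust
enum ByteStreamType {
    Binary,
    String,
    Unknown,
}

enum Value {
    Binary(Vec<u8>),
    String(String),
}

// Sketch: the stream's declared type decides what the collected bytes become.
fn into_value(bytes: Vec<u8>, ty: ByteStreamType) -> Result<Value, String> {
    match ty {
        // Explicitly binary-typed streams always collect to Binary.
        ByteStreamType::Binary => Ok(Value::Binary(bytes)),
        // Explicitly string-typed streams must be valid UTF-8.
        ByteStreamType::String => String::from_utf8(bytes)
            .map(Value::String)
            .map_err(|e| e.to_string()),
        // Unknown (e.g. external commands): string if it happens to be UTF-8,
        // otherwise binary - the old behavior.
        ByteStreamType::Unknown => Ok(match String::from_utf8(bytes) {
            Ok(s) => Value::String(s),
            Err(e) => Value::Binary(e.into_bytes()),
        }),
    }
}
```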
# User-Facing Changes
- New types in `describe`: `string (stream)`, `binary (stream)`
- These commands now return a stream if their input was a stream:
- `into binary`
- `into string`
- `bytes collect`
- `str join`
- `first` (binary)
- `last` (binary)
- `take` (binary)
- `skip` (binary)
- Streams that are explicitly binary colored will print as a streaming
hexdump
- example:
```nushell
1.. | each { into binary } | bytes collect
```
# Tests + Formatting
I've added some tests to cover it at a basic level, and it doesn't break
anything existing, but I do think more would be nice. Some of those will
come when I modify more commands to stream.
# After Submitting
There are a few things I'm not quite satisfied with:
- **String trimming behavior.** We automatically trim newlines from
streams from external commands, but I don't think we should do this with
internal commands. If I call a command that happens to turn my string
into a stream, I don't want the newline to suddenly disappear. I changed
this to specifically do it only on `Child` and `File`, but I don't know
if this is quite right, and maybe we should bring back the old flag for
`trim_end_newline`
- **Known binary always resulting in a hexdump.** It would be nice to
have a `print --raw`, so that we can put binary data on stdout
explicitly if we want to. This PR doesn't change how external commands
work though - they still dump straight to stdout.
Otherwise, here's the normal checklist:
- [ ] release notes
- [ ] docs update for plugin protocol changes (added `type` field)
---------
Co-authored-by: Ian Manske <ian.manske@pm.me>
# Description
Changes `get_full_help` to take a `&dyn Command` instead of multiple
arguments (`&Signature`, `&Examples`, `is_parser_keyword`). All of these
arguments can be gathered from a `Command`, so there is no need to pass
the pieces to `get_full_help`.
This PR also fixes an issue where the search terms are not shown if
`--help` is used on a command.
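Roughly, the signature change looks like this (a sketch with a simplified trait; the real `Command` trait has many more methods):

```rust
// Simplified stand-in for nu-protocol's Command trait.
trait Command {
    fn usage(&self) -> String;
    fn search_terms(&self) -> Vec<&str>;
}

// Before (roughly):
//   fn get_full_help(sig: &Signature, examples: &Examples, is_parser_keyword: bool) -> String
// After: everything the help text needs is read off the command itself.
fn get_full_help(command: &dyn Command) -> String {
    let mut help = command.usage();
    let terms = command.search_terms();
    if !terms.is_empty() {
        // Search terms are now included even when `--help` is used on a command.
        help.push_str(&format!("\nSearch terms: {}", terms.join(", ")));
    }
    help
}
```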
# Description
There is a bug when `hide-env` is used on environment variables that
were present at shell startup. Namely, child processes still inherit the
hidden environment variable. This PR fixes #12900, fixes #11495, and
fixes #7937.
# Tests + Formatting
Added a test.
# Description
Kind of a vague title, but this PR does two main things:
1. Rather than overriding functions like `Command::is_parser_keyword`,
this PR instead changes commands to override `Command::command_type`.
The `CommandType` returned by `Command::command_type` is then used to
automatically determine whether `Command::is_parser_keyword` and the
other `is_{type}` functions should return true. These changes allow us
to remove the `CommandType::Other` case and should also guarantee than
only one of the `is_{type}` functions on `Command` will return true.
2. Uses the new, reworked `Command::command_type` function in the `scope
commands` and `which` commands.
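A sketch of the pattern from item 1 (simplified; the real `CommandType` and `Command` live in `nu-protocol`):

```rust
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum CommandType {
    Builtin,
    Custom,
    Keyword,
    External,
    Plugin,
}

trait Command {
    // Commands override this once...
    fn command_type(&self) -> CommandType {
        CommandType::Builtin
    }

    // ...and the `is_{type}` predicates are derived from it, so at most one of
    // them can be true for any given command.
    fn is_keyword(&self) -> bool {
        self.command_type() == CommandType::Keyword
    }

    fn is_plugin(&self) -> bool {
        self.command_type() == CommandType::Plugin
    }
}
```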
# User-Facing Changes
- Breaking change for `scope commands`: multiple columns (`is_builtin`,
`is_keyword`, `is_plugin`, etc.) have been merged into the `type`
column.
- Breaking change: the `which` command can now report `plugin` or
`keyword` instead of `built-in` in the `type` column. It may also now
report `external` instead of `custom` in the `type` column for known
`extern`s.
# Description
This PR makes some commands and areas of code preserve pipeline
metadata. This is in an attempt to make the issue described in #12599
and #9456 less likely to occur - namely, that reading from and writing
to the same file in a pipeline results in an empty file. Since we preserve
metadata in more places now, there will be a higher chance that we
successfully detect this error case and abort the pipeline.
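As a heavily simplified sketch of the kind of check this enables (names are made up for illustration; the actual nushell types differ):

```rust
use std::path::{Path, PathBuf};

// Hypothetical metadata carrying the file the pipeline data came from.
struct PipelineMetadata {
    data_source_path: Option<PathBuf>,
}

// Sketch: if the data still knows which file it came from, a save to that same
// file can be rejected instead of silently truncating it mid-read.
fn check_save_target(metadata: &PipelineMetadata, target: &Path) -> Result<(), String> {
    if metadata.data_source_path.as_deref() == Some(target) {
        return Err(format!(
            "{} is both the input and the output of this pipeline",
            target.display()
        ));
    }
    Ok(())
}
```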