Compare commits

...

91 Commits

Author SHA1 Message Date
JT
4e8e03867c Update release.yml 2022-01-19 04:51:21 +11:00
JT
49e8af8ea5 Bump to 0.43 (#4264) 2022-01-18 12:06:12 -05:00
JT
d5d61d14b3 Tutor eq (#4263)
* Fix clippy lints

* Fix clippy lints

* Fix clippy lints

* Add e-q tutor page
2022-01-19 03:22:23 +11:00
JT
f562a4526c Fix clippy lints (#4262)
* Fix clippy lints

* Fix clippy lints

* Fix clippy lints
2022-01-18 23:33:28 +11:00
e6c09f2dfc Update sysinfo version (#4261) 2022-01-18 22:37:52 +11:00
73a68954c4 Bump follow-redirects from 1.14.4 to 1.14.7 in /samples/wasm (#4258)
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.14.4 to 1.14.7.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.14.4...v1.14.7)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-01-18 22:37:06 +11:00
476d543dee Update descriptions for crates split out from nu-cli (#4247)
`nu-command` and `nu-data` were split out, but the descriptions still
say 'CLI'.

Signed-off-by: Michel Alexandre Salim <salimma@fedoraproject.org>
2022-01-09 06:05:50 -06:00
398502b0d6 fix docs/sample_config/config.toml: use env.PROMPT_COMMAND (#4241) 2022-01-02 17:35:07 -06:00
JT
62011b6bcc Bump to 0.42 (#4234) 2021-12-28 20:56:59 +11:00
1214cd57e8 bat: use regex-onig instead of regex-fancy (#4226)
Fixes #4224

Signed-off-by: nibon7 <nibon7@163.com>
2021-12-24 08:34:59 -06:00
6cd124ddb2 allow insecure server connections when using SSL (#4219)
Fixes #4211

Signed-off-by: nibon7 <nibon7@163.com>
2021-12-23 06:48:43 +11:00
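
For context, nu's `fetch`/`post` are built on reqwest in this release line (see the Cargo.toml diff further down), and reqwest exposes a switch for exactly this. A minimal sketch of wiring an insecure flag through; only `danger_accept_invalid_certs` is the real reqwest API, the surrounding function is illustrative rather than nu's actual code:

```rust
use reqwest::Client;

// Sketch: honor an `--insecure` style flag by disabling certificate
// verification on the HTTP client.
async fn fetch_text(url: &str, allow_insecure: bool) -> Result<String, reqwest::Error> {
    let client = Client::builder()
        .danger_accept_invalid_certs(allow_insecure)
        .build()?;
    client.get(url).send().await?.text().await
}
```
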
d32aec5906 Don't panic if the other end of std{out,err} is closed (#4179)
* fix #4161

println! and friends will panic on BrokenPipe. The solution is to use
writeln! instead, and ignore the error (or do we want to do something else?)

* test that nu doesn't panic in case of BrokenPipe error

* fixup! test that nu doesn't panic in case of BrokenPipe error

* make do_not_panic_if_broken_pipe only run on UNIX systems
2021-12-21 10:08:41 +11:00
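
The pattern described in this commit, as a standalone sketch: write through `writeln!` and swallow the error instead of letting `println!` panic when the reader side of the pipe has gone away.

```rust
use std::io::{self, Write};

// Sketch of the BrokenPipe-safe replacement for println!.
fn print_line(msg: &str) {
    let stdout = io::stdout();
    let mut handle = stdout.lock();
    // println! panics if the write fails (e.g. `nu ... | head` closes the pipe);
    // writeln! returns the error, which we deliberately ignore here.
    let _ = writeln!(handle, "{}", msg);
}
```
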
e919f9a73b use heck for string casing (#4081)
I removed the Inflector dependency in favor of heck for two reasons:
- to close #3674.
- heck seems simpler and actively maintained

We could probably alter the structure of the `str_` module to expose the
individual casing behaviors better.
I did not feel as confident on changing those signatures.

So I took the lazier approach of a macro in `mod.rs` that creates the public
functions shimming to heck's traits.
2021-12-14 09:43:48 -06:00
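
For reference, heck models each casing as a trait on string slices; a small sketch of the sort of shim functions described above (the function names here are illustrative, not the ones in `str_`):

```rust
use heck::{ToKebabCase, ToSnakeCase};

// Illustrative shims over heck's traits, in the spirit of the macro-generated
// functions mentioned in the commit message.
pub fn snake_case(input: &str) -> String {
    input.to_snake_case()
}

pub fn kebab_case(input: &str) -> String {
    input.to_kebab_case()
}

fn main() {
    assert_eq!(snake_case("NuShell rocks"), "nu_shell_rocks");
    assert_eq!(kebab_case("NuShell rocks"), "nu-shell-rocks");
}
```
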
a3c349746f ci: update macOS agent (#4207)
10.14 has been deprecated: https://github.com/Azure/azure-sdk-for-cpp/issues/3168

This hopefully fixes recent CI failures!
2021-12-14 08:55:51 -06:00
b5f8f64d79 ci: fix macOS agent (#4203)
I noticed the agent documentation uses uppercase: https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops&tabs=yaml
2021-12-13 13:08:03 -06:00
1576b959f9 update feat template (#4201) 2021-12-12 16:15:31 -06:00
4096f52003 update templates2 (#4200) 2021-12-12 16:11:27 -06:00
7ceb668419 Revert "try out title change (#4198)" (#4199)
This reverts commit 420aee18ca.
2021-12-12 16:06:07 -06:00
420aee18ca try out title change (#4198) 2021-12-12 16:05:24 -06:00
pin
15e9c11849 Fix build on NetBSD (#4192) 2021-12-09 14:23:40 +02:00
9fd680ae2b fix: Implicit coercion of boolean false and empty value #4094 (#4120)
Signed-off-by: closetool <c299999999@qq.com>
2021-12-09 14:19:51 +02:00
ad94ed5e13 Fix Configuration section in bug report template (#4181)
* Fix Configuration section in bug report template

Change the placeholder content to actually match the `to md` output, and add `--pretty`

* Don't omit data in placeholder configuration table

* Remove blank line in bug_report.yml
2021-12-08 13:32:28 -06:00
1bdcdcca70 fix: change into column_path to into column-path (breaking change) (#4185) (#4189) 2021-12-08 11:04:55 +02:00
JT
610e3911f6 Bump to 0.41 (#4187) 2021-12-08 06:21:00 +13:00
ee9eddd851 avoid unnecessary allocation (#4178) 2021-12-06 07:38:58 +13:00
JT
c08e145501 Fix clippy warnings (#4176) 2021-12-03 07:05:38 +13:00
c00853a473 Seems like accessing $it outside each is not possible now (#4000) 2021-12-03 06:49:24 +13:00
79c7b20cfd add login shell flag (#4175) 2021-12-02 20:05:04 +13:00
JT
89cbfd758d Remove 'arboard' (#4174) 2021-12-02 08:48:03 +13:00
e6e6b730f3 Bye bye upx sorry (#4173)
* bye bye upx, let's try stripping alone

* remove all stripping - not sure it's even working
2021-11-30 13:34:16 -06:00
0fe6a7c1b5 bye bye upx, let's try stripping alone (#4172) 2021-11-30 12:11:01 -06:00
1794ad51bd Sanitize arguments to external commands a bit better (#4157)
* fix #4140

We are passing commands into a shell underneath but we were not
escaping arguments correctly. This new version of the code also takes
into consideration the ";" and "&" characters, which have special
meaning in shells.

We would probably benefit from a more robust way to join arguments to
shell programs. Python's stdlib has shlex.join, and perhaps we can
take that implementation as a reference.

* clean up escaping of posix shell args

I believe the right place to do escaping of arguments was in the
spawn_sh_command function. Note that this change prevents things like:

^echo "$(ls)"

from executing the ls command. Instead, this will just print

$(ls)

The regex has been taken from the python stdlib implementation of shlex.quote

* fix non-literal parameters and single quotes

* address clippy's comments

* fixup! address clippy's comments

* test that subshell commands are sanitized properly
2021-11-29 09:46:42 -06:00
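
The quoting rule taken from `shlex.quote`, as a standalone sketch (the real implementation appears in the `spawn_sh_command` diff further down and caches the regex with `lazy_static`):

```rust
use regex::Regex;

// Anything containing a character outside this safe set gets single-quoted;
// embedded single quotes are rewritten as '"'"' so the shell re-assembles them.
fn quote_for_sh(arg: &str) -> String {
    let unsafe_chars = Regex::new(r"[^\w@%+=:,./-]").expect("regex to be valid");
    match arg {
        "" => String::from("''"),
        s if !unsafe_chars.is_match(s) => String::from(s),
        s => format!("'{}'", s.replace('\'', r#"'"'"'"#)),
    }
}

fn main() {
    // The subshell is no longer expanded; ^echo "$(ls)" now prints $(ls) literally.
    assert_eq!(quote_for_sh("$(ls)"), "'$(ls)'");
    assert_eq!(quote_for_sh("plain,arg"), "plain,arg");
}
```
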
fb197f562a save --append: create file if it doesn't exist (#4156)
* have save --append create file if not exists

Currently, doing:

echo a | save --raw --append file.txt

will fail if file.txt does not exist. This PR changes that

* test that `save --append` will create new file
2021-11-26 12:27:41 -06:00
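
In std terms the behaviour change is just opening the target with both `append` and `create` set; a minimal sketch (not nu's exact code):

```rust
use std::fs::OpenOptions;
use std::io::Write;

// Sketch: append to a file, creating it first if it doesn't exist yet.
fn append_raw(path: &str, data: &[u8]) -> std::io::Result<()> {
    let mut file = OpenOptions::new()
        .append(true)
        .create(true) // this is the piece `save --append` was missing
        .open(path)?;
    file.write_all(data)
}
```
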
91c270c14a fix markup (#4155) 2021-11-26 07:37:50 -06:00
3e93ae8af4 Correct spelling (#4152) 2021-11-25 11:11:20 -06:00
e06df124ca upgrading dependencies (#4135)
* upgrade dependencies
num-bigint 0.3.1 -> 0.4.3
bigdecimal-rs 0.2.1 -> bigdecimal 0.3.0
s3handler 0.7 -> 0.7.5
bat 0.18 -> 0.18, default-features = false

* upgrade arboard 1.1.0 -> 2.0.1

* in polars use comfy-table instead of prettytable-rs
the last release of prettytable-rs was `0.8.0 Sep 27, 2018`
and it uses `term 0.5` as a dependency

* upgrade dependencies

* upgrade trash -> 2.0.1

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-20 07:11:11 -06:00
JT
2590fcbe5c Bump to 0.40 (#4129) 2021-11-16 21:53:03 +13:00
JT
09691ff866 Delete docker-publish.yml 2021-11-16 14:19:35 +13:00
16db368232 upgrade polars to 0.17 (#4122) 2021-11-16 12:01:02 +13:00
JT
df87d90b8c Add 'detect columns' command (#4127)
* Add 'detect columns' command

* Fix warnings
2021-11-16 11:29:54 +13:00
f2f01b8a4d missed from_mp4, added back (#4128) 2021-11-15 16:19:44 -06:00
6c0190cd38 added upx and strip to mac and windows (#4126) 2021-11-15 15:32:48 -06:00
b26246bf12 trying upx and strip (#4125) 2021-11-15 15:01:25 -06:00
36a4effbb2 tweaked strip ci (#4124) 2021-11-15 14:30:32 -06:00
9fca417f8c update release to allow running manually (#4123) 2021-11-15 14:04:00 -06:00
d09e1148b2 add the ability to strip the debug symbols for smaller binaries on mac and linux 2021-11-15 13:47:46 -06:00
493bc2b1c9 Update README (#4118)
`winget install nu` fails because there are other options for "nu" now.
Using the full `nushell` word solved it for me.

[Imgur](https://imgur.com/aqz2qNp)
2021-11-14 19:34:57 +13:00
74b812228c upgrade dependencies (#4116)
* remove unused dependencies

* upgrade dependency bytes 0.5.6 -> 1.1.0

* upgrade dependency heapless 0.6.1 -> 0.7.8

* upgrade dependency image 0.22.4 -> 0.23.14

* upgrade dependency mp4 0.8.2 -> 0.9.0

* upgrade dependency bson 0.14.1 -> 2.0.1

Bson::Undefined, Bson::MaxKey, Bson::MinKey and Bson::DbPointer
weren't present in the previous version.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-14 19:32:21 +13:00
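
For reference, these are real `Bson` variants in bson 2.x that didn't exist in 0.14; a sketch of the extra match arms a converter now needs, where the labels returned are placeholders rather than nu's actual conversion:

```rust
use bson::Bson;

// The variants new to bson 2.x relative to 0.14.
fn label_new_variant(value: &Bson) -> Option<&'static str> {
    match value {
        Bson::Undefined => Some("undefined"),
        Bson::MaxKey => Some("max-key"),
        Bson::MinKey => Some("min-key"),
        Bson::DbPointer(_) => Some("db-pointer"),
        _ => None,
    }
}
```
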
649b3804c1 fix: panic! during parsing (#4107)
Typing `selector -qa` into nu would cause a `panic!`.
This happened because the inner loop incremented the `idx` that was
only checked in the outer loop, and then used it to index into
`lite_cmd.parts[idx]`. With the fix we now break the loop.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-05 21:46:46 +13:00
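
The shape of the fix, as a hypothetical sketch (the names stand in for `lite_cmd.parts` and the surrounding parser loop):

```rust
// Hypothetical stand-in for the parser loop; `parts` plays the role of
// lite_cmd.parts. The loop body advances idx past the outer check,
// so the index has to be re-validated before use.
fn walk_parts(parts: &[String]) {
    let mut idx = 0;
    while idx < parts.len() {
        idx += 1; // advanced inside the loop body
        match parts.get(idx) {
            Some(part) => println!("looking at {}", part),
            None => break, // previously `parts[idx]` here would panic
        }
    }
}
```
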
JT
df6a53f52e Update stale.yml (#4106) 2021-11-04 21:25:44 +13:00
JT
c4af5df828 Update stale.yml (#4102) 2021-10-31 16:48:58 +13:00
f94a3e15f5 Get rid of header bold option (#4076)
* refactor(options): get rid of 'header_bold' option

* docs(config): remove 'header_bold' from docs

* fix(options): replicate logic to apply true/false in bold

* style(options): apply lint fixes
2021-10-31 06:59:19 +13:00
75782f0f50 Fix #4070: Inconsistent file matching rule for ls and rm (#4099) 2021-10-28 15:05:07 +03:00
JT
2b06ce27d3 Bump to 0.39 (#4097) 2021-10-27 08:36:41 +13:00
72c241348b Remove dependencies (#4087)
* fix regression

* Removed the nipper dependency

* fix linting

* fix clippy
2021-10-22 06:58:40 +13:00
JT
ab2d2db987 Fix clippy warnings (#4088)
* Fix clippy warnings

* Fix clippy warnings
2021-10-22 06:57:51 +13:00
07e05ef183 fix regression (#4086) 2021-10-19 13:39:23 -05:00
a986de8ad0 Update stale.yml (#4073)
add labels that can exempt from stale bot
2021-10-09 14:50:27 -05:00
22cfe4391e remove history file after clearing it (#4069) 2021-10-07 10:09:31 -05:00
JT
97d17311f4 Update LICENSE (#4067) 2021-10-07 08:42:07 +13:00
0f6fd30619 stale.yml: mention time to close in stale message (#4066) 2021-10-06 09:05:29 -05:00
JT
e1ebd461d2 Bump to 0.38 (#4064) 2021-10-06 06:35:25 +13:00
JT
f000d5d0a1 Remove the broken scrolling support (#4063)
* Remove the broken scrolling support

* Remove the broken scrolling support
2021-10-06 05:57:14 +13:00
574c5961c8 Add -c flag to select command (#4062)
See cc3653cfd9 for more on the `-c` flag.

Co-authored-by: Andrés N. Robalino <andres@androbtech.com>

Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2021-10-05 13:23:37 +13:00
JT
69708f7244 update wasm deps (#4061) 2021-10-03 07:19:54 +13:00
62c5df5fc6 expand tilde when reading plugin_dirs (#4052) 2021-10-02 21:38:21 +13:00
92c855a412 Fixed two typos in the tutor. (#4051) 2021-10-02 21:37:59 +13:00
d395816929 remove ansi colors if this is not a tty (#4058) 2021-10-01 09:00:08 -05:00
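
A common way to make that call in Rust is the `atty` crate; a sketch of the idea (assuming atty, nu's actual detection may differ):

```rust
use atty::Stream;

// Sketch: only emit ANSI color codes when stdout is an interactive terminal.
fn colorize(text: &str) -> String {
    if atty::is(Stream::Stdout) {
        format!("\x1b[32m{}\x1b[0m", text) // green for humans
    } else {
        text.to_string() // plain text when piped or redirected
    }
}
```
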
5e34ef6dff new command: into column_path (#4048) 2021-09-29 07:23:34 -05:00
d567c58cc1 Add -c flag to update cells subcommand (#4039)
* Add `-c` flag to `update cells` subcommand

* Fix lints
2021-09-27 21:18:50 -05:00
4e0d7bc77c Less deps (#4038)
* compiles on nightly now. (breaking change)

* less deps

* Switch over to new resolver

(it's been stable for a while.)

* let's leave num-format for another PR
2021-09-28 07:17:00 +13:00
32581497ef Fix 90 degrees tables problem (#4043)
* fix 90 degrees tables problem

* linting

* clippy

* linting
2021-09-25 14:05:45 -05:00
d6df367c6b Corrected typo (#4040)
It is not BSON but SQLite
2021-09-25 04:25:00 -05:00
4e6327de1d Added BigInt handling to the delimited file format for the 'to' command (#4034)
Co-authored-by: patrick <patrick@spol42069.hitronhub.home>
2021-09-25 09:47:16 +12:00
b3d8666db0 compiles on nightly now. (breaking change) (#4037) 2021-09-25 09:46:48 +12:00
1de7c3d033 Scraping multiple tables (#4036)
* Output error when ls into a file without permission

* math sqrt

* added test to check fails when ls into prohibited dir

* fix lint

* math sqrt with tests and doc

* trigger wasm build

* Update filesystem_shell.rs

* Fix Running echo .. starts printing integers forever

* Allow for multiple table scraping

* linting

* Fix clippy

* linting

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2021-09-24 08:08:13 -05:00
962b258cc6 merge span (#4031) 2021-09-23 07:48:05 +12:00
59697cab63 force rebuild of dev container (#4033) 2021-09-23 07:47:28 +12:00
349af05da8 Do not throw error for files not found in lib_dirs (#4029) 2021-09-20 13:44:47 -05:00
JT
b3b3cf0689 Remove the docker instructions
Docker has been out of date for a long time, go ahead and remove.
2021-09-20 19:33:49 +12:00
5d59234f8d Flexibility updating table's cells. (#4027)
Very often we need to work with tables (say, extracted from unstructured data or some
kind of final report, timeseries, and the like).

It's inevitable that we will have columns whose names, or even how many there are, we
can't know beforehand.

Also, we may end up with certain cells having values we may want to remove as we explore.

Here, `update cells` fundamentally goes over every cell in the table coming in and updates
the cell's contents with the output of the block passed. Basic example here:

```
> [

    [   ty1,       t2,       ty];

    [     1,        a, $nothing]
    [(wrap), (0..<10),      1Mb]
    [    1s,     ({}),  1000000]
    [ $true,   $false,   ([[]])]

] | update cells { describe }

───┬───────────────────────┬───────────────────────────┬──────────
 # │          ty1          │            t2             │    ty
───┼───────────────────────┼───────────────────────────┼──────────
 0 │ integer               │ string                    │ nothing
 1 │ row Column(table of ) │ range[[integer, integer)] │ filesize
 2 │ string                │ nothing                   │ integer
 3 │ boolean               │ boolean                   │ table of
───┴───────────────────────┴───────────────────────────┴──────────
```

And another one (in the examples) for cases where, say, we have a generated timeseries
table and we want to replace the zeros with empty strings and save it out to something like CSV.

```
> [
    [2021-04-16, 2021-06-10, 2021-09-18, 2021-10-15, 2021-11-16, 2021-11-17, 2021-11-18];
    [        37,          0,          0,          0,         37,          0,          0]
] | update cells {|value|
  if ($value | into int) == 0 {
    ""
  } {
    $value
  }
}

───┬────────────┬────────────┬────────────┬────────────┬────────────┬────────────┬────────────
 # │ 2021-04-16 │ 2021-06-10 │ 2021-09-18 │ 2021-10-15 │ 2021-11-16 │ 2021-11-17 │ 2021-11-18
───┼────────────┼────────────┼────────────┼────────────┼────────────┼────────────┼────────────
 0 │         37 │            │            │            │         37 │            │
───┴────────────┴────────────┴────────────┴────────────┴────────────┴────────────┴────────────
```
2021-09-19 15:37:54 -05:00
Tw
4f7b423f36 Support completion when cursor inside an argument (#4023)
* Support completion when cursor inside an argument

Bash supports completion even when the cursor is inside an argument; this is very useful for fixing things up after the initial completion.
Let's add this feature as well.

Signed-off-by: Tw <wei.tan@intel.com>

* Add test for when cursor inside an argument

To support testing this case, let's also take the cursor position into account.

Signed-off-by: Tw <wei.tan@intel.com>
2021-09-19 17:23:05 +12:00
f7043bf690 Fix #3090: let binding in command leaks when error occurs (#4022) 2021-09-19 14:57:20 +12:00
Tw
1297499d7a add command g to switch shell quickly (#4014)
Signed-off-by: Tw <tw19881113@gmail.com>
2021-09-17 10:39:14 +01:00
bd0baa961c add table selector for downloading web tables (#4004)
* add table selector for downloading web tables

* type-o

* updated debug mode to inspect mode
2021-09-16 09:02:30 -05:00
4ee536f044 fix: enable SIMD (#4021) 2021-09-16 20:01:42 +12:00
JT
8581bec891 bump 0.37.1 (#4019) 2021-09-16 13:32:22 +12:00
8bcbc8eeb3 Move nu-path tests to integration tests (#4015)
* Move nu-path tests to integration tests

To prevent circular dependency between nu-path and nu-test-support crates.

* Fmt
2021-09-16 07:11:28 +12:00
c164ef5489 Update to polars 0.16 (#4013)
* update to polars 0.16

* enabled features for polars
2021-09-16 07:10:12 +12:00
cc3653cfd9 Path commands: Put column path args behid flag; Allow path join appending without flag (#4008)
* Change path join signature

* Appending now works without flag
* Column path operation is behind a -c flag

* Move column path arg retrieval to a function

Also improves errors

* Fix path join tests

* Propagate column path changes to all path commands

* Update path command examples with columns paths

* Modernize path command examples by removing "echo"

* Improve structured path error message

* Fix typo
2021-09-15 21:03:51 +03:00
JT
7fc65067cf Temporarily remove the circular dep (#4009) 2021-09-15 09:17:31 +12:00
190 changed files with 4268 additions and 2759 deletions

View File

@ -16,7 +16,7 @@ strategy:
image: ubuntu-18.04
style: 'wasm'
macos-stable:
image: macos-10.14
image: macOS-10.15
style: 'unflagged'
windows-stable:
image: windows-2019

View File

@ -1,11 +1,11 @@
name: Bug Report
description: Create a report to help us improve
body:
body:
- type: textarea
id: description
attributes:
label: Describe the bug
description: A clear and concise description of what the bug is.
description: Thank you for your bug report. We are working diligently with our community to integrate our latest code base that we call [engine-q](https://github.com/nushell/engine-q). We would like your help with this by checking to see if this bug report is still needed in engine-q. Thank you for your patience while we ready the next version of nushell.
validations:
required: true
- type: textarea
@ -38,22 +38,20 @@ body:
id: config
attributes:
label: Configuration
description: "Please run `> version | pivot key value | to md` and paste the output to show OS, features, etc"
description: "Please run `version | pivot key value | to md --pretty` and paste the output to show OS, features, etc."
placeholder: |
> version | pivot key value | to md
╭───┬────────────────────┬───────────────────────────────────────────────────────────────────────╮
│ # │ key │ value │
├───┼────────────────────┼───────────────────────────────────────────────────────────────────────┤
│ 0 │ version │ 0.24.1
│ 1 │ build_os │ macos-x86_64 │
│ 2 │ rust_version │ rustc 1.48.0
│ 3 │ cargo_version │ cargo 1.48.0
│ 4 │ pkg_version │ 0.24.1
│ 5 │ build_time │ 2020-12-18 09:54:09 │
│ 6 │ build_rust_channel │ release
│ 7 │ features │ ctrlc, default, directories, dirs, git, ichwh, rich-benchmark,
│ │ │ rustyline, term, uuid, which, zip │
╰───┴────────────────────┴───────────────────────────────────────────────────────────────────────╯
> version | pivot key value | to md --pretty
| key | value |
| ------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| version | 0.40.0 |
| build_os | linux-x86_64 |
| rust_version | rustc 1.56.1 |
| cargo_version | cargo 1.56.0 |
| pkg_version | 0.40.0 |
| build_time | 1980-01-01 00:00:00 +00:00 |
| build_rust_channel | release |
| features | clipboard-cli, ctrlc, dataframe, default, rustyline, term, trash, uuid, which, zip |
| installed_plugins | binaryview, chart bar, chart line, fetch, from bson, from sqlite, inc, match, post, ps, query json, s3, selector, start, sys, textview, to bson, to sqlite, tree, xpath |
validations:
required: false
- type: textarea

View File

@ -5,7 +5,7 @@ body:
id: problem
attributes:
label: Related problem
description: Is your feature request related to a problem? Please describe.
description: Thank you for your feature request. We are working diligently with our community to integrate our latest code base that we call [engine-q](https://github.com/nushell/engine-q). We would like your help with this by checking to see if this feature request is still needed in engine-q. Thank you for your patience while we ready the next version of nushell.
placeholder: |
A clear and concise description of what the problem is.
Example: I am trying to do [...] but [...]

View File

@ -1,118 +0,0 @@
name: Publish consumable Docker images
on:
push:
tags: ['v?[0-9]+.[0-9]+.[0-9]+*']
jobs:
compile:
runs-on: ubuntu-latest
strategy:
matrix:
arch:
- x86_64-unknown-linux-musl
- x86_64-unknown-linux-gnu
steps:
- uses: actions/checkout@v2
- name: Install rust-embedded/cross
env: { VERSION: v0.1.16 }
run: >-
wget -nv https://github.com/rust-embedded/cross/releases/download/${VERSION}/cross-${VERSION}-x86_64-unknown-linux-gnu.tar.gz
-O- | sudo tar xz -C /usr/local/bin/
- name: compile for specific target
env: { arch: '${{ matrix.arch }}' }
run: |
cross build --target ${{ matrix.arch }} --release
# leave only the executable file
rm -frd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
find . -empty -delete
- uses: actions/upload-artifact@master
with:
name: ${{ matrix.arch }}
path: target/${{ matrix.arch }}/release
docker:
name: Build and publish docker images
needs: compile
runs-on: ubuntu-latest
env:
DOCKER_REGISTRY: quay.io/nushell
DOCKER_PASSWORD: ${{ secrets.DOCKER_REGISTRY }}
DOCKER_USER: ${{ secrets.DOCKER_USER }}
strategy:
matrix:
tag:
- alpine
- slim
- debian
- glibc-busybox
- musl-busybox
- musl-distroless
- glibc-distroless
- glibc
- musl
include:
- { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true, use-patch: false}
- { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: false}
- { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
steps:
- uses: actions/checkout@v2
- uses: actions/download-artifact@master
with: { name: '${{ matrix.arch }}', path: target/release }
- name: Build and publish exact version
run: |-
export DOCKER_TAG=${GITHUB_REF##*/}-${{ matrix.tag }}
export NU_BINS=target/release/$( [ ${{ matrix.plugin }} = true ] && echo nu* || echo nu )
export PATCH=$([ ${{ matrix.use-patch }} = true ] && echo .${{ matrix.tag }} || echo '')
chmod +x $NU_BINS
echo ${DOCKER_PASSWORD} | docker login ${DOCKER_REGISTRY} -u ${DOCKER_USER} --password-stdin
docker-compose --file docker/docker-compose.package.yml build
docker-compose --file docker/docker-compose.package.yml push # exact version
env:
BASE_IMAGE: ${{ matrix.base-image }}
#region semantics tagging
- name: Retag and push with suffixed version
run: |-
VERSION=${GITHUB_REF##*/}
latest_version=${VERSION%%%.*}-${{ matrix.tag }}
latest_feature=${VERSION%%.*}-${{ matrix.tag }}
latest_patch=${VERSION%.*}-${{ matrix.tag }}
exact_version=${VERSION}-${{ matrix.tag }}
tags=( ${latest_version} ${latest_feature} ${latest_patch} ${exact_version} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
docker push ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
- name: Retag and push debian as latest
if: matrix.tag == 'debian'
run: |-
VERSION=${GITHUB_REF##*/}
# ${latest features} ${latest patch} ${exact version}
tags=( ${VERSION%%.*} ${VERSION%.*} ${VERSION} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:latest
docker push ${DOCKER_REGISTRY}/nu:latest
#endregion semantics tagging

View File

@ -1,8 +1,9 @@
name: Create Release Draft
on:
workflow_dispatch:
push:
tags: ['[0-9]+.[0-9]+.[0-9]+*']
tags: ["[0-9]+.[0-9]+.[0-9]+*"]
jobs:
linux:
@ -28,6 +29,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu)
# run: strip target/release/nu
# - name: Strip binaries (nu_plugin_inc)
# run: strip target/release/nu_plugin_inc
# - name: Strip binaries (nu_plugin_match)
# run: strip target/release/nu_plugin_match
# - name: Strip binaries (nu_plugin_textview)
# run: strip target/release/nu_plugin_textview
# - name: Strip binaries (nu_plugin_binaryview)
# run: strip target/release/nu_plugin_binaryview
# - name: Strip binaries (nu_plugin_chart_bar)
# run: strip target/release/nu_plugin_chart_bar
# - name: Strip binaries (nu_plugin_chart_line)
# run: strip target/release/nu_plugin_chart_line
# - name: Strip binaries (nu_plugin_from_bson)
# run: strip target/release/nu_plugin_from_bson
# - name: Strip binaries (nu_plugin_from_sqlite)
# run: strip target/release/nu_plugin_from_sqlite
# - name: Strip binaries (nu_plugin_from_mp4)
# run: strip target/release/nu_plugin_from_mp4
# - name: Strip binaries (nu_plugin_query_json)
# run: strip target/release/nu_plugin_query_json
# - name: Strip binaries (nu_plugin_s3)
# run: strip target/release/nu_plugin_s3
# - name: Strip binaries (nu_plugin_selector)
# run: strip target/release/nu_plugin_selector
# - name: Strip binaries (nu_plugin_start)
# run: strip target/release/nu_plugin_start
# - name: Strip binaries (nu_plugin_to_bson)
# run: strip target/release/nu_plugin_to_bson
# - name: Strip binaries (nu_plugin_to_sqlite)
# run: strip target/release/nu_plugin_to_sqlite
# - name: Strip binaries (nu_plugin_tree)
# run: strip target/release/nu_plugin_tree
# - name: Strip binaries (nu_plugin_xpath)
# run: strip target/release/nu_plugin_xpath
- name: Create output directory
run: mkdir output
@ -70,6 +125,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu)
# run: strip target/release/nu
# - name: Strip binaries (nu_plugin_inc)
# run: strip target/release/nu_plugin_inc
# - name: Strip binaries (nu_plugin_match)
# run: strip target/release/nu_plugin_match
# - name: Strip binaries (nu_plugin_textview)
# run: strip target/release/nu_plugin_textview
# - name: Strip binaries (nu_plugin_binaryview)
# run: strip target/release/nu_plugin_binaryview
# - name: Strip binaries (nu_plugin_chart_bar)
# run: strip target/release/nu_plugin_chart_bar
# - name: Strip binaries (nu_plugin_chart_line)
# run: strip target/release/nu_plugin_chart_line
# - name: Strip binaries (nu_plugin_from_bson)
# run: strip target/release/nu_plugin_from_bson
# - name: Strip binaries (nu_plugin_from_sqlite)
# run: strip target/release/nu_plugin_from_sqlite
# - name: Strip binaries (nu_plugin_from_mp4)
# run: strip target/release/nu_plugin_from_mp4
# - name: Strip binaries (nu_plugin_query_json)
# run: strip target/release/nu_plugin_query_json
# - name: Strip binaries (nu_plugin_s3)
# run: strip target/release/nu_plugin_s3
# - name: Strip binaries (nu_plugin_selector)
# run: strip target/release/nu_plugin_selector
# - name: Strip binaries (nu_plugin_start)
# run: strip target/release/nu_plugin_start
# - name: Strip binaries (nu_plugin_to_bson)
# run: strip target/release/nu_plugin_to_bson
# - name: Strip binaries (nu_plugin_to_sqlite)
# run: strip target/release/nu_plugin_to_sqlite
# - name: Strip binaries (nu_plugin_tree)
# run: strip target/release/nu_plugin_tree
# - name: Strip binaries (nu_plugin_xpath)
# run: strip target/release/nu_plugin_xpath
- name: Create output directory
run: mkdir output
@ -106,7 +215,7 @@ jobs:
uses: actions-rs/cargo@v1
with:
command: install
args: cargo-wix
args: cargo-wix --version 0.3.1
- name: Build
uses: actions-rs/cargo@v1
@ -114,6 +223,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu.exe)
# run: strip target/release/nu.exe
# - name: Strip binaries (nu_plugin_inc.exe)
# run: strip target/release/nu_plugin_inc.exe
# - name: Strip binaries (nu_plugin_match.exe)
# run: strip target/release/nu_plugin_match.exe
# - name: Strip binaries (nu_plugin_textview.exe)
# run: strip target/release/nu_plugin_textview.exe
# - name: Strip binaries (nu_plugin_binaryview.exe)
# run: strip target/release/nu_plugin_binaryview.exe
# - name: Strip binaries (nu_plugin_chart_bar.exe)
# run: strip target/release/nu_plugin_chart_bar.exe
# - name: Strip binaries (nu_plugin_chart_line.exe)
# run: strip target/release/nu_plugin_chart_line.exe
# - name: Strip binaries (nu_plugin_from_bson.exe)
# run: strip target/release/nu_plugin_from_bson.exe
# - name: Strip binaries (nu_plugin_from_sqlite.exe)
# run: strip target/release/nu_plugin_from_sqlite.exe
# - name: Strip binaries (nu_plugin_from_mp4.exe)
# run: strip target/release/nu_plugin_from_mp4.exe
# - name: Strip binaries (nu_plugin_query_json.exe)
# run: strip target/release/nu_plugin_query_json.exe
# - name: Strip binaries (nu_plugin_s3.exe)
# run: strip target/release/nu_plugin_s3.exe
# - name: Strip binaries (nu_plugin_selector.exe)
# run: strip target/release/nu_plugin_selector.exe
# - name: Strip binaries (nu_plugin_start.exe)
# run: strip target/release/nu_plugin_start.exe
# - name: Strip binaries (nu_plugin_to_bson.exe)
# run: strip target/release/nu_plugin_to_bson.exe
# - name: Strip binaries (nu_plugin_to_sqlite.exe)
# run: strip target/release/nu_plugin_to_sqlite.exe
# - name: Strip binaries (nu_plugin_tree.exe)
# run: strip target/release/nu_plugin_tree.exe
# - name: Strip binaries (nu_plugin_xpath.exe)
# run: strip target/release/nu_plugin_xpath.exe
- name: Create output directory
run: mkdir output
@ -274,7 +437,7 @@ jobs:
with:
name: windows-installer
path: ./
- name: Upload Windows installer
uses: actions/upload-release-asset@v1
env:

View File

@ -19,11 +19,10 @@ jobs:
operations-per-run: 520
enable-statistics: true
repo-token: ${{ secrets.GITHUB_TOKEN }}
stale-issue-message: 'This issue is being marked stale because it has been open for 90 days without activity. If you feel that this is in error, please comment below and we will keep it marked as active.'
stale-pr-message: 'This PR is being marked stale because it has been open for 45 days without activity. If this PR is still active, please comment below and we will keep it marked as active.'
close-issue-message: 'This issue has been marked stale for more than 10 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
close-pr-message: 'This PR has been marked stale for more than 10 days without activity. Closing this PR, but if you are still working on it, please reopen.'
close-issue-message: 'This issue has been marked stale for more than 100000 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
close-pr-message: 'This PR has been marked stale for more than 100 days without activity. Closing this PR, but if you are still working on it, please reopen.'
days-before-issue-stale: 90
days-before-pr-stale: 45
days-before-issue-close: 10
days-before-pr-close: 10
days-before-issue-close: 100000
days-before-pr-close: 100
exempt-issue-labels: 'exempt,keep'

.gitpod.Dockerfile (vendored): 2 lines changed
View File

@ -2,7 +2,7 @@ FROM gitpod/workspace-full
# Gitpod will not rebuild Nushell's dev image unless *some* change is made to this Dockerfile.
# To force a rebuild, simply increase this counter:
ENV TRIGGER_REBUILD 1
ENV TRIGGER_REBUILD 2
USER gitpod

Cargo.lock (generated): 1705 lines changed

File diff suppressed because it is too large.

View File

@ -10,7 +10,7 @@ license = "MIT"
name = "nu"
readme = "README.md"
repository = "https://github.com/nushell/nushell"
version = "0.37.0"
version = "0.43.0"
[workspace]
members = ["crates/*/"]
@ -18,34 +18,34 @@ members = ["crates/*/"]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
nu-cli = { version = "0.37.0", path="./crates/nu-cli", default-features=false }
nu-command = { version = "0.37.0", path="./crates/nu-command" }
nu-completion = { version = "0.37.0", path="./crates/nu-completion" }
nu-data = { version = "0.37.0", path="./crates/nu-data" }
nu-engine = { version = "0.37.0", path="./crates/nu-engine" }
nu-errors = { version = "0.37.0", path="./crates/nu-errors" }
nu-parser = { version = "0.37.0", path="./crates/nu-parser" }
nu-path = { version = "0.37.0", path="./crates/nu-path" }
nu-plugin = { version = "0.37.0", path="./crates/nu-plugin" }
nu-protocol = { version = "0.37.0", path="./crates/nu-protocol" }
nu-source = { version = "0.37.0", path="./crates/nu-source" }
nu-value-ext = { version = "0.37.0", path="./crates/nu-value-ext" }
nu-cli = { version = "0.43.0", path="./crates/nu-cli", default-features=false }
nu-command = { version = "0.43.0", path="./crates/nu-command" }
nu-completion = { version = "0.43.0", path="./crates/nu-completion" }
nu-data = { version = "0.43.0", path="./crates/nu-data" }
nu-engine = { version = "0.43.0", path="./crates/nu-engine" }
nu-errors = { version = "0.43.0", path="./crates/nu-errors" }
nu-parser = { version = "0.43.0", path="./crates/nu-parser" }
nu-path = { version = "0.43.0", path="./crates/nu-path" }
nu-plugin = { version = "0.43.0", path="./crates/nu-plugin" }
nu-protocol = { version = "0.43.0", path="./crates/nu-protocol" }
nu-source = { version = "0.43.0", path="./crates/nu-source" }
nu-value-ext = { version = "0.43.0", path="./crates/nu-value-ext" }
nu_plugin_binaryview = { version = "0.37.0", path="./crates/nu_plugin_binaryview", optional=true }
nu_plugin_chart = { version = "0.37.0", path="./crates/nu_plugin_chart", optional=true }
nu_plugin_from_bson = { version = "0.37.0", path="./crates/nu_plugin_from_bson", optional=true }
nu_plugin_from_sqlite = { version = "0.37.0", path="./crates/nu_plugin_from_sqlite", optional=true }
nu_plugin_inc = { version = "0.37.0", path="./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.37.0", path="./crates/nu_plugin_match", optional=true }
nu_plugin_query_json = { version = "0.37.0", path="./crates/nu_plugin_query_json", optional=true }
nu_plugin_s3 = { version = "0.37.0", path="./crates/nu_plugin_s3", optional=true }
nu_plugin_selector = { version = "0.37.0", path="./crates/nu_plugin_selector", optional=true }
nu_plugin_start = { version = "0.37.0", path="./crates/nu_plugin_start", optional=true }
nu_plugin_textview = { version = "0.37.0", path="./crates/nu_plugin_textview", optional=true }
nu_plugin_to_bson = { version = "0.37.0", path="./crates/nu_plugin_to_bson", optional=true }
nu_plugin_to_sqlite = { version = "0.37.0", path="./crates/nu_plugin_to_sqlite", optional=true }
nu_plugin_tree = { version = "0.37.0", path="./crates/nu_plugin_tree", optional=true }
nu_plugin_xpath = { version = "0.37.0", path="./crates/nu_plugin_xpath", optional=true }
nu_plugin_binaryview = { version = "0.43.0", path="./crates/nu_plugin_binaryview", optional=true }
nu_plugin_chart = { version = "0.43.0", path="./crates/nu_plugin_chart", optional=true }
nu_plugin_from_bson = { version = "0.43.0", path="./crates/nu_plugin_from_bson", optional=true }
nu_plugin_from_sqlite = { version = "0.43.0", path="./crates/nu_plugin_from_sqlite", optional=true }
nu_plugin_inc = { version = "0.43.0", path="./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.43.0", path="./crates/nu_plugin_match", optional=true }
nu_plugin_query_json = { version = "0.43.0", path="./crates/nu_plugin_query_json", optional=true }
nu_plugin_s3 = { version = "0.43.0", path="./crates/nu_plugin_s3", optional=true }
nu_plugin_selector = { version = "0.43.0", path="./crates/nu_plugin_selector", optional=true }
nu_plugin_start = { version = "0.43.0", path="./crates/nu_plugin_start", optional=true }
nu_plugin_textview = { version = "0.43.0", path="./crates/nu_plugin_textview", optional=true }
nu_plugin_to_bson = { version = "0.43.0", path="./crates/nu_plugin_to_bson", optional=true }
nu_plugin_to_sqlite = { version = "0.43.0", path="./crates/nu_plugin_to_sqlite", optional=true }
nu_plugin_tree = { version = "0.43.0", path="./crates/nu_plugin_tree", optional=true }
nu_plugin_xpath = { version = "0.43.0", path="./crates/nu_plugin_xpath", optional=true }
# Required to bootstrap the main binary
ctrlc = { version="3.1.7", optional=true }
@ -53,7 +53,7 @@ futures = { version="0.3.12", features=["compat", "io-compat"] }
itertools = "0.10.0"
[dev-dependencies]
nu-test-support = { version = "0.37.0", path="./crates/nu-test-support" }
nu-test-support = { version = "0.43.0", path="./crates/nu-test-support" }
serial_test = "0.5.1"
hamcrest2 = "0.3.0"
rstest = "0.10.0"
@ -89,7 +89,6 @@ extra = [
"inc",
"tree",
"textview",
"clipboard-cli",
"trash-support",
"uuid-support",
"start",
@ -113,7 +112,6 @@ textview = ["nu_plugin_textview"]
binaryview = ["nu_plugin_binaryview"]
bson = ["nu_plugin_from_bson", "nu_plugin_to_bson"]
chart = ["nu_plugin_chart"]
clipboard-cli = ["nu-command/clipboard-cli"]
query-json = ["nu_plugin_query_json"]
s3 = ["nu_plugin_s3"]
selector = ["nu_plugin_selector"]
@ -127,9 +125,6 @@ tree = ["nu_plugin_tree"]
xpath = ["nu_plugin_xpath"]
zip-support = ["nu-command/zip"]
#This is disabled in extra for now
table-pager = ["nu-command/table-pager"]
#dataframe feature for nushell
dataframe = [
"nu-engine/dataframe",
@ -141,7 +136,7 @@ dataframe = [
]
[profile.release]
opt-level = "z" # Optimize for size.
opt-level = "s" # Optimize for size.
# Core plugins that ship with `cargo install nu` by default
# Currently, Cargo limits us to installing only one binary

View File

@ -1,6 +1,6 @@
MIT License
Copyright (c) 2019 - 2021 Yehuda Katz, Jonathan Turner
Copyright (c) 2019 - 2021 Nushell Project
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@ -68,7 +68,7 @@ cargo install nu
To install Nu via the [Windows Package Manager](https://aka.ms/winget-cli):
```shell
winget install nu
winget install nushell
```
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/installation.html#dependencies) for your platform), once you have checked out this repo with git:
@ -76,53 +76,6 @@ You can also build Nu yourself with all the bells and whistles (be sure to have
```shell
cargo build --workspace --features=extra
```
### Docker
#### Quickstart
Want to try Nu right away? Execute the following to get started.
```shell
docker run -it quay.io/nushell/nu:latest
```
#### Guide
If you want to pull a pre-built container, you can browse tags for the [nushell organization](https://quay.io/organization/nushell)
on Quay.io. Pulling a container would come down to:
```shell
docker pull quay.io/nushell/nu
docker pull quay.io/nushell/nu-base
```
Both "nu-base" and "nu" provide the nu binary, however, nu-base also includes the source code at `/code`
in the container and all dependencies.
Optionally, you can also build the containers locally using the [dockerfiles provided](docker):
To build the base image:
```shell
docker build -f docker/Dockerfile.nu-base -t nushell/nu-base .
```
And then to build the smaller container (using a Multistage build):
```shell
docker build -f docker/Dockerfile -t nushell/nu .
```
Either way, you can run either container as follows:
```shell
docker run -it nushell/nu-base
docker run -it nushell/nu
/> exit
```
The second container is a bit smaller if the size is important to you.
### Packaging status
[![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions)

View File

@ -10,4 +10,4 @@ Foundational libraries are split into two kinds of crates:
Plugins are likewise also split into two types:
* Core plugins - plugins that provide part of the default experience of Nu, including access to the system properties, processes, and web-connectivity features.
* Extra plugins - these plugins run a wide range of differnt capabilities like working with different file types, charting, viewing binary data, and more.
* Extra plugins - these plugins run a wide range of different capabilities like working with different file types, charting, viewing binary data, and more.

View File

@ -9,7 +9,7 @@ description = "Library for ANSI terminal colors and styles (bold, underline)"
edition = "2018"
license = "MIT"
name = "nu-ansi-term"
version = "0.37.0"
version = "0.43.0"
[lib]
doctest = false
@ -21,7 +21,6 @@ derive_serde_style = ["serde"]
[dependencies]
overload = "0.1.1"
serde = { version="1.0.90", features=["derive"], optional=true }
itertools = "0.10.0"
# [dependencies.serde]
# version = "1.0.90"

View File

@ -231,7 +231,6 @@
#![crate_name = "nu_ansi_term"]
#![crate_type = "rlib"]
#![crate_type = "dylib"]
#![warn(missing_copy_implementations)]
// #![warn(missing_docs)]
#![warn(trivial_casts, trivial_numeric_casts)]

View File

@ -613,7 +613,7 @@ mod serde_json_tests {
let serialized = serde_json::to_string(&color).unwrap();
let deserialized: Color = serde_json::from_str(&serialized).unwrap();
assert_eq!(color, &deserialized);
assert_eq!(color, deserialized);
}
}

View File

@ -4,23 +4,24 @@ description = "CLI for nushell"
edition = "2018"
license = "MIT"
name = "nu-cli"
version = "0.37.0"
version = "0.43.0"
build = "build.rs"
[lib]
doctest = false
[dependencies]
nu-completion = { version = "0.37.0", path="../nu-completion" }
nu-command = { version = "0.37.0", path="../nu-command" }
nu-data = { version = "0.37.0", path="../nu-data" }
nu-engine = { version = "0.37.0", path="../nu-engine" }
nu-errors = { version = "0.37.0", path="../nu-errors" }
nu-parser = { version = "0.37.0", path="../nu-parser" }
nu-protocol = { version = "0.37.0", path="../nu-protocol" }
nu-source = { version = "0.37.0", path="../nu-source" }
nu-stream = { version = "0.37.0", path="../nu-stream" }
nu-ansi-term = { version = "0.37.0", path="../nu-ansi-term" }
nu-completion = { version = "0.43.0", path="../nu-completion" }
nu-command = { version = "0.43.0", path="../nu-command" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-engine = { version = "0.43.0", path="../nu-engine" }
nu-errors = { version = "0.43.0", path="../nu-errors" }
nu-parser = { version = "0.43.0", path="../nu-parser" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-stream = { version = "0.43.0", path="../nu-stream" }
nu-ansi-term = { version = "0.43.0", path="../nu-ansi-term" }
nu-path = { version = "0.43.0", path="../nu-path" }
indexmap ="1.6.1"
log = "0.4.14"
@ -28,13 +29,13 @@ pretty_env_logger = "0.4.0"
strip-ansi-escapes = "0.1.0"
rustyline = { version="9.0.0", optional=true }
ctrlc = { version="3.1.7", optional=true }
shadow-rs = { version="0.6", default-features=false, optional=true }
shadow-rs = { version = "0.8.1", default-features = false, optional = true }
serde = { version="1.0.123", features=["derive"] }
serde_yaml = "0.8.16"
lazy_static = "1.4.0"
[build-dependencies]
shadow-rs = "0.6"
shadow-rs = "0.8.1"
[features]
default = ["shadow-rs"]

View File

@ -513,11 +513,6 @@ mod tests {
let args = format!("nu --loglevel={}", level);
ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
let ui = cli_app();
let args = format!("nu -l {}", level);
ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
}
let ui = cli_app();
@ -530,6 +525,17 @@ mod tests {
Ok(())
}
#[test]
fn can_be_login() -> Result<(), ShellError> {
let ui = cli_app();
ui.parse("nu -l")?;
let ui = cli_app();
ui.parse("nu --login")?;
Ok(())
}
#[test]
fn can_be_passed_nu_scripts() -> Result<(), ShellError> {
let ui = cli_app();

View File

@ -24,6 +24,7 @@ use rustyline::{self, error::ReadlineError};
use nu_errors::ShellError;
use nu_parser::ParserScope;
use nu_path::expand_tilde;
use nu_protocol::{hir::ExternalRedirection, ConfigPath, UntaggedValue, Value};
use log::trace;
@ -54,7 +55,7 @@ pub fn search_paths() -> Vec<std::path::PathBuf> {
{
for pipeline in pipelines {
if let Ok(plugin_dir) = pipeline.as_string() {
search_paths.push(PathBuf::from(plugin_dir));
search_paths.push(expand_tilde(plugin_dir));
}
}
}
@ -90,10 +91,10 @@ pub fn run_script_file(
fn default_prompt_string(cwd: &str) -> String {
format!(
"{}{}{}{}{}{}> ",
Color::Green.bold().prefix().to_string(),
Color::Green.bold().prefix(),
cwd,
nu_ansi_term::ansi::RESET,
Color::Cyan.bold().prefix().to_string(),
Color::Cyan.bold().prefix(),
current_branch(),
nu_ansi_term::ansi::RESET
)
@ -371,7 +372,7 @@ pub fn cli(
LineResult::ClearHistory => {
if options.save_history {
rl.clear_history();
let _ = rl.append_history(&history_path);
std::fs::remove_file(&history_path)?;
}
}

View File

@ -1,58 +1,51 @@
[package]
authors = ["The Nu Project Contributors"]
build = "build.rs"
description = "CLI for nushell"
description = "Commands for Nushell"
edition = "2018"
license = "MIT"
name = "nu-command"
version = "0.37.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
nu-data = { version = "0.37.0", path="../nu-data" }
nu-engine = { version = "0.37.0", path="../nu-engine" }
nu-errors = { version = "0.37.0", path="../nu-errors" }
nu-json = { version = "0.37.0", path="../nu-json" }
nu-path = { version = "0.37.0", path="../nu-path" }
nu-parser = { version = "0.37.0", path="../nu-parser" }
nu-plugin = { version = "0.37.0", path="../nu-plugin" }
nu-protocol = { version = "0.37.0", path="../nu-protocol" }
nu-serde = { version = "0.37.0", path="../nu-serde" }
nu-source = { version = "0.37.0", path="../nu-source" }
nu-stream = { version = "0.37.0", path="../nu-stream" }
nu-table = { version = "0.37.0", path="../nu-table" }
nu-test-support = { version = "0.37.0", path="../nu-test-support" }
nu-value-ext = { version = "0.37.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.37.0", path="../nu-ansi-term" }
nu-pretty-hex = { version = "0.37.0", path="../nu-pretty-hex" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-engine = { version = "0.43.0", path="../nu-engine" }
nu-errors = { version = "0.43.0", path="../nu-errors" }
nu-json = { version = "0.43.0", path="../nu-json" }
nu-path = { version = "0.43.0", path="../nu-path" }
nu-parser = { version = "0.43.0", path="../nu-parser" }
nu-plugin = { version = "0.43.0", path="../nu-plugin" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-serde = { version = "0.43.0", path="../nu-serde" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-stream = { version = "0.43.0", path="../nu-stream" }
nu-table = { version = "0.43.0", path="../nu-table" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
nu-value-ext = { version = "0.43.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.43.0", path="../nu-ansi-term" }
nu-pretty-hex = { version = "0.43.0", path="../nu-pretty-hex" }
url = "2.2.1"
mime = "0.3.16"
Inflector = "0.11"
arboard = { version="1.1.0", optional=true }
heck = "0.4.0"
base64 = "0.13.0"
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
byte-unit = "4.0.9"
bytes = "1.0.1"
bigdecimal = { version = "0.3.0", features = ["serde"] }
calamine = "0.18.0"
chrono = { version="0.4.19", features=["serde"] }
chrono-tz = "0.5.3"
codespan-reporting = "0.11.0"
crossterm = { version="0.19.0", optional=true }
csv = "1.1.3"
ctrlc = { version="3.1.7", optional=true }
derive-new = "0.5.8"
directories-next = "2.0.0"
dirs-next = "2.0.0"
dtparse = "1.2.0"
eml-parser = "0.1.0"
encoding_rs = "0.8.28"
filesize = "0.2.0"
fs_extra = "1.2.0"
futures = { version="0.3.12", features=["compat", "io-compat"] }
getset = "0.1.1"
glob = "0.3.0"
htmlescape = "0.3.1"
ical = "0.7.0"
@ -62,41 +55,32 @@ lazy_static = "1.*"
log = "0.4.14"
md-5 = "0.9.1"
meval = "0.2.0"
minus = { version="3.4.0", optional=true, features=["async_std_lib", "search"] }
num-bigint = { version="0.3.1", features=["serde"] }
num-bigint = { version="0.4.3", features=["serde"] }
num-format = { version="0.4.0", features=["with-num-bigint"] }
num-traits = "0.2.14"
parking_lot = "0.11.1"
pin-utils = "0.1.0"
query_interface = "0.3.5"
quick-xml = "0.22"
rand = "0.8"
rayon = "1.5.0"
regex = "1.4.3"
reqwest = {version = "0.11", optional = true }
roxmltree = "0.14.0"
rust-embed = "5.9.0"
rustyline = { version="9.0.0", optional=true }
serde = { version="1.0.123", features=["derive"] }
serde_bytes = "0.11.5"
serde_ini = "0.2.0"
serde_json = "1.0.61"
serde_urlencoded = "0.7.0"
serde_yaml = "0.8.16"
sha2 = "0.9.3"
strip-ansi-escapes = "0.1.0"
sxd-document = "0.3.2"
sxd-xpath = "0.4.2"
sysinfo = { version = "0.20.2", optional = true }
sysinfo = { version = "0.23.0", optional = true }
thiserror = "1.0.26"
tempfile = "3.2.0"
term = { version="0.7.0", optional=true }
term_size = "0.3.2"
termcolor = "1.1.2"
titlecase = "1.1.0"
tokio = { version = "1", features = ["rt-multi-thread"], optional = true }
toml = "0.5.8"
trash = { version="1.3.0", optional=true }
trash = { version = "2.0.2", optional = true }
unicode-segmentation = "1.8"
uuid_crate = { package="uuid", version="0.8.2", features=["v4"], optional=true }
which = { version="4.1.0", optional=true }
@ -104,9 +88,10 @@ zip = { version="0.5.9", optional=true }
digest = "0.9.0"
[dependencies.polars]
version = "0.15.1"
version = "0.17.0"
optional = true
features = ["parquet", "json", "random", "pivot", "strings", "is_in", "temporal"]
default-features = false
features = ["docs", "zip_with", "csv-file", "temporal", "performant", "pretty_fmt", "dtype-slim", "parquet", "json", "random", "pivot", "strings", "is_in", "cum_agg", "rolling_window"]
[target.'cfg(unix)'.dependencies]
umask = "1.0.0"
@ -115,16 +100,11 @@ users = "0.11.0"
# TODO this will be possible with new dependency resolver
# (currently on nightly behind -Zfeatures=itarget):
# https://github.com/rust-lang/cargo/issues/7914
#[target.'cfg(not(windows))'.dependencies]
#num-format = {version = "0.4", features = ["with-system-locale"]}
[dependencies.rusqlite]
features = ["bundled", "blob"]
optional = true
version = "0.25.3"
# [target.'cfg(not(windows))'.dependencies]
# num-format = { version = "0.4", features = ["with-system-locale"] }
[build-dependencies]
shadow-rs = "0.6"
shadow-rs = "0.8.1"
[dev-dependencies]
quickcheck = "1.0.3"
@ -132,13 +112,11 @@ quickcheck_macros = "1.0.0"
hamcrest2 = "0.3.0"
[features]
clipboard-cli = ["arboard"]
rustyline-support = ["rustyline"]
stable = []
trash-support = ["trash"]
table-pager = ["minus", "crossterm"]
dataframe = ["nu-protocol/dataframe", "polars"]
fetch = ["reqwest", "tokio"]
post = ["reqwest", "tokio"]
sys = ["sysinfo"]
ps = ["sysinfo"]
ps = ["sysinfo"]

View File

@ -1,7 +0,0 @@
use derive_new::new;
use nu_protocol::hir;
#[derive(new, Debug)]
pub(crate) struct Command {
pub(crate) args: hir::Call,
}

View File

@ -1,8 +1,10 @@
use crate::prelude::*;
use lazy_static::lazy_static;
use nu_engine::{evaluate_baseline_expr, BufCodecReader};
use nu_engine::{MaybeTextCodec, StringOrBinary};
use nu_test_support::NATIVE_PATH_ENV_VAR;
use parking_lot::Mutex;
use regex::Regex;
#[allow(unused)]
use std::env;
@ -44,20 +46,16 @@ pub(crate) fn run_external_command(
}
#[allow(unused)]
fn trim_double_quotes(input: &str) -> String {
fn trim_enclosing_quotes(input: &str) -> String {
let mut chars = input.chars();
match (chars.next(), chars.next_back()) {
(Some('"'), Some('"')) => chars.collect(),
(Some('\''), Some('\'')) => chars.collect(),
_ => input.to_string(),
}
}
#[allow(unused)]
fn escape_where_needed(input: &str) -> String {
input.split(' ').join("\\ ").split('\'').join("\\'")
}
fn run_with_stdin(
command: ExternalCommand,
context: &mut EvaluationContext,
@ -115,15 +113,9 @@ fn run_with_stdin(
#[cfg(not(windows))]
{
if !_is_literal {
let escaped = escape_double_quotes(&arg);
add_double_quotes(&escaped)
arg
} else {
let trimmed = trim_double_quotes(&arg);
if trimmed != arg {
escape_where_needed(&trimmed)
} else {
trimmed
}
trim_enclosing_quotes(&arg)
}
}
#[cfg(windows)]
@ -131,7 +123,7 @@ fn run_with_stdin(
if let Some(unquoted) = remove_quotes(&arg) {
unquoted.to_string()
} else {
arg.to_string()
arg
}
}
})
@ -172,9 +164,29 @@ fn spawn_cmd_command(command: &ExternalCommand, args: &[String]) -> Command {
process
}
fn has_unsafe_shell_characters(arg: &str) -> bool {
lazy_static! {
static ref RE: Regex = Regex::new(r"[^\w@%+=:,./-]").expect("regex to be valid");
}
RE.is_match(arg)
}
fn shell_arg_escape(arg: &str) -> String {
match arg {
"" => String::from("''"),
s if !has_unsafe_shell_characters(s) => String::from(s),
_ => {
let single_quotes_escaped = arg.split('\'').join("'\"'\"'");
format!("'{}'", single_quotes_escaped)
}
}
}
/// Spawn a sh command with `sh -c args...`
fn spawn_sh_command(command: &ExternalCommand, args: &[String]) -> Command {
let cmd_with_args = vec![command.name.clone(), args.join(" ")].join(" ");
let joined_and_escaped_arguments = args.iter().map(|arg| shell_arg_escape(arg)).join(" ");
let cmd_with_args = vec![command.name.clone(), joined_and_escaped_arguments].join(" ");
let mut process = Command::new("sh");
process.arg("-c").arg(cmd_with_args);
process

View File

@ -1,5 +1 @@
mod dynamic;
pub(crate) mod external;
#[allow(unused_imports)]
pub(crate) use dynamic::Command as DynamicCommand;

View File

@ -38,7 +38,7 @@ impl WholeStreamCommand for SubCommand {
},
Example {
description: "Set coloring options",
example: "config set color_config [[header_align header_bold]; [left $true]]",
example: "config set color_config [[header_align header_color]; [left white_bold]]",
result: None,
},
Example {

View File

@ -0,0 +1,118 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, Primitive, Signature, SyntaxShape, UntaggedValue, Value};
pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"into column-path"
}
fn signature(&self) -> Signature {
Signature::build("into column-path").rest(
"rest",
SyntaxShape::ColumnPath,
"values to convert to column path",
)
}
fn usage(&self) -> &str {
"Convert value to column path"
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
into_filepath(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Convert string to column path in table",
example: "echo [[name]; ['/dev/null'] ['C:\\Program Files'] ['../../Cargo.toml']] | into column-path name",
result: Some(vec![
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("/dev/null", Span::unknown()).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("C:\\Program Files", Span::unknown()).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("../../Cargo.toml", Span::unknown()).into(),
})
.into(),
]),
},
Example {
description: "Convert string to column path",
example: "echo 'Cargo.toml' | into column-path",
result: Some(vec![UntaggedValue::column_path("Cargo.toml", Span::unknown()).into()]),
},
]
}
}
fn into_filepath(args: CommandArgs) -> Result<OutputStream, ShellError> {
let column_paths: Vec<ColumnPath> = args.rest(0)?;
Ok(args
.input
.map(move |v| {
if column_paths.is_empty() {
action(&v, v.tag())
} else {
let mut ret = v;
for path in &column_paths {
ret = ret.swap_data_by_column_path(
path,
Box::new(move |old| action(old, old.tag())),
)?;
}
Ok(ret)
}
})
.into_input_stream())
}
pub fn action(input: &Value, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let tag = tag.into();
match &input.value {
UntaggedValue::Primitive(prim) => Ok(UntaggedValue::column_path(
match prim {
Primitive::String(a_string) => a_string,
_ => {
return Err(ShellError::unimplemented(
"'into column-path' for non-string primitives",
))
}
},
Span::unknown(),
)
.into_value(&tag)),
UntaggedValue::Row(_) => Err(ShellError::labeled_error(
"specify column name to use, with 'into column-path COLUMN'",
"found table",
tag,
)),
_ => Err(ShellError::unimplemented(
"'into column-path' for unsupported type",
)),
}
}
#[cfg(test)]
mod tests {
use super::ShellError;
use super::SubCommand;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}

View File

@ -1,4 +1,5 @@
mod binary;
mod column_path;
mod command;
mod filepath;
mod filesize;
@ -7,6 +8,7 @@ pub mod string;
pub use self::filesize::SubCommand as IntoFilesize;
pub use binary::SubCommand as IntoBinary;
pub use column_path::SubCommand as IntoColumnPath;
pub use command::Command as Into;
pub use filepath::SubCommand as IntoFilepath;
pub use int::SubCommand as IntoInt;

View File

@ -101,17 +101,14 @@ fn if_command(args: CommandArgs) -> Result<OutputStream, ShellError> {
//FIXME: should we use the scope that's brought in as well?
let condition = evaluate_baseline_expr(cond, &context);
match condition {
let result = match condition {
Ok(condition) => match condition.as_bool() {
Ok(b) => {
let result = if b {
if b {
run_block(&then_case.block, &context, input, external_redirection)
} else {
run_block(&else_case.block, &context, input, external_redirection)
};
context.scope.exit_scope();
result
}
}
Err(e) => Ok(OutputStream::from_stream(
vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(),
@ -120,13 +117,16 @@ fn if_command(args: CommandArgs) -> Result<OutputStream, ShellError> {
Err(e) => Ok(OutputStream::from_stream(
vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(),
)),
}
};
context.scope.exit_scope();
result
}
#[cfg(test)]
mod tests {
use super::If;
use super::ShellError;
use nu_test_support::nu;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
@ -134,4 +134,21 @@ mod tests {
test_examples(If {})
}
#[test]
fn if_doesnt_leak_on_error() {
let actual = nu!(
".",
r#"
def test-leak [] {
let var = "hello"
if 0 == "" {echo ok} {echo not}
}
test-leak
echo $var
"#
);
assert!(actual.err.contains("unknown variable"));
}
}
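Read as a script, the regression test above boils down to the following; a sketch of the behavior after the fix (error text abbreviated):
```
def test-leak [] {
    let var = "hello"
    if 0 == "" {echo ok} {echo not}
}
test-leak
echo $var    # now fails with "unknown variable" instead of leaking "hello"
```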

View File

@ -15,6 +15,7 @@ impl WholeStreamCommand for Command {
.switch("skip-plugins", "do not load plugins", None)
.switch("no-history", "don't save history", None)
.switch("perf", "show startup performance metrics", None)
.switch("login", "start Nu as if it was a login shell", Some('l'))
.named(
"commands",
SyntaxShape::String,
@ -33,7 +34,7 @@ impl WholeStreamCommand for Command {
"loglevel",
SyntaxShape::String,
"LEVEL: error, warn, info, debug, trace",
Some('l'),
None,
)
.named(
"config-file",

View File

@ -69,13 +69,11 @@ pub fn source(args: CommandArgs) -> Result<OutputStream, ShellError> {
for lib_path in dir {
match lib_path {
Ok(name) => {
let path = canonicalize_with(&source_file, name).map_err(|e| {
ShellError::labeled_error(
format!("Can't load source file. Reason: {}", e.to_string()),
"Can't load this file",
filename.span(),
)
})?;
let path = if let Ok(p) = canonicalize_with(&source_file, name) {
p
} else {
continue;
};
if let Ok(contents) = std::fs::read_to_string(path) {
let result = script::run_script_standalone(contents, true, ctx, false);
@ -95,7 +93,7 @@ pub fn source(args: CommandArgs) -> Result<OutputStream, ShellError> {
let path = canonicalize(source_file).map_err(|e| {
ShellError::labeled_error(
format!("Can't load source file. Reason: {}", e.to_string()),
format!("Can't load source file. Reason: {}", e),
"Can't load this file",
filename.span(),
)
@ -114,7 +112,7 @@ pub fn source(args: CommandArgs) -> Result<OutputStream, ShellError> {
}
Err(e) => {
ctx.error(ShellError::labeled_error(
format!("Can't load source file. Reason: {}", e.to_string()),
format!("Can't load source file. Reason: {}", e),
"Can't load this file",
filename.span(),
));

View File

@ -81,6 +81,7 @@ fn tutor(args: CommandArgs) -> Result<OutputStream, ShellError> {
vec!["var", "vars", "variable", "variables"],
variable_tutor(),
),
(vec!["engine-q", "e-q"], engineq_tutor()),
(vec!["block", "blocks"], block_tutor()),
(vec!["shorthand", "shorthands"], shorthand_tutor()),
];
@ -166,7 +167,7 @@ This will get the 3rd (note that `nth` is zero-based) row in the table created
by the `ls` command. You can use `nth` on any table created by other commands
as well.
You can also access the column of data in one of two ways. If you want to want
You can also access the column of data in one of two ways. If you want
to keep the column as part of a new table, you can use `select`.
```
ls | select name
@ -274,7 +275,7 @@ This can be helpful if you want to later processes these values.
The `echo` command can pair well with the `each` command which can run
code on each row, or item, of input.
You can continue to learn more about the `echo` command by running:
You can continue to learn more about the `each` command by running:
```
tutor each
```
@ -370,6 +371,29 @@ same value using:
"#
}
fn engineq_tutor() -> &'static str {
r#"
Engine-q is the upcoming engine for Nushell. Built for speed and correctness,
it also comes with a set of changes from Nushell versions prior to 0.60. To
get ready for engine-q, look out for some of these changes that might impact
your current scripts:
* Engine-q now uses a few new data structures, including a record syntax
that allows you to model key-value pairs similar to JSON objects.
* Environment variables can now contain more than just strings. Structured
values are converted to strings for external commands using converters.
* `if` will now use an `else` keyword before the else block.
* We're moving from "config.toml" to "config.nu". This means startup will
now be a script file.
* `config` and its subcommands are being replaced by a record that you can
update in the shell which contains all the settings under the variable
`$config`.
* bigint/bigdecimal values are now machine i64 and f64 values.
* And more; you can read about further upcoming changes in the up-to-date list
at: https://github.com/nushell/engine-q/issues/522
"#
}
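To make the list above concrete, two hypothetical engine-q snippets illustrating the `else` keyword and the record syntax; this is post-0.60 syntax that will not run on 0.43, and the config keys shown are placeholders:
```
# engine-q: `if` takes an explicit `else` keyword
if $x > 3 { echo "big" } else { echo "small" }

# engine-q: record syntax for key-value data, e.g. inside config.nu
let config = { edit_mode: vi, table_mode: rounded }
```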
fn display(tag: Tag, scope: &Scope, help: &str) -> OutputStream {
let help = help.split('`');

View File

@ -221,11 +221,6 @@ fn features_enabled() -> Vec<String> {
names.push("zip".to_string());
}
#[cfg(feature = "clipboard-cli")]
{
names.push("clipboard-cli".to_string());
}
#[cfg(feature = "trash-support")]
{
names.push("trash".to_string());

View File

@ -121,7 +121,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tail = df.as_ref().get_columns().iter().map(|col| {
let count = col.len() as f64;
let sum = match col.sum_as_series().cast_with_dtype(&DataType::Float64) {
let sum = match col.sum_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -144,7 +144,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
_ => None,
};
let min = match col.min_as_series().cast_with_dtype(&DataType::Float64) {
let min = match col.min_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -153,7 +153,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_25 = match col.quantile_as_series(0.25) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -164,7 +164,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_50 = match col.quantile_as_series(0.50) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -175,7 +175,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_75 = match col.quantile_as_series(0.75) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -185,7 +185,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
Err(_) => None,
};
let max = match col.max_as_series().cast_with_dtype(&DataType::Float64) {
let max = match col.max_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,

View File

@ -44,6 +44,12 @@ impl WholeStreamCommand for DataFrame {
"type of join. Inner by default",
Some('t'),
)
.named(
"suffix",
SyntaxShape::String,
"suffix for the columns of the right dataframe",
Some('s'),
)
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@ -104,6 +110,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let r_df: Value = args.req(0)?;
let l_col: Vec<Value> = args.req_named("left")?;
let r_col: Vec<Value> = args.req_named("right")?;
let r_suffix: Option<Tagged<String>> = args.get_flag("suffix")?;
let join_type_op: Option<Tagged<String>> = args.get_flag("type")?;
let join_type = match join_type_op {
@ -124,6 +131,8 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
},
};
let suffix = r_suffix.map(|s| s.item);
let (l_col_string, l_col_span) = convert_columns(&l_col, &tag)?;
let (r_col_string, r_col_span) = convert_columns(&r_col, &tag)?;
@ -142,7 +151,13 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
)?;
df.as_ref()
.join(r_df.as_ref(), &l_col_string, &r_col_string, join_type)
.join(
r_df.as_ref(),
&l_col_string,
&r_col_string,
join_type,
suffix,
)
.map_err(|e| parse_polars_error::<&str>(&e, &l_col_span, None))
}
_ => Err(ShellError::labeled_error(

View File

@ -8,7 +8,7 @@ use nu_protocol::{
};
use nu_source::Tagged;
use polars::prelude::{CsvEncoding, CsvReader, JsonReader, ParquetReader, PolarsError, SerReader};
use polars::prelude::{CsvEncoding, CsvReader, JsonReader, ParquetReader, SerReader};
use std::fs::File;
pub struct DataFrame;
@ -206,15 +206,6 @@ fn from_csv(args: CommandArgs) -> Result<polars::prelude::DataFrame, ShellError>
match csv_reader.finish() {
Ok(df) => Ok(df),
Err(e) => match e {
PolarsError::Other(_) => Err(ShellError::labeled_error_with_secondary(
"Schema error",
"Error with the inferred schema",
&file.tag.span,
"You can use the argument 'infer_schema' with a number of rows large enough to better infer the schema",
&file.tag.span,
)),
_ => Err(parse_polars_error::<&str>(&e, &file.tag.span, None)),
},
Err(e) => Err(parse_polars_error::<&str>(&e, &file.tag.span, None)),
}
}

View File

@ -101,9 +101,9 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let cum_type = CumType::from_str(&cum_type.item, &cum_type.tag.span)?;
let mut res = match cum_type {
CumType::Max => series.cum_max(reverse),
CumType::Min => series.cum_min(reverse),
CumType::Sum => series.cum_sum(reverse),
CumType::Max => series.cummax(reverse),
CumType::Min => series.cummin(reverse),
CumType::Sum => series.cumsum(reverse),
};
let name = format!("{}_{}", series.name(), cum_type.to_str());

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.day().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.hour().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.minute().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.month().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.nanosecond().into_series();

View File

@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.ordinal().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.second().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.week().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.weekday().into_series();

View File

@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.year().into_series();

View File

@ -6,7 +6,7 @@ use nu_protocol::{
Signature, SyntaxShape, UntaggedValue,
};
use nu_source::Tagged;
use polars::prelude::DataType;
use polars::prelude::{DataType, RollingOptions};
enum RollType {
Min,
@ -57,7 +57,6 @@ impl WholeStreamCommand for DataFrame {
Signature::build("dataframe rolling")
.required("type", SyntaxShape::String, "rolling operation")
.required("window", SyntaxShape::Int, "Window size for rolling")
.switch("ignore_nulls", "Ignore nulls in column", Some('i'))
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@ -112,7 +111,6 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let roll_type: Tagged<String> = args.req(0)?;
let window_size: Tagged<i64> = args.req(1)?;
let ignore_nulls = args.has_flag("ignore_nulls");
let (df, df_tag) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
let series = df.as_series(&df_tag.span)?;
@ -126,31 +124,17 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
}
let roll_type = RollType::from_str(&roll_type.item, &roll_type.tag.span)?;
let rolling_opts = RollingOptions {
window_size: window_size.item as usize,
min_periods: window_size.item as usize,
weights: None,
center: false,
};
let res = match roll_type {
RollType::Max => series.rolling_max(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Min => series.rolling_min(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Sum => series.rolling_sum(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Mean => series.rolling_mean(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Max => series.rolling_max(rolling_opts),
RollType::Min => series.rolling_min(rolling_opts),
RollType::Sum => series.rolling_sum(rolling_opts),
RollType::Mean => series.rolling_mean(rolling_opts),
};
let mut res = res.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;

View File

@ -78,7 +78,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let casted = match indices.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => indices
.as_ref()
.cast_with_dtype(&DataType::UInt32)
.cast(&DataType::UInt32)
.map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
_ => Err(ShellError::labeled_error_with_secondary(
"Incorrect type",

View File

@ -58,7 +58,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.strftime(&fmt.item).into_series();

View File

@ -92,7 +92,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let casted = match series.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => series
.as_ref()
.cast_with_dtype(&DataType::UInt32)
.cast(&DataType::UInt32)
.map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
_ => Err(ShellError::labeled_error_with_secondary(
"Incorrect type",

View File

@ -73,9 +73,9 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let writer = CsvWriter::new(&mut file);
let writer = if no_header {
writer.has_headers(false)
writer.has_header(false)
} else {
writer.has_headers(true)
writer.has_header(true)
};
let writer = match delimiter {

View File

@ -53,13 +53,12 @@ pub(crate) fn parse_polars_error<T: AsRef<str>>(
PolarsError::DataTypeMisMatch(_) => "Data Type Mismatch",
PolarsError::NotFound(_) => "Not Found",
PolarsError::ShapeMisMatch(_) => "Shape Mismatch",
PolarsError::Other(_) => "Other",
PolarsError::ComputeError(_) => "Computer error",
PolarsError::OutOfBounds(_) => "Out Of Bounds",
PolarsError::NoSlice => "No Slice",
PolarsError::NoData(_) => "No Data",
PolarsError::ValueError(_) => "Value Error",
PolarsError::MemoryNotAligned => "Memory Not Aligned",
PolarsError::ParquetError(_) => "Parquet Error",
PolarsError::RandError(_) => "Rand Error",
PolarsError::HasNullValues(_) => "Has Null Values",
PolarsError::UnknownSchema(_) => "Unknown Schema",

View File

@ -11,12 +11,6 @@ use nu_protocol::{
pub struct WithEnv;
#[derive(Deserialize, Debug)]
struct WithEnvArgs {
variable: Value,
block: CapturedBlock,
}
impl WholeStreamCommand for WithEnv {
fn name(&self) -> &str {
"with-env"

View File

@ -197,7 +197,7 @@ fn process_row(
} else {
let mut obj = input.clone();
for column in column_paths.clone() {
for column in column_paths {
let path = UntaggedValue::Primitive(Primitive::ColumnPath(column.clone()))
.into_value(tag);
let data = r.get_data(&as_string(&path)?).borrow().clone();

View File

@ -36,6 +36,7 @@ mod skip;
pub(crate) mod sort_by;
mod uniq;
mod update;
mod update_cells;
mod where_;
mod wrap;
mod zip_;
@ -78,6 +79,7 @@ pub use skip::{Skip, SkipUntil, SkipWhile};
pub use sort_by::SortBy;
pub use uniq::Uniq;
pub use update::Command as Update;
pub use update_cells::SubCommand as UpdateCells;
pub use where_::Command as Where;
pub use wrap::Wrap;
pub use zip_::Command as Zip;

View File

@ -15,11 +15,18 @@ impl WholeStreamCommand for Command {
}
fn signature(&self) -> Signature {
Signature::build("select").rest(
"rest",
SyntaxShape::ColumnPath,
"the columns to select from the table",
)
Signature::build("select")
.named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
.rest(
"rest",
SyntaxShape::ColumnPath,
"the columns to select from the table",
)
}
fn usage(&self) -> &str {
@ -27,10 +34,10 @@ impl WholeStreamCommand for Command {
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let columns: Vec<ColumnPath> = args.rest(0)?;
let mut columns = args.rest(0)?;
columns.extend(column_paths_from_args(&args)?);
let input = args.input;
let name = args.call_info.name_tag;
select(name, columns, input)
}
@ -46,10 +53,51 @@ impl WholeStreamCommand for Command {
example: "ls | select name size",
result: None,
},
Example {
description: "Select columns dynamically",
example: "[[a b]; [1 2]] | select -c [a]",
result: Some(vec![UntaggedValue::row(indexmap! {
"a".to_string() => UntaggedValue::int(1).into(),
})
.into()]),
},
]
}
}
fn column_paths_from_args(args: &CommandArgs) -> Result<Vec<ColumnPath>, ShellError> {
let column_paths: Option<Vec<Value>> = args.get_flag("columns")?;
let has_columns = column_paths.is_some();
let column_paths = match column_paths {
Some(cols) => {
let mut c = Vec::new();
for col in cols {
let colpath = ColumnPath::build(&col.convert_to_string().spanned_unknown());
if !colpath.is_empty() {
c.push(colpath)
}
}
c
}
None => Vec::new(),
};
if has_columns && column_paths.is_empty() {
let colval: Option<Value> = args.get_flag("columns")?;
let colspan = match colval {
Some(v) => v.tag.span,
None => Span::unknown(),
};
return Err(ShellError::labeled_error(
"Requires a list of columns",
"must be a list of columns",
colspan,
));
}
Ok(column_paths)
}
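Side by side, the existing positional form and the new flag-based form; both lines are taken from the command's examples:
```
ls | select name size           # positional column paths, unchanged
[[a b]; [1 2]] | select -c [a]  # dynamic selection via the new --columns/-c flag
```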
fn select(
name: Tag,
columns: Vec<ColumnPath>,

View File

@ -0,0 +1,211 @@
use crate::prelude::*;
use nu_engine::run_block;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{
hir::{CapturedBlock, ExternalRedirection},
Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value,
};
use std::collections::HashSet;
use std::iter::FromIterator;
pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"update cells"
}
fn signature(&self) -> Signature {
Signature::build("update cells")
.required(
"block",
SyntaxShape::Block,
"the block to run an update for each cell",
)
.named(
"columns",
SyntaxShape::Table,
"list of columns to update",
Some('c'),
)
}
fn usage(&self) -> &str {
"Update the table cells."
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
update_cells(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Update the zero value cells to empty strings.",
example: r#"[
[2021-04-16, 2021-06-10, 2021-09-18, 2021-10-15, 2021-11-16, 2021-11-17, 2021-11-18];
[ 37, 0, 0, 0, 37, 0, 0]
] | update cells {|value|
if ($value | into int) == 0 {
""
} {
$value
}
}"#,
result: Some(vec![UntaggedValue::row(indexmap! {
"2021-04-16".to_string() => UntaggedValue::int(37).into(),
"2021-06-10".to_string() => Value::from(""),
"2021-09-18".to_string() => Value::from(""),
"2021-10-15".to_string() => Value::from(""),
"2021-11-16".to_string() => UntaggedValue::int(37).into(),
"2021-11-17".to_string() => Value::from(""),
"2021-11-18".to_string() => Value::from(""),
})
.into()]),
},
Example {
description: "Update the zero value cells to empty strings in 2 last columns.",
example: r#"[
[2021-04-16, 2021-06-10, 2021-09-18, 2021-10-15, 2021-11-16, 2021-11-17, 2021-11-18];
[ 37, 0, 0, 0, 37, 0, 0]
] | update cells -c ["2021-11-18", "2021-11-17"] {|value|
if ($value | into int) == 0 {
""
} {
$value
}
}"#,
result: Some(vec![UntaggedValue::row(indexmap! {
"2021-04-16".to_string() => UntaggedValue::int(37).into(),
"2021-06-10".to_string() => UntaggedValue::int(0).into(),
"2021-09-18".to_string() => UntaggedValue::int(0).into(),
"2021-10-15".to_string() => UntaggedValue::int(0).into(),
"2021-11-16".to_string() => UntaggedValue::int(37).into(),
"2021-11-17".to_string() => Value::from(""),
"2021-11-18".to_string() => Value::from(""),
})
.into()]),
},
]
}
}
fn update_cells(args: CommandArgs) -> Result<OutputStream, ShellError> {
let context = Arc::new(args.context.clone());
let external_redirection = args.call_info.args.external_redirection;
let block: CapturedBlock = args.req(0)?;
let block = Arc::new(block);
let columns = args
.get_flag("columns")?
.map(|x: Value| HashSet::from_iter(x.table_entries().map(|val| val.convert_to_string())));
let columns = Arc::new(columns);
Ok(args
.input
.flat_map(move |input| {
let block = block.clone();
let context = context.clone();
if input.is_row() {
OutputStream::one(process_cells(
block,
columns.clone(),
context,
input,
external_redirection,
))
} else {
match process_input(block, context, input, external_redirection) {
Ok(s) => s,
Err(e) => OutputStream::one(Value::error(e)),
}
}
})
.into_output_stream())
}
pub fn process_input(
captured_block: Arc<CapturedBlock>,
context: Arc<EvaluationContext>,
input: Value,
external_redirection: ExternalRedirection,
) -> Result<OutputStream, ShellError> {
let input_clone = input.clone();
// When we process a row, we need to know whether the block wants to have the contents of the row as
// a parameter to the block (so it gets assigned to a variable that can be used inside the block) or
// if it wants the contents as an input stream
let input_stream = if !captured_block.block.params.positional.is_empty() {
InputStream::empty()
} else {
vec![Ok(input_clone)].into_iter().into_input_stream()
};
context.scope.enter_scope();
context.scope.add_vars(&captured_block.captured.entries);
if let Some((arg, _)) = captured_block.block.params.positional.first() {
context.scope.add_var(arg.name(), input);
} else {
context.scope.add_var("$it", input);
}
let result = run_block(
&captured_block.block,
&context,
input_stream,
external_redirection,
);
context.scope.exit_scope();
result
}
pub fn process_cells(
captured_block: Arc<CapturedBlock>,
columns: Arc<Option<HashSet<String>>>,
context: Arc<EvaluationContext>,
input: Value,
external_redirection: ExternalRedirection,
) -> Value {
TaggedDictBuilder::build(input.tag(), |row| {
input.row_entries().for_each(|(column, cell_value)| {
match &*columns {
Some(col) if !col.contains(column) => {
row.insert_value(column, cell_value.clone());
return;
}
_ => {}
};
let cell_processed = process_input(
captured_block.clone(),
context.clone(),
cell_value.clone(),
external_redirection,
)
.map(|it| it.into_vec())
.map_err(Value::error);
match cell_processed {
Ok(value) => {
match value.get(0) {
Some(one) => {
row.insert_value(column, one.clone());
}
None => {
row.insert_untagged(column, UntaggedValue::nothing());
}
};
}
Err(reason) => {
row.insert_value(column, reason);
}
}
});
})
}
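A condensed usage sketch of the new command, adapted from the examples above (column names are shortened placeholders):
```
# blank out zero cells across the whole row
[[a b c]; [37 0 0]] | update cells {|value|
    if ($value | into int) == 0 { "" } { $value }
}

# restrict the update to selected columns with -c
[[a b c]; [37 0 0]] | update cells -c [b] {|value|
    if ($value | into int) == 0 { "" } { $value }
}
```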

View File

@ -136,7 +136,8 @@ fn to_string_tagged_value(v: &Value) -> Result<String, ShellError> {
| Primitive::Boolean(_)
| Primitive::Decimal(_)
| Primitive::FilePath(_)
| Primitive::Int(_),
| Primitive::Int(_)
| Primitive::BigInt(_),
) => as_string(v),
UntaggedValue::Primitive(Primitive::Date(d)) => Ok(d.to_string()),
UntaggedValue::Primitive(Primitive::Nothing) => Ok(String::new()),

View File

@ -163,7 +163,7 @@ fn get_current_date() -> (i32, u32, u32) {
fn add_months_of_year_to_table(
args: &CommandArgs,
mut calendar_vec_deque: &mut VecDeque<Value>,
calendar_vec_deque: &mut VecDeque<Value>,
tag: &Tag,
selected_year: i32,
(start_month, end_month): (u32, u32),
@ -181,7 +181,7 @@ fn add_months_of_year_to_table(
let add_month_to_table_result = add_month_to_table(
args,
&mut calendar_vec_deque,
calendar_vec_deque,
tag,
selected_year,
month_number,

View File

@ -112,7 +112,11 @@ mod tests {
fn only_examples() -> Vec<Command> {
let mut commands = full_tests();
commands.extend([whole_stream_command(Zip), whole_stream_command(Flatten)]);
commands.extend([
whole_stream_command(UpdateCells),
whole_stream_command(Zip),
whole_stream_command(Flatten),
]);
commands
}

View File

@ -36,6 +36,7 @@ impl WholeStreamCommand for Command {
Some('p'),
)
.switch("raw", "fetch contents as text rather than a table", Some('r'))
.switch("insecure", "allow insecure server connections when using SSL", Some('k'))
.filter()
}
@ -78,6 +79,7 @@ fn run_fetch(args: CommandArgs) -> Result<ActionStream, ShellError> {
)
})?,
fetch_helper.has_raw,
fetch_helper.has_insecure,
fetch_helper.user.clone(),
fetch_helper.password,
))]
@ -92,6 +94,7 @@ pub struct Fetch {
pub path: Option<Value>,
pub tag: Tag,
pub has_raw: bool,
pub has_insecure: bool,
pub user: Option<String>,
pub password: Option<String>,
}
@ -102,6 +105,7 @@ impl Fetch {
path: None,
tag: Tag::unknown(),
has_raw: false,
has_insecure: false,
user: None,
password: None,
}
@ -121,6 +125,8 @@ impl Fetch {
self.has_raw = args.has_flag("raw");
self.has_insecure = args.has_flag("insecure");
self.user = args.get_flag("user")?;
self.password = args.get_flag("password")?;
@ -132,13 +138,14 @@ impl Fetch {
pub async fn fetch(
path: &Value,
has_raw: bool,
has_insecure: bool,
user: Option<String>,
password: Option<String>,
) -> ReturnValue {
let path_str = path.as_string()?;
let path_span = path.tag.span;
let result = helper(&path_str, path_span, has_raw, user, password).await;
let result = helper(&path_str, path_span, has_raw, has_insecure, user, password).await;
if let Err(e) = result {
return Err(e);
@ -168,6 +175,7 @@ async fn helper(
location: &str,
span: Span,
has_raw: bool,
has_insecure: bool,
user: Option<String>,
password: Option<String>,
) -> std::result::Result<(Option<String>, Value), ShellError> {
@ -188,7 +196,7 @@ async fn helper(
_ => None,
};
let client = http_client();
let client = http_client(has_insecure);
let mut request = client.get(url);
if let Some(login) = login {
@ -360,10 +368,10 @@ async fn helper(
// Only panics if the user agent is invalid but we define it statically so either
// it always or never fails
#[allow(clippy::unwrap_used)]
fn http_client() -> reqwest::Client {
fn http_client(allow_insecure: bool) -> reqwest::Client {
reqwest::Client::builder()
.user_agent("nushell")
.danger_accept_invalid_certs(allow_insecure)
.build()
.unwrap()
.expect("Failed to build reqwest client")
}
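From the shell, the new flag is used like this; a sketch with a placeholder URL:
```
fetch --insecure https://localhost:8443/status   # accept a self-signed certificate
fetch -k https://localhost:8443/status           # short form
```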

View File

@ -53,6 +53,11 @@ impl WholeStreamCommand for Command {
"return values as a string instead of a table",
Some('r'),
)
.switch(
"insecure",
"allow insecure server connections when using SSL",
Some('k'),
)
.filter()
}
@ -91,6 +96,7 @@ fn run_post(args: CommandArgs) -> Result<ActionStream, ShellError> {
ShellError::labeled_error("expected a 'path'", "expected a 'path'", &helper.tag)
})?,
helper.has_raw,
helper.has_insecure,
&helper.body.clone().ok_or_else(|| {
ShellError::labeled_error("expected a 'body'", "expected a 'body'", &helper.tag)
})?,
@ -114,6 +120,7 @@ pub enum HeaderKind {
pub struct Post {
pub path: Option<Value>,
pub has_raw: bool,
pub has_insecure: bool,
pub body: Option<Value>,
pub user: Option<String>,
pub password: Option<String>,
@ -126,6 +133,7 @@ impl Post {
Post {
path: None,
has_raw: false,
has_insecure: false,
body: None,
user: None,
password: None,
@ -156,6 +164,8 @@ impl Post {
self.has_raw = args.has_flag("raw");
self.has_insecure = args.has_flag("insecure");
self.user = args.get_flag("user")?;
self.password = args.get_flag("password")?;
@ -169,6 +179,7 @@ impl Post {
pub async fn post_helper(
path: &Value,
has_raw: bool,
has_insecure: bool,
body: &Value,
user: Option<String>,
password: Option<String>,
@ -177,8 +188,16 @@ pub async fn post_helper(
let path_tag = path.tag.clone();
let path_str = path.as_string()?;
let (file_extension, contents, contents_tag) =
post(&path_str, body, user, password, headers, path_tag.clone()).await?;
let (file_extension, contents, contents_tag) = post(
&path_str,
has_insecure,
body,
user,
password,
headers,
path_tag.clone(),
)
.await?;
let file_extension = if has_raw {
None
@ -202,6 +221,7 @@ pub async fn post_helper(
pub async fn post(
location: &str,
allow_insecure: bool,
body: &Value,
user: Option<String>,
password: Option<String>,
@ -219,7 +239,9 @@ pub async fn post(
value: UntaggedValue::Primitive(Primitive::String(body_str)),
..
} => {
let mut s = http_client().post(location).body(body_str.to_string());
let mut s = http_client(allow_insecure)
.post(location)
.body(body_str.to_string());
if let Some(login) = login {
s = s.header("Authorization", format!("Basic {}", login));
}
@ -237,7 +259,9 @@ pub async fn post(
value: UntaggedValue::Primitive(Primitive::Binary(b)),
..
} => {
let mut s = http_client().post(location).body(Vec::from(&b[..]));
let mut s = http_client(allow_insecure)
.post(location)
.body(Vec::from(&b[..]));
if let Some(login) = login {
s = s.header("Authorization", format!("Basic {}", login));
}
@ -247,7 +271,9 @@ pub async fn post(
match value_to_json_value(&value.clone().into_untagged_value()) {
Ok(json_value) => match serde_json::to_string(&json_value) {
Ok(result_string) => {
let mut s = http_client().post(location).body(result_string);
let mut s = http_client(allow_insecure)
.post(location)
.body(result_string);
if let Some(login) = login {
s = s.header("Authorization", format!("Basic {}", login));
@ -611,10 +637,10 @@ fn extract_header_value(args: &CommandArgs, key: &str) -> Result<Option<String>,
// Only panics if the user agent is invalid but we define it statically so either
// it always or never fails
#[allow(clippy::unwrap_used)]
fn http_client() -> reqwest::Client {
fn http_client(allow_insecure: bool) -> reqwest::Client {
reqwest::Client::builder()
.user_agent("nushell")
.danger_accept_invalid_certs(allow_insecure)
.build()
.unwrap()
.expect("Failed to build reqwest client")
}
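And the matching sketch for `post`, assuming its existing `<path> <body>` positional arguments (URL and body are placeholders):
```
post --insecure https://localhost:8443/api 'hello'
post -k https://localhost:8443/api 'hello'   # short form
```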

View File

@ -1,4 +1,4 @@
use super::{operate, PathSubcommandArguments};
use super::{column_paths_from_args, operate, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -9,13 +9,13 @@ use std::path::Path;
pub struct PathBasename;
struct PathBasenameArguments {
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
replace: Option<Tagged<String>>,
}
impl PathSubcommandArguments for PathBasenameArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -26,10 +26,11 @@ impl WholeStreamCommand for PathBasename {
fn signature(&self) -> Signature {
Signature::build("path basename")
.rest(
"rest",
SyntaxShape::ColumnPath,
.named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
.named(
"replace",
@ -46,7 +47,7 @@ impl WholeStreamCommand for PathBasename {
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathBasenameArguments {
rest: args.rest(0)?,
columns: column_paths_from_args(&args)?,
replace: args.get_flag("replace")?,
});
@ -58,12 +59,17 @@ impl WholeStreamCommand for PathBasename {
vec![
Example {
description: "Get basename of a path",
example: "echo 'C:\\Users\\joe\\test.txt' | path basename",
example: "'C:\\Users\\joe\\test.txt' | path basename",
result: Some(vec![Value::from("test.txt")]),
},
Example {
description: "Get basename of a path in a column",
example: "ls .. | path basename -c [ name ]",
result: None,
},
Example {
description: "Replace basename of a path",
example: "echo 'C:\\Users\\joe\\test.txt' | path basename -r 'spam.png'",
example: "'C:\\Users\\joe\\test.txt' | path basename -r 'spam.png'",
result: Some(vec![Value::from(UntaggedValue::filepath(
"C:\\Users\\joe\\spam.png",
))]),
@ -76,12 +82,17 @@ impl WholeStreamCommand for PathBasename {
vec![
Example {
description: "Get basename of a path",
example: "echo '/home/joe/test.txt' | path basename",
example: "'/home/joe/test.txt' | path basename",
result: Some(vec![Value::from("test.txt")]),
},
Example {
description: "Get basename of a path in a column",
example: "ls .. | path basename -c [ name ]",
result: None,
},
Example {
description: "Replace basename of a path",
example: "echo '/home/joe/test.txt' | path basename -r 'spam.png'",
example: "'/home/joe/test.txt' | path basename -r 'spam.png'",
result: Some(vec![Value::from(UntaggedValue::filepath(
"/home/joe/spam.png",
))]),

View File

@ -1,4 +1,4 @@
use super::{operate, PathSubcommandArguments};
use super::{column_paths_from_args, operate, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -9,14 +9,14 @@ use std::path::Path;
pub struct PathDirname;
struct PathDirnameArguments {
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
replace: Option<Tagged<String>>,
num_levels: Option<Tagged<u32>>,
}
impl PathSubcommandArguments for PathDirnameArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -27,10 +27,11 @@ impl WholeStreamCommand for PathDirname {
fn signature(&self) -> Signature {
Signature::build("path dirname")
.rest(
"rest",
SyntaxShape::ColumnPath,
.named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
.named(
"replace",
@ -53,7 +54,7 @@ impl WholeStreamCommand for PathDirname {
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathDirnameArguments {
rest: args.rest(0)?,
columns: column_paths_from_args(&args)?,
replace: args.get_flag("replace")?,
num_levels: args.get_flag("num-levels")?,
});
@ -66,20 +67,25 @@ impl WholeStreamCommand for PathDirname {
vec![
Example {
description: "Get dirname of a path",
example: "echo 'C:\\Users\\joe\\code\\test.txt' | path dirname",
example: "'C:\\Users\\joe\\code\\test.txt' | path dirname",
result: Some(vec![Value::from(UntaggedValue::filepath(
"C:\\Users\\joe\\code",
))]),
},
Example {
description: "Get dirname of a path in a column",
example: "ls ('.' | path expand) | path dirname -c [ name ]",
result: None,
},
Example {
description: "Walk up two levels",
example: "echo 'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2",
example: "'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2",
result: Some(vec![Value::from(UntaggedValue::filepath("C:\\Users\\joe"))]),
},
Example {
description: "Replace the part that would be returned with a custom path",
example:
"echo 'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2 -r C:\\Users\\viking",
"'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2 -r C:\\Users\\viking",
result: Some(vec![Value::from(UntaggedValue::filepath(
"C:\\Users\\viking\\code\\test.txt",
))]),
@ -92,17 +98,22 @@ impl WholeStreamCommand for PathDirname {
vec![
Example {
description: "Get dirname of a path",
example: "echo '/home/joe/code/test.txt' | path dirname",
example: "'/home/joe/code/test.txt' | path dirname",
result: Some(vec![Value::from(UntaggedValue::filepath("/home/joe/code"))]),
},
Example {
description: "Get dirname of a path in a column",
example: "ls ('.' | path expand) | path dirname -c [ name ]",
result: None,
},
Example {
description: "Walk up two levels",
example: "echo '/home/joe/code/test.txt' | path dirname -n 2",
example: "'/home/joe/code/test.txt' | path dirname -n 2",
result: Some(vec![Value::from(UntaggedValue::filepath("/home/joe"))]),
},
Example {
description: "Replace the part that would be returned with a custom path",
example: "echo '/home/joe/code/test.txt' | path dirname -n 2 -r /home/viking",
example: "'/home/joe/code/test.txt' | path dirname -n 2 -r /home/viking",
result: Some(vec![Value::from(UntaggedValue::filepath(
"/home/viking/code/test.txt",
))]),

View File

@ -1,4 +1,4 @@
use super::{operate, PathSubcommandArguments};
use super::{column_paths_from_args, operate, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -8,12 +8,12 @@ use std::path::Path;
pub struct PathExists;
struct PathExistsArguments {
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
}
impl PathSubcommandArguments for PathExistsArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -23,10 +23,11 @@ impl WholeStreamCommand for PathExists {
}
fn signature(&self) -> Signature {
Signature::build("path exists").rest(
"rest",
SyntaxShape::ColumnPath,
Signature::build("path exists").named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
}
@ -37,7 +38,7 @@ impl WholeStreamCommand for PathExists {
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathExistsArguments {
rest: args.rest(0)?,
columns: column_paths_from_args(&args)?,
});
Ok(operate(args.input, &action, tag.span, cmd_args))
@ -45,20 +46,34 @@ impl WholeStreamCommand for PathExists {
#[cfg(windows)]
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Check if a file exists",
example: "echo 'C:\\Users\\joe\\todo.txt' | path exists",
result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
}]
vec![
Example {
description: "Check if a file exists",
example: "'C:\\Users\\joe\\todo.txt' | path exists",
result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
},
Example {
description: "Check if a file exists in a column",
example: "ls | path exists -c [ name ]",
result: None,
},
]
}
#[cfg(not(windows))]
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Check if a file exists",
example: "echo '/home/joe/todo.txt' | path exists",
result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
}]
vec![
Example {
description: "Check if a file exists",
example: "'/home/joe/todo.txt' | path exists",
result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
},
Example {
description: "Check if a file exists in a column",
example: "ls | path exists -c [ name ]",
result: None,
},
]
}
}

View File

@ -1,4 +1,4 @@
use super::{operate, PathSubcommandArguments};
use super::{column_paths_from_args, operate, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -11,12 +11,12 @@ pub struct PathExpand;
struct PathExpandArguments {
strict: bool,
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
}
impl PathSubcommandArguments for PathExpandArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -32,10 +32,11 @@ impl WholeStreamCommand for PathExpand {
"Throw an error if the path could not be expanded",
Some('s'),
)
.rest(
"rest",
SyntaxShape::ColumnPath,
.named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
}
@ -47,7 +48,7 @@ impl WholeStreamCommand for PathExpand {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathExpandArguments {
strict: args.has_flag("strict"),
rest: args.rest(0)?,
columns: column_paths_from_args(&args)?,
});
Ok(operate(args.input, &action, tag.span, cmd_args))
@ -63,6 +64,11 @@ impl WholeStreamCommand for PathExpand {
UntaggedValue::filepath(r"C:\Users\joe\bar").into_value(Span::new(0, 25))
]),
},
Example {
description: "Expand a path in a column",
example: "ls | path expand -c [ name ]",
result: None,
},
Example {
description: "Expand a relative path",
example: r"'foo\..\bar' | path expand",
@ -83,6 +89,11 @@ impl WholeStreamCommand for PathExpand {
UntaggedValue::filepath("/home/joe/bar").into_value(Span::new(0, 22))
]),
},
Example {
description: "Expand a path in a column",
example: "ls | path expand -c [ name ]",
result: None,
},
Example {
description: "Expand a relative path",
example: "'foo/../bar' | path expand",

View File

@ -1,4 +1,6 @@
use super::{handle_value, join_path, operate_column_paths, PathSubcommandArguments};
use super::{
column_paths_from_args, handle_value, join_path, operate_column_paths, PathSubcommandArguments,
};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -9,13 +11,13 @@ use std::path::{Path, PathBuf};
pub struct PathJoin;
struct PathJoinArguments {
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
append: Option<Tagged<PathBuf>>,
}
impl PathSubcommandArguments for PathJoinArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -26,16 +28,16 @@ impl WholeStreamCommand for PathJoin {
fn signature(&self) -> Signature {
Signature::build("path join")
.rest(
"rest",
SyntaxShape::ColumnPath,
"Optionally operate by column path",
)
.named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
.optional(
"append",
SyntaxShape::FilePath,
"Path to append to the input",
Some('a'),
)
}
@ -50,9 +52,10 @@ the output of 'path parse' and 'path split' subcommands."#
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathJoinArguments {
rest: args.rest(0)?,
append: args.get_flag("append")?,
columns: column_paths_from_args(&args)?,
append: args.opt(0)?,
});
Ok(operate_join(args.input, &action, tag, cmd_args))
@ -63,21 +66,26 @@ the output of 'path parse' and 'path split' subcommands."#
vec![
Example {
description: "Append a filename to a path",
example: r"echo 'C:\Users\viking' | path join -a spam.txt",
example: r"'C:\Users\viking' | path join spam.txt",
result: Some(vec![Value::from(UntaggedValue::filepath(
r"C:\Users\viking\spam.txt",
))]),
},
Example {
description: "Append a filename to a path inside a column",
example: r"ls | path join spam.txt -c [ name ]",
result: None,
},
Example {
description: "Join a list of parts into a path",
example: r"echo [ 'C:' '\' 'Users' 'viking' 'spam.txt' ] | path join",
example: r"[ 'C:' '\' 'Users' 'viking' 'spam.txt' ] | path join",
result: Some(vec![Value::from(UntaggedValue::filepath(
r"C:\Users\viking\spam.txt",
))]),
},
Example {
description: "Join a structured path into a path",
example: r"echo [ [parent stem extension]; ['C:\Users\viking' 'spam' 'txt']] | path join",
example: r"[ [parent stem extension]; ['C:\Users\viking' 'spam' 'txt']] | path join",
result: Some(vec![Value::from(UntaggedValue::filepath(
r"C:\Users\viking\spam.txt",
))]),
@ -90,21 +98,26 @@ the output of 'path parse' and 'path split' subcommands."#
vec![
Example {
description: "Append a filename to a path",
example: r"echo '/home/viking' | path join -a spam.txt",
example: r"'/home/viking' | path join spam.txt",
result: Some(vec![Value::from(UntaggedValue::filepath(
r"/home/viking/spam.txt",
))]),
},
Example {
description: "Append a filename to a path inside a column",
example: r"ls | path join spam.txt -c [ name ]",
result: None,
},
Example {
description: "Join a list of parts into a path",
example: r"echo [ '/' 'home' 'viking' 'spam.txt' ] | path join",
example: r"[ '/' 'home' 'viking' 'spam.txt' ] | path join",
result: Some(vec![Value::from(UntaggedValue::filepath(
r"/home/viking/spam.txt",
))]),
},
Example {
description: "Join a structured path into a path",
example: r"echo [[ parent stem extension ]; [ '/home/viking' 'spam' 'txt' ]] | path join",
example: r"[[ parent stem extension ]; [ '/home/viking' 'spam' 'txt' ]] | path join",
result: Some(vec![Value::from(UntaggedValue::filepath(
r"/home/viking/spam.txt",
))]),
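The migration for existing `path join` pipelines, pieced together from the old and new examples above:
```
'/home/viking' | path join -a spam.txt   # before: the appended part was the -a flag
'/home/viking' | path join spam.txt      # after: the appended part is a positional argument
ls | path join spam.txt -c [ name ]      # column selection moved behind -c
```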

View File

@ -61,7 +61,7 @@ fn encode_path(
ALLOWED_COLUMNS.join(", ")
);
return Err(ShellError::labeled_error_with_secondary(
"Invalid column name",
"Expected structured path table",
msg,
new_span,
"originates from here",
@ -216,3 +216,36 @@ where
operate_column_paths(input, action, span, args)
}
}
fn column_paths_from_args(args: &CommandArgs) -> Result<Vec<ColumnPath>, ShellError> {
let column_paths: Option<Vec<Value>> = args.get_flag("columns")?;
let has_columns = column_paths.is_some();
let column_paths = match column_paths {
Some(cols) => {
let mut c = Vec::new();
for col in cols {
let colpath = ColumnPath::build(&col.convert_to_string().spanned_unknown());
if !colpath.is_empty() {
c.push(colpath)
}
}
c
}
None => Vec::new(),
};
if has_columns && column_paths.is_empty() {
let colval: Option<Value> = args.get_flag("columns")?;
let colspan = match colval {
Some(v) => v.tag.span,
None => Span::unknown(),
};
return Err(ShellError::labeled_error(
"Requires a list of columns",
"must be a list of columns",
colspan,
));
}
Ok(column_paths)
}
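This helper gives every `path` subcommand the same `-c` column flag; the resulting pattern, using examples added elsewhere in this changeset:
```
ls | path exists -c [ name ]
ls ('.' | path expand) | path dirname -c [ name ]
```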

View File

@ -1,4 +1,4 @@
use super::{operate, PathSubcommandArguments};
use super::{column_paths_from_args, operate, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -11,13 +11,13 @@ use std::path::Path;
pub struct PathParse;
struct PathParseArguments {
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
extension: Option<Tagged<String>>,
}
impl PathSubcommandArguments for PathParseArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -28,10 +28,11 @@ impl WholeStreamCommand for PathParse {
fn signature(&self) -> Signature {
Signature::build("path parse")
.rest(
"rest",
SyntaxShape::ColumnPath,
.named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
.named(
"extension",
@ -53,7 +54,7 @@ On Windows, an extra 'prefix' column is added."#
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathParseArguments {
rest: args.rest(0)?,
columns: column_paths_from_args(&args)?,
extension: args.get_flag("extension")?,
});
@ -65,22 +66,22 @@ On Windows, an extra 'prefix' column is added."#
vec![
Example {
description: "Parse a single path",
example: r"echo 'C:\Users\viking\spam.txt' | path parse",
example: r"'C:\Users\viking\spam.txt' | path parse",
result: None,
},
Example {
description: "Replace a complex extension",
example: r"echo 'C:\Users\viking\spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
example: r"'C:\Users\viking\spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
result: None,
},
Example {
description: "Ignore the extension",
example: r"echo 'C:\Users\viking.d' | path parse -e ''",
example: r"'C:\Users\viking.d' | path parse -e ''",
result: None,
},
Example {
description: "Parse all paths under the 'name' column",
example: r"ls | path parse name",
example: r"ls | path parse -c [ name ]",
result: None,
},
]
@ -91,22 +92,22 @@ On Windows, an extra 'prefix' column is added."#
vec![
Example {
description: "Parse a path",
example: r"echo '/home/viking/spam.txt' | path parse",
example: r"'/home/viking/spam.txt' | path parse",
result: None,
},
Example {
description: "Replace a complex extension",
example: r"echo '/home/viking/spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
example: r"'/home/viking/spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
result: None,
},
Example {
description: "Ignore the extension",
example: r"echo '/etc/conf.d' | path parse -e ''",
example: r"'/etc/conf.d' | path parse -e ''",
result: None,
},
Example {
description: "Parse all paths under the 'name' column",
example: r"ls | path parse name",
example: r"ls | path parse -c [ name ]",
result: None,
},
]

View File

@ -1,4 +1,4 @@
use super::{operate, PathSubcommandArguments};
use super::{column_paths_from_args, operate, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -10,12 +10,12 @@ pub struct PathRelativeTo;
struct PathRelativeToArguments {
path: Tagged<PathBuf>,
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
}
impl PathSubcommandArguments for PathRelativeToArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -31,10 +31,11 @@ impl WholeStreamCommand for PathRelativeTo {
SyntaxShape::FilePath,
"Parent shared with the input path",
)
.rest(
"rest",
SyntaxShape::ColumnPath,
.named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
}
@ -52,7 +53,7 @@ path."#
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathRelativeToArguments {
path: args.req(0)?,
rest: args.rest(1)?,
columns: column_paths_from_args(&args)?,
});
Ok(operate(args.input, &action, tag.span, cmd_args))
@ -66,6 +67,11 @@ path."#
example: r"'C:\Users\viking' | path relative-to 'C:\Users'",
result: Some(vec![Value::from(UntaggedValue::filepath(r"viking"))]),
},
Example {
description: "Find a relative path from two absolute paths in a column",
example: "ls ~ | path relative-to ~ -c [ name ]",
result: None,
},
Example {
description: "Find a relative path from two relative paths",
example: r"'eggs\bacon\sausage\spam' | path relative-to 'eggs\bacon\sausage'",
@ -82,6 +88,11 @@ path."#
example: r"'/home/viking' | path relative-to '/home'",
result: Some(vec![Value::from(UntaggedValue::filepath(r"viking"))]),
},
Example {
description: "Find a relative path from two absolute paths in a column",
example: "ls ~ | path relative-to ~ -c [ name ]",
result: None,
},
Example {
description: "Find a relative path from two relative paths",
example: r"'eggs/bacon/sausage/spam' | path relative-to 'eggs/bacon/sausage'",

View File

@ -1,4 +1,4 @@
use super::{handle_value, operate_column_paths, PathSubcommandArguments};
use super::{column_paths_from_args, handle_value, operate_column_paths, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
@ -8,12 +8,12 @@ use std::path::Path;
pub struct PathSplit;
struct PathSplitArguments {
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
}
impl PathSubcommandArguments for PathSplitArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -23,10 +23,11 @@ impl WholeStreamCommand for PathSplit {
}
fn signature(&self) -> Signature {
Signature::build("path split").rest(
"rest",
SyntaxShape::ColumnPath,
Signature::build("path split").named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
}
@ -37,7 +38,7 @@ impl WholeStreamCommand for PathSplit {
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathSplitArguments {
rest: args.rest(0)?,
columns: column_paths_from_args(&args)?,
});
Ok(operate_split(args.input, &action, tag.span, cmd_args))
@ -48,7 +49,7 @@ impl WholeStreamCommand for PathSplit {
vec![
Example {
description: "Split a path into parts",
example: r"echo 'C:\Users\viking\spam.txt' | path split",
example: r"'C:\Users\viking\spam.txt' | path split",
result: Some(vec![
Value::from(UntaggedValue::string("C:")),
Value::from(UntaggedValue::string(r"\")),
@ -59,7 +60,7 @@ impl WholeStreamCommand for PathSplit {
},
Example {
description: "Split all paths under the 'name' column",
example: r"ls | path split name",
example: r"ls ('.' | path expand) | path split -c [ name ]",
result: None,
},
]
@ -70,7 +71,7 @@ impl WholeStreamCommand for PathSplit {
vec![
Example {
description: "Split a path into parts",
example: r"echo '/home/viking/spam.txt' | path split",
example: r"'/home/viking/spam.txt' | path split",
result: Some(vec![
Value::from(UntaggedValue::string("/")),
Value::from(UntaggedValue::string("home")),
@ -80,7 +81,7 @@ impl WholeStreamCommand for PathSplit {
},
Example {
description: "Split all paths under the 'name' column",
example: r"ls | path split name",
example: r"ls ('.' | path expand) | path split -c [ name ]",
result: None,
},
]

View File

@ -1,4 +1,4 @@
use super::{operate, PathSubcommandArguments};
use super::{column_paths_from_args, operate, PathSubcommandArguments};
use crate::prelude::*;
use nu_engine::filesystem::filesystem_shell::get_file_type;
use nu_engine::WholeStreamCommand;
@ -9,12 +9,12 @@ use std::path::Path;
pub struct PathType;
struct PathTypeArguments {
rest: Vec<ColumnPath>,
columns: Vec<ColumnPath>,
}
impl PathSubcommandArguments for PathTypeArguments {
fn get_column_paths(&self) -> &Vec<ColumnPath> {
&self.rest
&self.columns
}
}
@ -24,10 +24,11 @@ impl WholeStreamCommand for PathType {
}
fn signature(&self) -> Signature {
Signature::build("path type").rest(
"rest",
SyntaxShape::ColumnPath,
Signature::build("path type").named(
"columns",
SyntaxShape::Table,
"Optionally operate by column path",
Some('c'),
)
}
@ -38,18 +39,25 @@ impl WholeStreamCommand for PathType {
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let cmd_args = Arc::new(PathTypeArguments {
rest: args.rest(0)?,
columns: column_paths_from_args(&args)?,
});
Ok(operate(args.input, &action, tag.span, cmd_args))
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Show type of a filepath",
example: "echo '.' | path type",
result: Some(vec![Value::from("Dir")]),
}]
vec![
Example {
description: "Show type of a filepath",
example: "'.' | path type",
result: Some(vec![Value::from("Dir")]),
},
Example {
description: "Show type of a filepath in a column",
example: "ls | path type -c [ name ]",
result: None,
},
]
}
}

View File

@ -170,7 +170,7 @@ fn action(
Ok(UntaggedValue::string(gradient_string).into_value(tag))
}
(None, Some(fg_end), None, Some(bg_end)) => {
// missin fg_start and bg_start, so assume black
// missing fg_start and bg_start, so assume black
let fg_start = Rgb::new(0, 0, 0);
let bg_start = Rgb::new(0, 0, 0);
let fg_gradient = Gradient::new(fg_start, fg_end);

View File

@ -1,105 +0,0 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, Value};
use arboard::Clipboard;
pub struct Clip;
impl WholeStreamCommand for Clip {
fn name(&self) -> &str {
"clip"
}
fn signature(&self) -> Signature {
Signature::build("clip")
}
fn usage(&self) -> &str {
"Copy the contents of the pipeline to the copy/paste buffer."
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
clip(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Save text to the clipboard",
example: "echo 'secret value' | clip",
result: None,
},
Example {
description: "Save numbers to the clipboard",
example: "random integer 10000000..99999999 | clip",
result: None,
},
]
}
}
pub fn clip(args: CommandArgs) -> Result<ActionStream, ShellError> {
let input = args.input;
let name = args.call_info.name_tag;
let values: Vec<Value> = input.collect();
if let Ok(mut clip_context) = Clipboard::new() {
let mut new_copy_data = String::new();
if !values.is_empty() {
let mut first = true;
for i in &values {
if !first {
new_copy_data.push('\n');
} else {
first = false;
}
let string: String = i.convert_to_string();
if string.is_empty() {
return Err(ShellError::labeled_error(
"Unable to convert to string",
"Unable to convert to string",
name,
));
}
new_copy_data.push_str(&string);
}
}
match clip_context.set_text(new_copy_data) {
Ok(_) => {}
Err(_) => {
return Err(ShellError::labeled_error(
"Could not set contents of clipboard",
"could not set contents of clipboard",
name,
));
}
}
} else {
return Err(ShellError::labeled_error(
"Could not open clipboard",
"could not open clipboard",
name,
));
}
Ok(ActionStream::empty())
}
#[cfg(test)]
mod tests {
use super::Clip;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(Clip {})
}
}
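
For reference, a minimal sketch of the arboard clipboard API that the removed `clip` and `paste` commands wrapped; the exact constructor and method signatures are an assumption based on an arboard 2.x-style interface, not code taken from this diff.

// Hedged sketch: assumes arboard 2.x-style Clipboard::new/set_text/get_text.
use arboard::Clipboard;

fn main() -> Result<(), arboard::Error> {
    let mut clipboard = Clipboard::new()?;
    // Place text on the system clipboard.
    clipboard.set_text("secret value".to_string())?;
    // Read back whatever is currently on the clipboard.
    println!("{}", clipboard.get_text()?);
    Ok(())
}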

View File

@ -104,7 +104,7 @@ fn kill(args: CommandArgs) -> Result<ActionStream, ShellError> {
}
cmd.arg("-9");
} else if let Some(signal_value) = signal {
cmd.arg(format!("-{}", signal_value.item().to_string()));
cmd.arg(format!("-{}", signal_value.item()));
}
cmd.arg(pid.item().to_string());

View File

@ -1,13 +1,9 @@
mod ansi;
mod benchmark;
mod clear;
#[cfg(feature = "clipboard-cli")]
mod clip;
mod du;
mod exec;
mod kill;
#[cfg(feature = "clipboard-cli")]
mod paste;
mod pwd;
mod run_external;
mod sleep;
@ -17,13 +13,9 @@ mod which_;
pub use ansi::*;
pub use benchmark::Benchmark;
pub use clear::Clear;
#[cfg(feature = "clipboard-cli")]
pub use clip::Clip;
pub use du::Du;
pub use exec::Exec;
pub use kill::Kill;
#[cfg(feature = "clipboard-cli")]
pub use paste::Paste;
pub use pwd::Pwd;
pub use run_external::RunExternalCommand;
pub use sleep::Sleep;

View File

@ -1,61 +0,0 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue};
use arboard::Clipboard;
pub struct Paste;
impl WholeStreamCommand for Paste {
fn name(&self) -> &str {
"paste"
}
fn signature(&self) -> Signature {
Signature::build("paste")
}
fn usage(&self) -> &str {
"Paste contents from the clipboard"
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
paste(args)
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Paste text from your clipboard",
example: "echo 'secret value' | clip | paste",
result: Some(vec![UntaggedValue::Primitive(Primitive::String(
"secret value".to_owned(),
))
.into_value(Tag::default())]),
}]
}
}
pub fn paste(args: CommandArgs) -> Result<ActionStream, ShellError> {
let name = args.call_info.name_tag;
if let Ok(mut clip_context) = Clipboard::new() {
match clip_context.get_text() {
Ok(out) => Ok(ActionStream::one(ReturnSuccess::value(
UntaggedValue::Primitive(Primitive::String(out)),
))),
Err(_) => Err(ShellError::labeled_error(
"Could not get contents of clipboard",
"could not get contents of clipboard",
name,
)),
}
} else {
Err(ShellError::labeled_error(
"Could not open clipboard",
"could not open clipboard",
name,
))
}
}

View File

@ -6,12 +6,6 @@ use nu_protocol::{Dictionary, Signature, UntaggedValue};
pub struct TermSize;
#[derive(Deserialize, Clone)]
pub struct TermSizeArgs {
wide: bool,
tall: bool,
}
impl WholeStreamCommand for TermSize {
fn name(&self) -> &str {
"term size"

View File

@ -0,0 +1,51 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{CommandAction, ReturnSuccess, Signature, SyntaxShape};
pub struct Goto;
impl WholeStreamCommand for Goto {
fn name(&self) -> &str {
"g"
}
fn signature(&self) -> Signature {
Signature::build("g").required("index", SyntaxShape::Int, "the shell's index to go to")
}
fn usage(&self) -> &str {
"Go to specified shell."
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
goto(args)
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Enter the first shell",
example: "g 0",
result: None,
}]
}
}
fn goto(args: CommandArgs) -> Result<ActionStream, ShellError> {
Ok(ActionStream::one(ReturnSuccess::action(
CommandAction::GotoShell(args.req(0)?),
)))
}
#[cfg(test)]
mod tests {
use super::Goto;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(Goto {})
}
}

View File

@ -1,11 +1,13 @@
mod command;
mod enter;
mod exit;
mod goto;
mod next;
mod prev;
pub use command::Shells;
pub use enter::Enter;
pub use exit::Exit;
pub use goto::Goto;
pub use next::Next;
pub use prev::Previous;

View File

@ -0,0 +1,283 @@
use std::{iter::Peekable, str::CharIndices};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue};
use nu_source::Spanned;
type Input<'t> = Peekable<CharIndices<'t>>;
pub struct DetectColumns;
impl WholeStreamCommand for DetectColumns {
fn name(&self) -> &str {
"detect columns"
}
fn signature(&self) -> Signature {
Signature::build("detect columns")
.named(
"skip",
SyntaxShape::Int,
"number of rows to skip before detecting",
Some('s'),
)
.switch("no_headers", "don't detect headers", Some('n'))
}
fn usage(&self) -> &str {
"splits contents across multiple columns via the separator."
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
detect_columns(args)
}
}
fn detect_columns(args: CommandArgs) -> Result<OutputStream, ShellError> {
let name_tag = args.name_tag();
let num_rows_to_skip: Option<usize> = args.get_flag("skip")?;
let noheader = args.has_flag("no_headers");
let input = args.input.collect_string(name_tag.clone())?;
let input: Vec<_> = input
.lines()
.skip(num_rows_to_skip.unwrap_or_default())
.map(|x| x.to_string())
.collect();
let mut input = input.into_iter();
let headers = input.next();
if let Some(orig_headers) = headers {
let headers = find_columns(&orig_headers);
Ok((if noheader {
vec![orig_headers].into_iter().chain(input)
} else {
vec![].into_iter().chain(input)
})
.map(move |x| {
let row = find_columns(&x);
let mut dict = TaggedDictBuilder::new(name_tag.clone());
if headers.len() == row.len() && !noheader {
for (header, val) in headers.iter().zip(row.iter()) {
dict.insert_untagged(&header.item, UntaggedValue::string(&val.item));
}
} else {
let mut pre_output = vec![];
// column counts don't line up, so see if we can figure out why
for cell in row {
for header in &headers {
if cell.span.start() <= header.span.end()
&& cell.span.end() > header.span.start()
{
pre_output
.push((header.item.to_string(), UntaggedValue::string(&cell.item)));
}
}
}
for header in &headers {
let mut found = false;
for pre_o in &pre_output {
if pre_o.0 == header.item {
found = true;
break;
}
}
if !found {
pre_output.push((header.item.to_string(), UntaggedValue::nothing()));
}
}
if noheader {
for header in headers.iter().enumerate() {
for pre_o in &pre_output {
if pre_o.0 == header.1.item {
dict.insert_untagged(format!("Column{}", header.0), pre_o.1.clone())
}
}
}
} else {
for header in &headers {
for pre_o in &pre_output {
if pre_o.0 == header.item {
dict.insert_untagged(&header.item, pre_o.1.clone())
}
}
}
}
}
dict.into_value()
})
.into_output_stream())
} else {
Ok(OutputStream::empty())
}
}
pub fn find_columns(input: &str) -> Vec<Spanned<String>> {
let mut chars = input.char_indices().peekable();
let mut output = vec![];
while let Some((_, c)) = chars.peek() {
if c.is_whitespace() {
// If the next character is whitespace, skip it.

let _ = chars.next();
} else {
// Otherwise, try to consume an unclassified token.
let result = baseline(&mut chars);
output.push(result);
}
}
output
}
#[derive(Clone, Copy)]
enum BlockKind {
Paren,
CurlyBracket,
SquareBracket,
}
fn baseline(src: &mut Input) -> Spanned<String> {
let mut token_contents = String::new();
let start_offset = if let Some((pos, _)) = src.peek() {
*pos
} else {
0
};
// This variable tracks the starting character of a string literal, so that
// we remain inside the string literal lexer mode until we encounter the
// closing quote.
let mut quote_start: Option<char> = None;
// This Vec tracks paired delimiters
let mut block_level: Vec<BlockKind> = vec![];
// A baseline token is terminated if it's not nested inside of a paired
// delimiter and the next character is whitespace.
fn is_termination(block_level: &[BlockKind], c: char) -> bool {
block_level.is_empty() && (c.is_whitespace())
}
// The process of slurping up a baseline token repeats:
//
// - String literal, which begins with `'`, `"` or `\``, and continues until
// the same character is encountered again.
// - Delimiter pair, which begins with `[`, `(`, or `{`, and continues until
// the matching closing delimiter is found, skipping comments and string
// literals.
// - When not nested inside of a delimiter pair, when a terminating
//   character (whitespace) is encountered, the baseline token is done.
// - Otherwise, accumulate the character into the current baseline token.
while let Some((_, c)) = src.peek() {
let c = *c;
if quote_start.is_some() {
// If we encountered the closing quote character for the current
// string, we're done with the current string.
if Some(c) == quote_start {
quote_start = None;
}
} else if c == '\n' {
if is_termination(&block_level, c) {
break;
}
} else if c == '\'' || c == '"' || c == '`' {
// We encountered the opening quote of a string literal.
quote_start = Some(c);
} else if c == '[' {
// We encountered an opening `[` delimiter.
block_level.push(BlockKind::SquareBracket);
} else if c == ']' {
// We encountered a closing `]` delimiter. Pop off the opening `[`
// delimiter.
if let Some(BlockKind::SquareBracket) = block_level.last() {
let _ = block_level.pop();
}
} else if c == '{' {
// We encountered an opening `{` delimiter.
block_level.push(BlockKind::CurlyBracket);
} else if c == '}' {
// We encountered a closing `}` delimiter. Pop off the opening `{`.
if let Some(BlockKind::CurlyBracket) = block_level.last() {
let _ = block_level.pop();
}
} else if c == '(' {
// We encountered an opening `(` delimiter.
block_level.push(BlockKind::Paren);
} else if c == ')' {
// We encountered a closing `)` delimiter. Pop off the opening `(`.
if let Some(BlockKind::Paren) = block_level.last() {
let _ = block_level.pop();
}
} else if is_termination(&block_level, c) {
break;
}
// Otherwise, accumulate the character into the current token.
token_contents.push(c);
// Consume the character.
let _ = src.next();
}
let span = Span::new(start_offset, start_offset + token_contents.len());
// If there are still unclosed opening delimiters, close them and add
// synthetic closing characters to the accumulated token.
if block_level.last().is_some() {
// let delim: char = (*block).closing();
// let cause = ParseError::unexpected_eof(delim.to_string(), span);
// while let Some(bk) = block_level.pop() {
// token_contents.push(bk.closing());
// }
return token_contents.spanned(span);
}
if quote_start.is_some() {
// The non-lite parse trims quotes on both sides, so we add the expected quote so that
// anyone wanting to consume this partial parse (e.g., completions) will be able to get
// correct information from the non-lite parse.
// token_contents.push(delimiter);
// return (
// token_contents.spanned(span),
// Some(ParseError::unexpected_eof(delimiter.to_string(), span)),
// );
return token_contents.spanned(span);
}
token_contents.spanned(span)
}
#[cfg(test)]
mod tests {
use super::DetectColumns;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(DetectColumns {})
}
}
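
The column detection above hinges on slurping whitespace-delimited tokens while remembering their byte offsets, so that cells can later be matched against header spans. A simplified, self-contained sketch of that idea follows; it drops the quote and bracket tracking and uses plain tuples instead of `Spanned<String>`, and `detect_cells` is an illustrative name rather than the command's API.

// Simplified sketch of whitespace-based column detection (offset, token).
fn detect_cells(line: &str) -> Vec<(usize, String)> {
    let mut chars = line.char_indices().peekable();
    let mut cells = Vec::new();
    while let Some(&(start, c)) = chars.peek() {
        if c.is_whitespace() {
            // Skip the gap between columns.
            chars.next();
            continue;
        }
        // Accumulate a token until the next whitespace character.
        let mut token = String::new();
        while let Some(&(_, c)) = chars.peek() {
            if c.is_whitespace() {
                break;
            }
            token.push(c);
            chars.next();
        }
        cells.push((start, token));
    }
    cells
}

fn main() {
    let header = "PID   COMMAND      %CPU";
    let names: Vec<String> = detect_cells(header).into_iter().map(|(_, t)| t).collect();
    assert_eq!(names, vec!["PID", "COMMAND", "%CPU"]);
}

The real implementation keeps the span of each header so that, when a row has a different number of cells, cells can still be assigned to whichever header column their span overlaps.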

View File

@ -0,0 +1,3 @@
pub mod columns;
pub use columns::DetectColumns;

View File

@ -1,5 +1,6 @@
mod build_string;
mod char_;
mod detect;
mod format;
mod lines;
mod parse;
@ -10,6 +11,7 @@ mod str_;
pub use build_string::BuildString;
pub use char_::Char;
pub use detect::DetectColumns;
pub use format::*;
pub use lines::Lines;
pub use parse::*;

View File

@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_lower_camel_case};
use crate::prelude::*;
use inflector::cases::camelcase::to_camel_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};
@ -25,7 +24,7 @@ impl WholeStreamCommand for SubCommand {
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
operate(args, &to_camel_case)
operate(args, &to_lower_camel_case)
}
fn examples(&self) -> Vec<Example> {
@ -40,7 +39,7 @@ impl WholeStreamCommand for SubCommand {
#[cfg(test)]
mod tests {
use super::ShellError;
use super::{to_camel_case, SubCommand};
use super::{to_lower_camel_case, SubCommand};
use crate::commands::strings::str_::case::action;
use nu_source::Tag;
use nu_test_support::value::string;
@ -57,7 +56,7 @@ mod tests {
let word = string("this-is-the-first-case");
let expected = string("thisIsTheFirstCase");
let actual = action(&word, Tag::unknown(), &to_camel_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_lower_camel_case).unwrap();
assert_eq!(actual, expected);
}
#[test]
@ -65,7 +64,7 @@ mod tests {
let word = string("this_is_the_second_case");
let expected = string("thisIsTheSecondCase");
let actual = action(&word, Tag::unknown(), &to_camel_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_lower_camel_case).unwrap();
assert_eq!(actual, expected);
}
}

View File

@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_kebab_case};
use crate::prelude::*;
use inflector::cases::kebabcase::to_kebab_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};

View File

@ -16,6 +16,24 @@ pub use pascal_case::SubCommand as PascalCase;
pub use screaming_snake_case::SubCommand as ScreamingSnakeCase;
pub use snake_case::SubCommand as SnakeCase;
use heck::ToKebabCase;
use heck::ToLowerCamelCase;
use heck::ToShoutySnakeCase;
use heck::ToSnakeCase;
use heck::ToUpperCamelCase;
macro_rules! create_heck_function {
($func_name:ident) => {
pub fn $func_name(a_slice: &str) -> String {
a_slice.$func_name()
}
};
}
create_heck_function!(to_upper_camel_case);
create_heck_function!(to_lower_camel_case);
create_heck_function!(to_kebab_case);
create_heck_function!(to_shouty_snake_case);
create_heck_function!(to_snake_case);
struct Arguments {
column_paths: Vec<ColumnPath>,
}
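
Each `create_heck_function!(to_kebab_case)` invocation above expands to a free `pub fn to_kebab_case(&str) -> String` that simply calls the heck trait method of the same name. A quick sketch of what those shims delegate to, assuming the heck 0.4-style traits imported above; the expected outputs mirror the test cases elsewhere in this diff.

// Hedged sketch: heck 0.4-style case-conversion traits on &str.
use heck::{ToKebabCase, ToLowerCamelCase, ToShoutySnakeCase, ToSnakeCase, ToUpperCamelCase};

fn main() {
    let s = "this-is-the-first-case";
    assert_eq!(s.to_lower_camel_case(), "thisIsTheFirstCase");
    assert_eq!(s.to_upper_camel_case(), "ThisIsTheFirstCase");
    assert_eq!(s.to_shouty_snake_case(), "THIS_IS_THE_FIRST_CASE");
    assert_eq!(s.to_snake_case(), "this_is_the_first_case");
    assert_eq!("Hello World".to_kebab_case(), "hello-world");
}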

View File

@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_upper_camel_case};
use crate::prelude::*;
use inflector::cases::pascalcase::to_pascal_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};
@ -25,7 +24,7 @@ impl WholeStreamCommand for SubCommand {
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
operate(args, &to_pascal_case)
operate(args, &to_upper_camel_case)
}
fn examples(&self) -> Vec<Example> {
@ -40,7 +39,7 @@ impl WholeStreamCommand for SubCommand {
#[cfg(test)]
mod tests {
use super::ShellError;
use super::{to_pascal_case, SubCommand};
use super::{to_upper_camel_case, SubCommand};
use crate::commands::strings::str_::case::action;
use nu_source::Tag;
use nu_test_support::value::string;
@ -57,7 +56,7 @@ mod tests {
let word = string("this-is-the-first-case");
let expected = string("ThisIsTheFirstCase");
let actual = action(&word, Tag::unknown(), &to_pascal_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_upper_camel_case).unwrap();
assert_eq!(actual, expected);
}
#[test]
@ -65,7 +64,7 @@ mod tests {
let word = string("this_is_the_second_case");
let expected = string("ThisIsTheSecondCase");
let actual = action(&word, Tag::unknown(), &to_pascal_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_upper_camel_case).unwrap();
assert_eq!(actual, expected);
}
}

View File

@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_shouty_snake_case};
use crate::prelude::*;
use inflector::cases::screamingsnakecase::to_screaming_snake_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};
@ -25,7 +24,7 @@ impl WholeStreamCommand for SubCommand {
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
operate(args, &to_screaming_snake_case)
operate(args, &to_shouty_snake_case)
}
fn examples(&self) -> Vec<Example> {
@ -40,7 +39,7 @@ impl WholeStreamCommand for SubCommand {
#[cfg(test)]
mod tests {
use super::ShellError;
use super::{to_screaming_snake_case, SubCommand};
use super::{to_shouty_snake_case, SubCommand};
use crate::commands::strings::str_::case::action;
use nu_source::Tag;
use nu_test_support::value::string;
@ -57,7 +56,7 @@ mod tests {
let word = string("this-is-the-first-case");
let expected = string("THIS_IS_THE_FIRST_CASE");
let actual = action(&word, Tag::unknown(), &to_screaming_snake_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_shouty_snake_case).unwrap();
assert_eq!(actual, expected);
}
#[test]
@ -65,7 +64,7 @@ mod tests {
let word = string("this_is_the_second_case");
let expected = string("THIS_IS_THE_SECOND_CASE");
let actual = action(&word, Tag::unknown(), &to_screaming_snake_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_shouty_snake_case).unwrap();
assert_eq!(actual, expected);
}
}

View File

@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_snake_case};
use crate::prelude::*;
use inflector::cases::snakecase::to_snake_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};

View File

@ -144,7 +144,7 @@ fn trim(s: &str, char_: Option<char>, closure_flags: &ClosureFlags) -> String {
let re_str = format!("{}{{2,}}", reg);
// create the regex
let re = regex::Regex::new(&re_str).expect("Error creating regular expression");
// replace all mutliple occurances with single occurences represented by r
// replace all multiple occurrences with single occurrences represented by r
let new_str = re.replace_all(&return_string, r.to_string());
// update the return string so the next loop has the latest changes
return_string = new_str.to_string();

View File

@ -1,7 +1,7 @@
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, TaggedDictBuilder, UntaggedValue};
use sysinfo::{ProcessExt, System, SystemExt};
use sysinfo::{PidExt, ProcessExt, System, SystemExt};
pub struct Command;
@ -50,7 +50,7 @@ fn run_ps(args: CommandArgs) -> Result<OutputStream, ShellError> {
for pid in result {
if let Some(result) = sys.process(pid) {
let mut dict = TaggedDictBuilder::new(args.name_tag());
dict.insert_untagged("pid", UntaggedValue::int(pid as i64));
dict.insert_untagged("pid", UntaggedValue::int(pid.as_u32() as i64));
dict.insert_untagged("name", UntaggedValue::string(result.name()));
dict.insert_untagged(
"status",
@ -68,7 +68,7 @@ fn run_ps(args: CommandArgs) -> Result<OutputStream, ShellError> {
if long {
if let Some(parent) = result.parent() {
dict.insert_untagged("parent", UntaggedValue::int(parent as i64));
dict.insert_untagged("parent", UntaggedValue::int(parent.as_u32() as i64));
} else {
dict.insert_untagged("parent", UntaggedValue::nothing());
}
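
The change above reflects `Pid` becoming an opaque type, converted via `PidExt::as_u32` before being stored as an integer. A minimal sketch of that usage pattern follows; the construction and iteration calls are assumptions about the sysinfo 0.2x interface rather than code from this diff.

// Hedged sketch: assumes a sysinfo 0.2x-style API where Pid needs PidExt.
use sysinfo::{PidExt, ProcessExt, System, SystemExt};

fn main() {
    let mut sys = System::new_all();
    sys.refresh_all();
    for (pid, process) in sys.processes() {
        // Pid is no longer a plain integer; convert explicitly for display.
        println!("{} {}", pid.as_u32(), process.name());
    }
}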

View File

@ -11,13 +11,6 @@ use std::collections::HashMap;
use std::sync::atomic::Ordering;
use std::time::Instant;
#[cfg(feature = "table-pager")]
use {
futures::future::join,
minus::{ExitStrategy, Pager},
std::fmt::Write,
};
const STREAM_PAGE_SIZE: usize = 1000;
const STREAM_TIMEOUT_CHECK_INTERVAL: usize = 100;
@ -186,28 +179,9 @@ fn table(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let term_width = args.host().lock().width();
#[cfg(feature = "table-pager")]
let pager = Pager::new()
.set_exit_strategy(ExitStrategy::PagerQuit)
.set_searchable(true)
.set_page_if_havent_overflowed(false)
.set_input_handler(Box::new(input_handling::MinusInputHandler {}))
.finish();
let stream_data = async {
let finished = Arc::new(AtomicBool::new(false));
// we are required to clone finished, for use within the callback, otherwise we get borrow errors
#[cfg(feature = "table-pager")]
let finished_within_callback = finished.clone();
#[cfg(feature = "table-pager")]
{
// This is called when the pager finishes, to indicate to the
// while loop below to finish, in case of long running InputStream consumer
// that doesn't finish by the time the user quits out of the pager
pager.lock().await.add_exit_callback(move || {
finished_within_callback.store(true, Ordering::Relaxed);
});
}
while !finished.clone().load(Ordering::Relaxed) {
let mut new_input: VecDeque<Value> = VecDeque::new();
@ -263,161 +237,22 @@ fn table(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
if !input.is_empty() {
let t = from_list(&input, &configuration, start_number, &color_hm);
let output = draw_table(&t, term_width, &color_hm);
#[cfg(feature = "table-pager")]
{
let mut pager = pager.lock().await;
writeln!(pager.lines, "{}", output).map_err(|_| {
ShellError::untagged_runtime_error("Error writing to pager")
})?;
}
#[cfg(not(feature = "table-pager"))]
println!("{}", output);
}
start_number += input.len();
}
#[cfg(feature = "table-pager")]
{
let mut pager_lock = pager.lock().await;
pager_lock.data_finished();
}
Result::<_, ShellError>::Ok(())
};
#[cfg(feature = "table-pager")]
{
let (minus_result, streaming_result) =
block_on(join(minus::async_std_updating(pager.clone()), stream_data));
minus_result.map_err(|_| ShellError::untagged_runtime_error("Error paging data"))?;
streaming_result?;
}
#[cfg(not(feature = "table-pager"))]
block_on(stream_data)
.map_err(|_| ShellError::untagged_runtime_error("Error streaming data"))?;
Ok(OutputStream::empty())
}
#[cfg(feature = "table-pager")]
mod input_handling {
use crossterm::event::{Event, KeyCode, KeyEvent, KeyModifiers, MouseEvent, MouseEventKind};
use minus::{InputEvent, InputHandler, LineNumbers, SearchMode};
pub struct MinusInputHandler;
impl InputHandler for MinusInputHandler {
fn handle_input(
&self,
ev: Event,
upper_mark: usize,
search_mode: SearchMode,
ln: LineNumbers,
rows: usize,
) -> Option<InputEvent> {
match ev {
// Scroll up by one.
Event::Key(KeyEvent {
code: KeyCode::Up,
modifiers: KeyModifiers::NONE,
}) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_sub(1))),
// Scroll down by one.
Event::Key(KeyEvent {
code: KeyCode::Down,
modifiers: KeyModifiers::NONE,
}) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_add(1))),
// Mouse scroll up/down
Event::Mouse(MouseEvent {
kind: MouseEventKind::ScrollUp,
..
}) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_sub(5))),
Event::Mouse(MouseEvent {
kind: MouseEventKind::ScrollDown,
..
}) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_add(5))),
// Go to top.
Event::Key(KeyEvent {
code: KeyCode::Home,
modifiers: KeyModifiers::NONE,
}) => Some(InputEvent::UpdateUpperMark(0)),
// Go to bottom.
Event::Key(KeyEvent {
code: KeyCode::End,
modifiers: KeyModifiers::NONE,
}) => Some(InputEvent::UpdateUpperMark(usize::MAX)),
// Page Up/Down
Event::Key(KeyEvent {
code: KeyCode::PageUp,
modifiers: KeyModifiers::NONE,
}) => Some(InputEvent::UpdateUpperMark(
upper_mark.saturating_sub(rows - 1),
)),
Event::Key(KeyEvent {
code: KeyCode::PageDown,
modifiers: KeyModifiers::NONE,
}) => Some(InputEvent::UpdateUpperMark(
upper_mark.saturating_add(rows - 1),
)),
// Resize event from the terminal.
Event::Resize(_, height) => Some(InputEvent::UpdateRows(height as usize)),
// Switch line number display.
Event::Key(KeyEvent {
code: KeyCode::Char('l'),
modifiers: KeyModifiers::CONTROL,
}) => Some(InputEvent::UpdateLineNumber(!ln)),
// Quit.
Event::Key(KeyEvent {
code: KeyCode::Char('q'),
modifiers: KeyModifiers::NONE,
})
| Event::Key(KeyEvent {
code: KeyCode::Char('Q'),
modifiers: KeyModifiers::SHIFT,
})
| Event::Key(KeyEvent {
code: KeyCode::Esc,
modifiers: KeyModifiers::NONE,
})
| Event::Key(KeyEvent {
code: KeyCode::Char('c'),
modifiers: KeyModifiers::CONTROL,
}) => Some(InputEvent::Exit),
Event::Key(KeyEvent {
code: KeyCode::Char('/'),
modifiers: KeyModifiers::NONE,
}) => Some(InputEvent::Search(SearchMode::Unknown)),
Event::Key(KeyEvent {
code: KeyCode::Down,
modifiers: KeyModifiers::CONTROL,
}) => {
if search_mode == SearchMode::Unknown {
Some(InputEvent::NextMatch)
} else {
None
}
}
Event::Key(KeyEvent {
code: KeyCode::Up,
modifiers: KeyModifiers::CONTROL,
}) => {
if search_mode == SearchMode::Unknown {
Some(InputEvent::PrevMatch)
} else {
None
}
}
_ => None,
}
}
}
}
#[cfg(test)]
mod tests {
use super::Command;

View File

@ -76,17 +76,14 @@ impl ConfigExtensions for NuConfig {
fn header_style(&self) -> TextStyle {
// FIXME: I agree, this is the long way around, please suggest an alternative.
let head_color = get_color_from_key_and_subkey(self, "color_config", "header_color");
let head_color_style = match head_color {
Some(s) => {
lookup_ansi_color_style(s.as_string().unwrap_or_else(|_| "green".to_string()))
}
None => nu_ansi_term::Color::Green.normal(),
};
let head_bold = get_color_from_key_and_subkey(self, "color_config", "header_bold");
let head_bold_bool = match head_bold {
Some(b) => header_bold_from_value(Some(&b)),
None => true,
let (head_color_style, head_bold_bool) = match head_color {
Some(s) => (
lookup_ansi_color_style(s.as_string().unwrap_or_else(|_| "green".to_string())),
header_bold_from_value(Some(&s)),
),
None => (nu_ansi_term::Color::Green.normal(), true),
};
let head_align = get_color_from_key_and_subkey(self, "color_config", "header_align");
let head_alignment = match head_align {
Some(a) => header_alignment_from_value(Some(&a)),

View File

@ -77,6 +77,7 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
// Shells
whole_stream_command(Next),
whole_stream_command(Previous),
whole_stream_command(Goto),
whole_stream_command(Shells),
whole_stream_command(Enter),
whole_stream_command(Exit),
@ -126,6 +127,7 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
whole_stream_command(AnsiStrip),
whole_stream_command(AnsiGradient),
whole_stream_command(Char),
whole_stream_command(DetectColumns),
// Column manipulation
whole_stream_command(DropColumn),
whole_stream_command(MoveColumn),
@ -133,9 +135,11 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
whole_stream_command(Select),
whole_stream_command(Get),
whole_stream_command(Update),
whole_stream_command(UpdateCells),
whole_stream_command(Insert),
whole_stream_command(Into),
whole_stream_command(IntoBinary),
whole_stream_command(IntoColumnPath),
whole_stream_command(IntoInt),
whole_stream_command(IntoFilepath),
whole_stream_command(IntoFilesize),
@ -362,14 +366,6 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
whole_stream_command(DataFrameCumulative),
whole_stream_command(DataFrameRename),
]);
#[cfg(feature = "clipboard-cli")]
{
context.add_commands(vec![
whole_stream_command(crate::commands::Clip),
whole_stream_command(crate::commands::Paste),
]);
}
}
Ok(context)

View File

@ -21,8 +21,8 @@ use crate::commands::{
};
use crate::commands::{
Append, BuildString, Collect, Each, Echo, First, Get, Keep, Last, Let, Math, MathMode, Nth,
Select, StrCollect, Wrap,
Append, BuildString, Collect, Each, Echo, First, Get, If, IntoInt, Keep, Last, Let, Math,
MathMode, Nth, Select, StrCollect, Wrap,
};
use nu_engine::{run_block, whole_stream_command, Command, EvaluationContext, WholeStreamCommand};
use nu_stream::InputStream;
@ -41,6 +41,8 @@ pub fn test_examples(cmd: Command) -> Result<(), ShellError> {
whole_stream_command(BuildString {}),
whole_stream_command(First {}),
whole_stream_command(Get {}),
whole_stream_command(If {}),
whole_stream_command(IntoInt {}),
whole_stream_command(Keep {}),
whole_stream_command(Each {}),
whole_stream_command(Last {}),
@ -253,6 +255,8 @@ pub fn test_anchors(cmd: Command) -> Result<(), ShellError> {
whole_stream_command(BuildString {}),
whole_stream_command(First {}),
whole_stream_command(Get {}),
whole_stream_command(If {}),
whole_stream_command(IntoInt {}),
whole_stream_command(Keep {}),
whole_stream_command(Each {}),
whole_stream_command(Last {}),

View File

@ -8,7 +8,7 @@ fn returns_path_joined_with_column_path() {
cwd: "tests", pipeline(
r#"
echo [ [name]; [eggs] ]
| path join -a spam.txt name
| path join spam.txt -c [ name ]
| get name
"#
));
@ -23,7 +23,7 @@ fn returns_path_joined_from_list() {
cwd: "tests", pipeline(
r#"
echo [ home viking spam.txt ]
| path join
| path join
"#
));
@ -37,7 +37,7 @@ fn appends_slash_when_joined_with_empty_path() {
cwd: "tests", pipeline(
r#"
echo "/some/dir"
| path join -a ''
| path join ''
"#
));
@ -51,7 +51,7 @@ fn returns_joined_path_when_joining_empty_path() {
cwd: "tests", pipeline(
r#"
echo ""
| path join -a foo.txt
| path join foo.txt
"#
));

View File

@ -48,7 +48,7 @@ fn parses_custom_extension_gets_extension() {
let actual = nu!(
cwd: "tests", pipeline(
r#"
echo 'home/viking/spam.tar.gz'
echo 'home/viking/spam.tar.gz'
| path parse -e tar.gz
| get extension
"#
@ -62,7 +62,7 @@ fn parses_custom_extension_gets_stem() {
let actual = nu!(
cwd: "tests", pipeline(
r#"
echo 'home/viking/spam.tar.gz'
echo 'home/viking/spam.tar.gz'
| path parse -e tar.gz
| get stem
"#
@ -76,7 +76,7 @@ fn parses_ignoring_extension_gets_extension() {
let actual = nu!(
cwd: "tests", pipeline(
r#"
echo 'home/viking/spam.tar.gz'
echo 'home/viking/spam.tar.gz'
| path parse -e ''
| get extension
"#
@ -90,7 +90,7 @@ fn parses_ignoring_extension_gets_stem() {
let actual = nu!(
cwd: "tests", pipeline(
r#"
echo 'home/viking/spam.tar.gz'
echo 'home/viking/spam.tar.gz'
| path parse -e ""
| get stem
"#
@ -105,7 +105,7 @@ fn parses_column_path_extension() {
cwd: "tests", pipeline(
r#"
echo [[home, barn]; ['home/viking/spam.txt', 'barn/cow/moo.png']]
| path parse home barn
| path parse -c [ home barn ]
| get barn
| get extension
"#

View File

@ -18,7 +18,7 @@ fn splits_correctly_single_path() {
cwd: "tests", pipeline(
r#"
echo ['home/viking/spam.txt']
| path split
| path split
| last
"#
));
@ -37,7 +37,7 @@ fn splits_correctly_with_column_path() {
['home/viking/spam.txt', 'barn/cow/moo.png']
['home/viking/eggs.txt', 'barn/goat/cheese.png']
]
| path split home barn
| path split -c [ home barn ]
| get barn
| length
"#

View File

@ -306,3 +306,21 @@ fn rm_wildcard_leading_dot_deletes_dotfiles() {
assert!(!files_exist_at(vec![".bar"], dirs.test()));
})
}
#[test]
fn removes_files_with_case_sensitive_glob_matches_by_default() {
Playground::setup("glob_test", |dirs, sandbox| {
sandbox.with_files(vec![EmptyFile("A0"), EmptyFile("a1")]);
nu!(
cwd: dirs.root(),
"rm glob_test/A*"
);
let deleted_path = dirs.test().join("A0");
let skipped_path = dirs.test().join("a1");
assert!(!deleted_path.exists());
assert!(skipped_path.exists());
})
}

View File

@ -47,3 +47,21 @@ fn writes_out_csv() {
assert!(actual.contains("nu,0.14,A new type of shell,MIT,2018"));
})
}
#[test]
fn save_append_will_create_file_if_not_exists() {
Playground::setup("save_test_3", |dirs, sandbox| {
sandbox.with_files(vec![]);
let expected_file = dirs.test().join("new-file.txt");
nu!(
cwd: dirs.root(),
r#"echo hello | save --raw --append save_test_3/new-file.txt"#,
);
let actual = file_contents(expected_file);
println!("{}", actual);
assert!(actual == "hello");
})
}

View File

@ -4,22 +4,19 @@ description = "Completions for nushell"
edition = "2018"
license = "MIT"
name = "nu-completion"
version = "0.37.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
nu-engine = { version = "0.37.0", path="../nu-engine" }
nu-data = { version = "0.37.0", path="../nu-data" }
nu-errors = { version = "0.37.0", path="../nu-errors" }
nu-parser = { version = "0.37.0", path="../nu-parser" }
nu-path = { version = "0.37.0", path="../nu-path" }
nu-protocol = { version = "0.37.0", path="../nu-protocol" }
nu-source = { version = "0.37.0", path="../nu-source" }
nu-test-support = { version = "0.37.0", path="../nu-test-support" }
dirs-next = "2.0.0"
nu-engine = { version = "0.43.0", path="../nu-engine" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-parser = { version = "0.43.0", path="../nu-parser" }
nu-path = { version = "0.43.0", path="../nu-path" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
indexmap = { version="1.6.1", features=["serde-1"] }
[target.'cfg(not(target_arch = "wasm32"))'.dependencies]

View File

@ -238,11 +238,19 @@ pub fn completion_location(line: &str, block: &Block, pos: usize) -> Vec<Complet
}
}
output.push(loc.clone());
output.push({
let mut partial_loc = loc.clone();
partial_loc.span = Span::new(loc.span.start(), pos);
partial_loc
});
output
}
}
_ => vec![loc.clone()],
_ => vec![{
let mut partial_loc = loc.clone();
partial_loc.span = Span::new(loc.span.start(), pos);
partial_loc
}],
};
} else if pos < loc.span.start() {
break;
@ -339,7 +347,7 @@ mod tests {
line: &str,
scope: &dyn ParserScope,
pos: usize,
) -> Vec<LocationType> {
) -> Vec<CompletionLocation> {
let (tokens, _) = lex(line, 0, nu_parser::NewlineMode::Normal);
let (lite_block, _) = parse_block(tokens);
@ -348,9 +356,6 @@ mod tests {
scope.exit_scope();
super::completion_location(line, &block, pos)
.into_iter()
.map(|v| v.item)
.collect()
}
#[test]
@ -362,7 +367,7 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 10),
vec![LocationType::Command],
vec![LocationType::Command.spanned(Span::new(9, 10)),],
);
}
@ -373,7 +378,7 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 10),
vec![LocationType::Command],
vec![LocationType::Command.spanned(Span::new(9, 10)),],
);
}
@ -384,7 +389,7 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 4),
vec![LocationType::Command],
vec![LocationType::Command.spanned(Span::new(0, 4)),],
);
}
@ -395,7 +400,7 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 13),
vec![LocationType::Variable],
vec![LocationType::Variable.spanned(Span::new(5, 13)),],
);
}
@ -410,7 +415,7 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 7),
vec![LocationType::Flag("du".to_string())],
vec![LocationType::Flag("du".to_string()).spanned(Span::new(3, 7)),],
);
}
@ -421,7 +426,7 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 8),
vec![LocationType::Command],
vec![LocationType::Command.spanned(Span::new(6, 8)),],
);
}
@ -433,8 +438,8 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 3),
vec![
LocationType::Command,
LocationType::Argument(Some("cd".to_string()), None)
LocationType::Command.spanned(Span::new(0, 3)),
LocationType::Argument(Some("cd".to_string()), None).spanned(Span::new(3, 3)),
],
);
}
@ -451,8 +456,8 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 3),
vec![
LocationType::Argument(Some("du".to_string()), None),
LocationType::Flag("du".to_string()),
LocationType::Argument(Some("du".to_string()), None).spanned(Span::new(3, 4)),
LocationType::Flag("du".to_string()).spanned(Span::new(3, 4)),
],
);
}
@ -467,8 +472,24 @@ mod tests {
assert_eq!(
completion_location(line, &registry, 6),
vec![
LocationType::Command,
LocationType::Argument(Some("echo".to_string()), None)
LocationType::Command.spanned(Span::new(0, 6)),
LocationType::Argument(Some("echo".to_string()), None).spanned(Span::new(5, 6)),
],
);
}
#[test]
fn completes_argument_when_cursor_inside_argument() {
let registry: VecRegistry =
vec![Signature::build("echo").rest("rest", SyntaxShape::Any, "the values to echo")]
.into();
let line = "echo 123";
assert_eq!(
completion_location(line, &registry, 6),
vec![
LocationType::Command.spanned(Span::new(0, 6)),
LocationType::Argument(Some("echo".to_string()), None).spanned(Span::new(5, 6)),
],
);
}

Some files were not shown because too many files have changed in this diff.