Compare commits

...

68 Commits

Author SHA1 Message Date
JT
610e3911f6 Bump to 0.41 (#4187) 2021-12-08 06:21:00 +13:00
ee9eddd851 avoid unnecessary allocation (#4178) 2021-12-06 07:38:58 +13:00
JT
c08e145501 Fix clippy warnings (#4176) 2021-12-03 07:05:38 +13:00
c00853a473 Seems like accessing $it outside each is not possible now (#4000) 2021-12-03 06:49:24 +13:00
79c7b20cfd add login shell flag (#4175) 2021-12-02 20:05:04 +13:00
JT
89cbfd758d Remove 'arboard' (#4174) 2021-12-02 08:48:03 +13:00
e6e6b730f3 Bye bye upx sorry (#4173)
* bye bye upx, let's try stripping alone

* remove all stripping - not sure it's even working
2021-11-30 13:34:16 -06:00
0fe6a7c1b5 bye bye upx, let's try stripping alone (#4172) 2021-11-30 12:11:01 -06:00
1794ad51bd Sanitize arguments to external commands a bit better (#4157)
* fix #4140

We are passing commands into a shell underneath but we were not
escaping arguments correctly. This new version of the code also takes
into consideration the ";" and "&" characters, which have special
meaning in shells.

We would probably benefit from a more robust way to join arguments to
shell programs. Python's stdlib has shlex.join, and perhaps we can
take that implementation as a reference.

* clean up escaping of posix shell args

I believe the right place to do escaping of arguments was in the
spawn_sh_command function. Note that this change prevents things like:

^echo "$(ls)"

from executing the ls command. Instead, this will just print

$(ls)

The regex has been taken from the python stdlib implementation of shlex.quote

* fix non-literal parameters and single quotes

* address clippy's comments

* fixup! address clippy's comments

* test that subshell commands are sanitized properly
2021-11-29 09:46:42 -06:00
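The quoting approach this commit describes can be sketched as follows (an illustrative Rust reconstruction, not the actual nushell source; `quote_posix` is an invented name, and the safe-character set follows Python's `shlex.quote`):

```rust
// Quote a single argument for a POSIX shell the way Python's shlex.quote
// does: wrap it in single quotes and escape embedded single quotes, so
// that `$(ls)`, `;`, and `&` lose their special meaning.
fn quote_posix(arg: &str) -> String {
    let is_safe = !arg.is_empty()
        && arg
            .chars()
            .all(|c| c.is_ascii_alphanumeric() || "@%+=:,./-_".contains(c));
    if is_safe {
        return arg.to_string(); // safe characters need no quoting
    }
    // 'foo'"'"'bar' pattern: close the quote, emit a literal ', reopen
    format!("'{}'", arg.replace('\'', "'\"'\"'"))
}
```

With this, `^echo "$(ls)"` would pass the literal text `$(ls)` to the shell rather than executing `ls`.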
fb197f562a save --append: create file if it doesn't exist (#4156)
* have save --append create file if not exists

Currently, doing:

echo a | save --raw --append file.txt

will fail if file.txt does not exist. This PR changes that

* test that `save --append` will create new file
2021-11-26 12:27:41 -06:00
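A minimal sketch of the behavior this PR describes, using the standard library's `OpenOptions` (an assumed shape, not the actual nushell code):

```rust
use std::fs::OpenOptions;
use std::io::Write;

// Opening with both `append` and `create` makes an append-style save
// create the file when it does not exist yet, instead of failing.
fn append_to_file(path: &str, data: &[u8]) -> std::io::Result<()> {
    let mut file = OpenOptions::new()
        .append(true)
        .create(true) // create the file if missing, keep contents if present
        .open(path)?;
    file.write_all(data)
}
```

The first call creates the file; subsequent calls extend it, which is exactly the `echo a | save --raw --append file.txt` scenario above.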
91c270c14a fix markup (#4155) 2021-11-26 07:37:50 -06:00
3e93ae8af4 Correct spelling (#4152) 2021-11-25 11:11:20 -06:00
e06df124ca upgrading dependencies (#4135)
* upgrade dependencies
num-bigint 0.3.1 -> 0.4.3
bigdecimal-rs 0.2.1 -> bigdecimal 0.3.0
s3handler 0.7 -> 0.7.5
bat 0.18 -> 0.18, default-features = false

* upgrade arboard 1.1.0 -> 2.0.1

* in polars use comfy-table instead of prettytable-rs
the last release of prettytable-rs was `0.8.0 Sep 27, 2018`
and it uses `term 0.5` as a dependency

* upgrade dependencies

* upgrade trash -> 2.0.1

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-20 07:11:11 -06:00
JT
2590fcbe5c Bump to 0.40 (#4129) 2021-11-16 21:53:03 +13:00
JT
09691ff866 Delete docker-publish.yml 2021-11-16 14:19:35 +13:00
16db368232 upgrade polars to 0.17 (#4122) 2021-11-16 12:01:02 +13:00
JT
df87d90b8c Add 'detect columns' command (#4127)
* Add 'detect columns' command

* Fix warnings
2021-11-16 11:29:54 +13:00
f2f01b8a4d missed from_mp4, added back (#4128) 2021-11-15 16:19:44 -06:00
6c0190cd38 added upx and strip to mac and windows (#4126) 2021-11-15 15:32:48 -06:00
b26246bf12 trying upx and strip (#4125) 2021-11-15 15:01:25 -06:00
36a4effbb2 tweaked strip ci (#4124) 2021-11-15 14:30:32 -06:00
9fca417f8c update release to allow running manually (#4123) 2021-11-15 14:04:00 -06:00
d09e1148b2 add the ability to strip the debug symbols for smaller binaries on mac and linux 2021-11-15 13:47:46 -06:00
493bc2b1c9 Update README (#4118)
`winget install nu` fails because there are other options for "nu" now.
Using the full `nushell` word solved it for me.

[Imgur](https://imgur.com/aqz2qNp)
2021-11-14 19:34:57 +13:00
74b812228c upgrade dependencies (#4116)
* remove unused dependencies

* upgrade dependency bytes 0.5.6 -> 1.1.0

* upgrade dependency heapless 0.6.1 -> 0.7.8

* upgrade dependency image 0.22.4 -> 0.23.14

* upgrade dependency mp4 0.8.2 -> 0.9.0

* upgrade dependency bson 0.14.1 -> 2.0.1

Bson::Undefined, Bson::MaxKey, Bson::MinKey and Bson::DbPointer
weren't present in the previous version.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-14 19:32:21 +13:00
649b3804c1 fix: panic! during parsing (#4107)
Typing `selector -qa` into nu would cause a `panic!`
This was the case because the inner loop incremented the `idx`
that was only checked in the outer loop and used it to index into
`lite_cmd.parts[idx]`
With the fix we now break loop.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-05 21:46:46 +13:00
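The bug class this fix describes can be shown in miniature (a hypothetical, simplified reconstruction; `collect_flags` and its parsing rules are invented for illustration):

```rust
// An inner loop advances `idx`, but only the outer loop guards it, so
// `parts[idx]` can go out of bounds and panic. The fix is to re-check
// the bound inside the inner loop and break out.
fn collect_flags(parts: &[&str]) -> Vec<String> {
    let mut flags = Vec::new();
    let mut idx = 0;
    while idx < parts.len() {
        if parts[idx].starts_with('-') {
            // inner loop consuming a run of flags
            while parts[idx].starts_with('-') {
                flags.push(parts[idx].to_string());
                idx += 1;
                if idx >= parts.len() {
                    break; // without this, parts[idx] above panics
                }
            }
        } else {
            idx += 1;
        }
    }
    flags
}
```

With input like `["selector", "-qa"]` the trailing flag is the last element, which is precisely the case that used to index past the end.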
JT
df6a53f52e Update stale.yml (#4106) 2021-11-04 21:25:44 +13:00
JT
c4af5df828 Update stale.yml (#4102) 2021-10-31 16:48:58 +13:00
f94a3e15f5 Get rid of header bold option (#4076)
* refactor(options): get rid of 'header_bold' option

* docs(config): remove 'header_bold' from docs

* fix(options): replicate logic to apply true/false in bold

* style(options): apply lint fixes
2021-10-31 06:59:19 +13:00
75782f0f50 Fix #4070: Inconsistent file matching rule for ls and rm (#4099) 2021-10-28 15:05:07 +03:00
JT
2b06ce27d3 Bump to 0.39 (#4097) 2021-10-27 08:36:41 +13:00
72c241348b Remove dependencies (#4087)
* fix regression

* Removed the nipper dependency

* fix linting

* fix clippy
2021-10-22 06:58:40 +13:00
JT
ab2d2db987 Fix clippy warnings (#4088)
* Fix clippy warnings

* Fix clippy warnings
2021-10-22 06:57:51 +13:00
07e05ef183 fix regression (#4086) 2021-10-19 13:39:23 -05:00
a986de8ad0 Update stale.yml (#4073)
add labels that can exempt from stale bot
2021-10-09 14:50:27 -05:00
22cfe4391e remove history file after clearing it (#4069) 2021-10-07 10:09:31 -05:00
JT
97d17311f4 Update LICENSE (#4067) 2021-10-07 08:42:07 +13:00
0f6fd30619 stale.yml: mention time to close in stale message (#4066) 2021-10-06 09:05:29 -05:00
JT
e1ebd461d2 Bump to 0.38 (#4064) 2021-10-06 06:35:25 +13:00
JT
f000d5d0a1 Remove the broken scrolling support (#4063)
* Remove the broken scrolling support

* Remove the broken scrolling support
2021-10-06 05:57:14 +13:00
574c5961c8 Add -c flag to select command (#4062)
See cc3653cfd9 for more on the `-c` flag.

Co-authored-by: Andrés N. Robalino <andres@androbtech.com>

Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2021-10-05 13:23:37 +13:00
JT
69708f7244 update wasm deps (#4061) 2021-10-03 07:19:54 +13:00
62c5df5fc6 expand tilde when reading plugin_dirs (#4052) 2021-10-02 21:38:21 +13:00
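A std-only sketch of what tilde expansion here means (illustrative; the real implementation lives in the nu-path crate and handles more cases, and `expand_tilde` plus the reliance on `HOME` are assumptions):

```rust
use std::path::PathBuf;

// Replace a leading `~` or `~/` with the user's home directory before
// the configured path is used.
fn expand_tilde(path: &str) -> PathBuf {
    if let Some(rest) = path.strip_prefix('~') {
        if rest.is_empty() || rest.starts_with('/') {
            // assumes HOME is set, as on typical Unix systems
            if let Ok(home) = std::env::var("HOME") {
                return PathBuf::from(format!("{}{}", home, rest));
            }
        }
    }
    PathBuf::from(path) // absolute paths and `~user` forms pass through
}
```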
92c855a412 Fixed two typos in the tutor. (#4051) 2021-10-02 21:37:59 +13:00
d395816929 remove ansi colors if this is not a tty (#4058) 2021-10-01 09:00:08 -05:00
5e34ef6dff new command: into column_path (#4048) 2021-09-29 07:23:34 -05:00
d567c58cc1 Add -c flag to update cells subcommand (#4039)
* Add `-c` flag to `update cells` subcommand

* Fix lints
2021-09-27 21:18:50 -05:00
4e0d7bc77c Less deps (#4038)
* compiles on nightly now. (breaking change)

* less deps

* Switch over to new resolver

(it's been stable for a while.)

* let's leave num-format for another PR
2021-09-28 07:17:00 +13:00
32581497ef Fix 90 degrees tables problem (#4043)
* fix 90 degrees tables problem

* linting

* clippy

* linting
2021-09-25 14:05:45 -05:00
d6df367c6b Corrected typo (#4040)
It is not BSON but SQLite
2021-09-25 04:25:00 -05:00
4e6327de1d Added BigInt handling to the delimited file format for the 'to' command (#4034)
Co-authored-by: patrick <patrick@spol42069.hitronhub.home>
2021-09-25 09:47:16 +12:00
b3d8666db0 compiles on nightly now. (breaking change) (#4037) 2021-09-25 09:46:48 +12:00
1de7c3d033 Scraping multiple tables (#4036)
* Output error when ls into a file without permission

* math sqrt

* added test to check fails when ls into prohibited dir

* fix lint

* math sqrt with tests and doc

* trigger wasm build

* Update filesystem_shell.rs

* Fix Running echo .. starts printing integers forever

* Allow for multiple table scraping

* linting

* Fix clippy

* linting

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2021-09-24 08:08:13 -05:00
962b258cc6 merge span (#4031) 2021-09-23 07:48:05 +12:00
59697cab63 force rebuild of dev container (#4033) 2021-09-23 07:47:28 +12:00
349af05da8 Do not throw error for files not found in lib_dirs (#4029) 2021-09-20 13:44:47 -05:00
JT
b3b3cf0689 Remove the docker instructions
Docker has been out of date for a long time, go ahead and remove.
2021-09-20 19:33:49 +12:00
5d59234f8d Flexibility updating table's cells. (#4027)
Very often we need to work with tables (say extracted from unstructured data or some
kind of final report, timeseries, and the like).

Inevitably, we will have columns whose names, or even how many there are, we can't know beforehand.

Also, we may end up with certain cells having values we may want to remove as we explore.

Here, `update cells` fundamentally goes over every cell in the table coming in and updates
the cell's contents with the output of the block passed. Basic example here:

```
> [

    [   ty1,       t2,       ty];

    [     1,        a, $nothing]
    [(wrap), (0..<10),      1Mb]
    [    1s,     ({}),  1000000]
    [ $true,   $false,   ([[]])]

] | update cells { describe }

───┬───────────────────────┬───────────────────────────┬──────────
 # │          ty1          │            t2             │    ty
───┼───────────────────────┼───────────────────────────┼──────────
 0 │ integer               │ string                    │ nothing
 1 │ row Column(table of ) │ range[[integer, integer)] │ filesize
 2 │ string                │ nothing                   │ integer
 3 │ boolean               │ boolean                   │ table of
───┴───────────────────────┴───────────────────────────┴──────────
```

and another one (in the examples) for cases, say we have a timeseries table generated and
we want to remove the zeros and have empty strings and save it out to something like CSV.

```
> [
    [2021-04-16, 2021-06-10, 2021-09-18, 2021-10-15, 2021-11-16, 2021-11-17, 2021-11-18];
    [        37,          0,          0,          0,         37,          0,          0]
] | update cells {|value|
  if ($value | into int) == 0 {
    ""
  } {
    $value
  }
}

───┬────────────┬────────────┬────────────┬────────────┬────────────┬────────────┬────────────
 # │ 2021-04-16 │ 2021-06-10 │ 2021-09-18 │ 2021-10-15 │ 2021-11-16 │ 2021-11-17 │ 2021-11-18
───┼────────────┼────────────┼────────────┼────────────┼────────────┼────────────┼────────────
 0 │         37 │            │            │            │         37 │            │
───┴────────────┴────────────┴────────────┴────────────┴────────────┴────────────┴────────────
```
2021-09-19 15:37:54 -05:00
Tw
4f7b423f36 Support completion when cursor inside an argument (#4023)
* Support completion when cursor inside an argument

Bash supports completion even when the cursor is inside an argument; this is very useful for fixing up an argument after the initial completion.
Let's add this feature as well.

Signed-off-by: Tw <wei.tan@intel.com>

* Add test for when cursor inside an argument

To support testing this case, let's also take the cursor position into account.

Signed-off-by: Tw <wei.tan@intel.com>
2021-09-19 17:23:05 +12:00
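A rough sketch of taking the cursor position into account (illustrative only, not the actual completer; `token_at` is an invented helper that splits on spaces):

```rust
// Find the byte span of the token the cursor sits in, even when the
// cursor is in the middle of an argument rather than at the end.
fn token_at(line: &str, pos: usize) -> (usize, usize) {
    let bytes = line.as_bytes();
    let mut start = pos;
    while start > 0 && bytes[start - 1] != b' ' {
        start -= 1;
    }
    let mut end = pos;
    while end < bytes.len() && bytes[end] != b' ' {
        end += 1;
    }
    (start, end) // complete the whole token, not just the prefix
}
```

The completer can then replace the full `(start, end)` span instead of assuming the cursor is at the end of the line.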
f7043bf690 Fix #3090: let binding in command leaks when error occurs (#4022) 2021-09-19 14:57:20 +12:00
Tw
1297499d7a add command g to switch shell quickly (#4014)
Signed-off-by: Tw <tw19881113@gmail.com>
2021-09-17 10:39:14 +01:00
bd0baa961c add table selector for downloading web tables (#4004)
* add table selector for downloading web tables

* type-o

* updated debug mode to inspect mode
2021-09-16 09:02:30 -05:00
4ee536f044 fix: enable SIMD (#4021) 2021-09-16 20:01:42 +12:00
JT
8581bec891 bump 0.37.1 (#4019) 2021-09-16 13:32:22 +12:00
8bcbc8eeb3 Move nu-path tests to integration tests (#4015)
* Move nu-path tests to integration tests

To prevent circular dependency between nu-path and nu-test-support crates.

* Fmt
2021-09-16 07:11:28 +12:00
c164ef5489 Update to polars 0.16 (#4013)
* update to polars 0.16

* enabled features for polars
2021-09-16 07:10:12 +12:00
cc3653cfd9 Path commands: Put column path args behind flag; Allow path join appending without flag (#4008)
* Change path join signature

* Appending now works without flag
* Column path operation is behind a -c flag

* Move column path arg retrieval to a function

Also improves errors

* Fix path join tests

* Propagate column path changes to all path commands

* Update path command examples with columns paths

* Modernize path command examples by removing "echo"

* Improve structured path error message

* Fix typo
2021-09-15 21:03:51 +03:00
JT
7fc65067cf Temporarily remove the circular dep (#4009) 2021-09-15 09:17:31 +12:00
168 changed files with 4059 additions and 2660 deletions


docker-publish.yml (deleted)

@@ -1,118 +0,0 @@
name: Publish consumable Docker images
on:
  push:
    tags: ['v?[0-9]+.[0-9]+.[0-9]+*']
jobs:
  compile:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        arch:
          - x86_64-unknown-linux-musl
          - x86_64-unknown-linux-gnu
    steps:
      - uses: actions/checkout@v2
      - name: Install rust-embedded/cross
        env: { VERSION: v0.1.16 }
        run: >-
          wget -nv https://github.com/rust-embedded/cross/releases/download/${VERSION}/cross-${VERSION}-x86_64-unknown-linux-gnu.tar.gz
          -O- | sudo tar xz -C /usr/local/bin/
      - name: compile for specific target
        env: { arch: '${{ matrix.arch }}' }
        run: |
          cross build --target ${{ matrix.arch }} --release
          # leave only the executable file
          rm -frd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
          find . -empty -delete
      - uses: actions/upload-artifact@master
        with:
          name: ${{ matrix.arch }}
          path: target/${{ matrix.arch }}/release
  docker:
    name: Build and publish docker images
    needs: compile
    runs-on: ubuntu-latest
    env:
      DOCKER_REGISTRY: quay.io/nushell
      DOCKER_PASSWORD: ${{ secrets.DOCKER_REGISTRY }}
      DOCKER_USER: ${{ secrets.DOCKER_USER }}
    strategy:
      matrix:
        tag:
          - alpine
          - slim
          - debian
          - glibc-busybox
          - musl-busybox
          - musl-distroless
          - glibc-distroless
          - glibc
          - musl
        include:
          - { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true, use-patch: false}
          - { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
          - { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
          - { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
          - { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
          - { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
          - { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
          - { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: false}
          - { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
    steps:
      - uses: actions/checkout@v2
      - uses: actions/download-artifact@master
        with: { name: '${{ matrix.arch }}', path: target/release }
      - name: Build and publish exact version
        run: |-
          export DOCKER_TAG=${GITHUB_REF##*/}-${{ matrix.tag }}
          export NU_BINS=target/release/$( [ ${{ matrix.plugin }} = true ] && echo nu* || echo nu )
          export PATCH=$([ ${{ matrix.use-patch }} = true ] && echo .${{ matrix.tag }} || echo '')
          chmod +x $NU_BINS
          echo ${DOCKER_PASSWORD} | docker login ${DOCKER_REGISTRY} -u ${DOCKER_USER} --password-stdin
          docker-compose --file docker/docker-compose.package.yml build
          docker-compose --file docker/docker-compose.package.yml push # exact version
        env:
          BASE_IMAGE: ${{ matrix.base-image }}
      #region semantics tagging
      - name: Retag and push with suffixed version
        run: |-
          VERSION=${GITHUB_REF##*/}
          latest_version=${VERSION%%%.*}-${{ matrix.tag }}
          latest_feature=${VERSION%%.*}-${{ matrix.tag }}
          latest_patch=${VERSION%.*}-${{ matrix.tag }}
          exact_version=${VERSION}-${{ matrix.tag }}
          tags=( ${latest_version} ${latest_feature} ${latest_patch} ${exact_version} )
          for tag in ${tags[@]}; do
            docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
            docker push ${DOCKER_REGISTRY}/nu:${tag}
          done
          # latest version
          docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
          docker push ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
      - name: Retag and push debian as latest
        if: matrix.tag == 'debian'
        run: |-
          VERSION=${GITHUB_REF##*/}
          # ${latest features} ${latest patch} ${exact version}
          tags=( ${VERSION%%.*} ${VERSION%.*} ${VERSION} )
          for tag in ${tags[@]}; do
            docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
            docker push ${DOCKER_REGISTRY}/nu:${tag}
          done
          # latest version
          docker tag ${DOCKER_REGISTRY}/nu:${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:latest
          docker push ${DOCKER_REGISTRY}/nu:latest
      #endregion semantics tagging


@@ -1,8 +1,9 @@
 name: Create Release Draft
 on:
+  workflow_dispatch:
   push:
-    tags: ['[0-9]+.[0-9]+.[0-9]+*']
+    tags: ["[0-9]+.[0-9]+.[0-9]+*"]

 jobs:
   linux:
@@ -28,6 +29,60 @@ jobs:
           command: build
           args: --release --all --features=extra

+      # - name: Strip binaries (nu)
+      #   run: strip target/release/nu
+      # - name: Strip binaries (nu_plugin_inc)
+      #   run: strip target/release/nu_plugin_inc
+      # - name: Strip binaries (nu_plugin_match)
+      #   run: strip target/release/nu_plugin_match
+      # - name: Strip binaries (nu_plugin_textview)
+      #   run: strip target/release/nu_plugin_textview
+      # - name: Strip binaries (nu_plugin_binaryview)
+      #   run: strip target/release/nu_plugin_binaryview
+      # - name: Strip binaries (nu_plugin_chart_bar)
+      #   run: strip target/release/nu_plugin_chart_bar
+      # - name: Strip binaries (nu_plugin_chart_line)
+      #   run: strip target/release/nu_plugin_chart_line
+      # - name: Strip binaries (nu_plugin_from_bson)
+      #   run: strip target/release/nu_plugin_from_bson
+      # - name: Strip binaries (nu_plugin_from_sqlite)
+      #   run: strip target/release/nu_plugin_from_sqlite
+      # - name: Strip binaries (nu_plugin_from_mp4)
+      #   run: strip target/release/nu_plugin_from_mp4
+      # - name: Strip binaries (nu_plugin_query_json)
+      #   run: strip target/release/nu_plugin_query_json
+      # - name: Strip binaries (nu_plugin_s3)
+      #   run: strip target/release/nu_plugin_s3
+      # - name: Strip binaries (nu_plugin_selector)
+      #   run: strip target/release/nu_plugin_selector
+      # - name: Strip binaries (nu_plugin_start)
+      #   run: strip target/release/nu_plugin_start
+      # - name: Strip binaries (nu_plugin_to_bson)
+      #   run: strip target/release/nu_plugin_to_bson
+      # - name: Strip binaries (nu_plugin_to_sqlite)
+      #   run: strip target/release/nu_plugin_to_sqlite
+      # - name: Strip binaries (nu_plugin_tree)
+      #   run: strip target/release/nu_plugin_tree
+      # - name: Strip binaries (nu_plugin_xpath)
+      #   run: strip target/release/nu_plugin_xpath
+
       - name: Create output directory
         run: mkdir output
@@ -70,6 +125,60 @@ jobs:
           command: build
           args: --release --all --features=extra

+      # - name: Strip binaries (nu)
+      #   run: strip target/release/nu
+      # - name: Strip binaries (nu_plugin_inc)
+      #   run: strip target/release/nu_plugin_inc
+      # - name: Strip binaries (nu_plugin_match)
+      #   run: strip target/release/nu_plugin_match
+      # - name: Strip binaries (nu_plugin_textview)
+      #   run: strip target/release/nu_plugin_textview
+      # - name: Strip binaries (nu_plugin_binaryview)
+      #   run: strip target/release/nu_plugin_binaryview
+      # - name: Strip binaries (nu_plugin_chart_bar)
+      #   run: strip target/release/nu_plugin_chart_bar
+      # - name: Strip binaries (nu_plugin_chart_line)
+      #   run: strip target/release/nu_plugin_chart_line
+      # - name: Strip binaries (nu_plugin_from_bson)
+      #   run: strip target/release/nu_plugin_from_bson
+      # - name: Strip binaries (nu_plugin_from_sqlite)
+      #   run: strip target/release/nu_plugin_from_sqlite
+      # - name: Strip binaries (nu_plugin_from_mp4)
+      #   run: strip target/release/nu_plugin_from_mp4
+      # - name: Strip binaries (nu_plugin_query_json)
+      #   run: strip target/release/nu_plugin_query_json
+      # - name: Strip binaries (nu_plugin_s3)
+      #   run: strip target/release/nu_plugin_s3
+      # - name: Strip binaries (nu_plugin_selector)
+      #   run: strip target/release/nu_plugin_selector
+      # - name: Strip binaries (nu_plugin_start)
+      #   run: strip target/release/nu_plugin_start
+      # - name: Strip binaries (nu_plugin_to_bson)
+      #   run: strip target/release/nu_plugin_to_bson
+      # - name: Strip binaries (nu_plugin_to_sqlite)
+      #   run: strip target/release/nu_plugin_to_sqlite
+      # - name: Strip binaries (nu_plugin_tree)
+      #   run: strip target/release/nu_plugin_tree
+      # - name: Strip binaries (nu_plugin_xpath)
+      #   run: strip target/release/nu_plugin_xpath
+
       - name: Create output directory
         run: mkdir output
@@ -114,6 +223,60 @@ jobs:
           command: build
           args: --release --all --features=extra

+      # - name: Strip binaries (nu.exe)
+      #   run: strip target/release/nu.exe
+      # - name: Strip binaries (nu_plugin_inc.exe)
+      #   run: strip target/release/nu_plugin_inc.exe
+      # - name: Strip binaries (nu_plugin_match.exe)
+      #   run: strip target/release/nu_plugin_match.exe
+      # - name: Strip binaries (nu_plugin_textview.exe)
+      #   run: strip target/release/nu_plugin_textview.exe
+      # - name: Strip binaries (nu_plugin_binaryview.exe)
+      #   run: strip target/release/nu_plugin_binaryview.exe
+      # - name: Strip binaries (nu_plugin_chart_bar.exe)
+      #   run: strip target/release/nu_plugin_chart_bar.exe
+      # - name: Strip binaries (nu_plugin_chart_line.exe)
+      #   run: strip target/release/nu_plugin_chart_line.exe
+      # - name: Strip binaries (nu_plugin_from_bson.exe)
+      #   run: strip target/release/nu_plugin_from_bson.exe
+      # - name: Strip binaries (nu_plugin_from_sqlite.exe)
+      #   run: strip target/release/nu_plugin_from_sqlite.exe
+      # - name: Strip binaries (nu_plugin_from_mp4.exe)
+      #   run: strip target/release/nu_plugin_from_mp4.exe
+      # - name: Strip binaries (nu_plugin_query_json.exe)
+      #   run: strip target/release/nu_plugin_query_json.exe
+      # - name: Strip binaries (nu_plugin_s3.exe)
+      #   run: strip target/release/nu_plugin_s3.exe
+      # - name: Strip binaries (nu_plugin_selector.exe)
+      #   run: strip target/release/nu_plugin_selector.exe
+      # - name: Strip binaries (nu_plugin_start.exe)
+      #   run: strip target/release/nu_plugin_start.exe
+      # - name: Strip binaries (nu_plugin_to_bson.exe)
+      #   run: strip target/release/nu_plugin_to_bson.exe
+      # - name: Strip binaries (nu_plugin_to_sqlite.exe)
+      #   run: strip target/release/nu_plugin_to_sqlite.exe
+      # - name: Strip binaries (nu_plugin_tree.exe)
+      #   run: strip target/release/nu_plugin_tree.exe
+      # - name: Strip binaries (nu_plugin_xpath.exe)
+      #   run: strip target/release/nu_plugin_xpath.exe
+
       - name: Create output directory
         run: mkdir output


stale.yml

@@ -19,11 +19,10 @@ jobs:
         operations-per-run: 520
         enable-statistics: true
         repo-token: ${{ secrets.GITHUB_TOKEN }}
-        stale-issue-message: 'This issue is being marked stale because it has been open for 90 days without activity. If you feel that this is in error, please comment below and we will keep it marked as active.'
-        stale-pr-message: 'This PR is being marked stale because it has been open for 45 days without activity. If this PR is still active, please comment below and we will keep it marked as active.'
-        close-issue-message: 'This issue has been marked stale for more than 10 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
-        close-pr-message: 'This PR has been marked stale for more than 10 days without activity. Closing this PR, but if you are still working on it, please reopen.'
+        close-issue-message: 'This issue has been marked stale for more than 100000 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
+        close-pr-message: 'This PR has been marked stale for more than 100 days without activity. Closing this PR, but if you are still working on it, please reopen.'
         days-before-issue-stale: 90
         days-before-pr-stale: 45
-        days-before-issue-close: 10
-        days-before-pr-close: 10
+        days-before-issue-close: 100000
+        days-before-pr-close: 100
+        exempt-issue-labels: 'exempt,keep'

.gitpod.Dockerfile (vendored, 2 changed lines)

@@ -2,7 +2,7 @@ FROM gitpod/workspace-full
 # Gitpod will not rebuild Nushell's dev image unless *some* change is made to this Dockerfile.
 # To force a rebuild, simply increase this counter:
-ENV TRIGGER_REBUILD 1
+ENV TRIGGER_REBUILD 2

 USER gitpod

Cargo.lock (generated, 1676 changed lines): file diff suppressed because it is too large.

Cargo.toml

@@ -10,7 +10,7 @@ license = "MIT"
 name = "nu"
 readme = "README.md"
 repository = "https://github.com/nushell/nushell"
-version = "0.37.0"
+version = "0.41.0"

 [workspace]
 members = ["crates/*/"]
@@ -18,34 +18,34 @@ members = ["crates/*/"]
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

 [dependencies]
-nu-cli = { version = "0.37.0", path="./crates/nu-cli", default-features=false }
-nu-command = { version = "0.37.0", path="./crates/nu-command" }
-nu-completion = { version = "0.37.0", path="./crates/nu-completion" }
-nu-data = { version = "0.37.0", path="./crates/nu-data" }
-nu-engine = { version = "0.37.0", path="./crates/nu-engine" }
-nu-errors = { version = "0.37.0", path="./crates/nu-errors" }
-nu-parser = { version = "0.37.0", path="./crates/nu-parser" }
-nu-path = { version = "0.37.0", path="./crates/nu-path" }
-nu-plugin = { version = "0.37.0", path="./crates/nu-plugin" }
-nu-protocol = { version = "0.37.0", path="./crates/nu-protocol" }
-nu-source = { version = "0.37.0", path="./crates/nu-source" }
-nu-value-ext = { version = "0.37.0", path="./crates/nu-value-ext" }
-nu_plugin_binaryview = { version = "0.37.0", path="./crates/nu_plugin_binaryview", optional=true }
-nu_plugin_chart = { version = "0.37.0", path="./crates/nu_plugin_chart", optional=true }
-nu_plugin_from_bson = { version = "0.37.0", path="./crates/nu_plugin_from_bson", optional=true }
-nu_plugin_from_sqlite = { version = "0.37.0", path="./crates/nu_plugin_from_sqlite", optional=true }
-nu_plugin_inc = { version = "0.37.0", path="./crates/nu_plugin_inc", optional=true }
-nu_plugin_match = { version = "0.37.0", path="./crates/nu_plugin_match", optional=true }
-nu_plugin_query_json = { version = "0.37.0", path="./crates/nu_plugin_query_json", optional=true }
-nu_plugin_s3 = { version = "0.37.0", path="./crates/nu_plugin_s3", optional=true }
-nu_plugin_selector = { version = "0.37.0", path="./crates/nu_plugin_selector", optional=true }
-nu_plugin_start = { version = "0.37.0", path="./crates/nu_plugin_start", optional=true }
-nu_plugin_textview = { version = "0.37.0", path="./crates/nu_plugin_textview", optional=true }
-nu_plugin_to_bson = { version = "0.37.0", path="./crates/nu_plugin_to_bson", optional=true }
-nu_plugin_to_sqlite = { version = "0.37.0", path="./crates/nu_plugin_to_sqlite", optional=true }
-nu_plugin_tree = { version = "0.37.0", path="./crates/nu_plugin_tree", optional=true }
-nu_plugin_xpath = { version = "0.37.0", path="./crates/nu_plugin_xpath", optional=true }
+nu-cli = { version = "0.41.0", path="./crates/nu-cli", default-features=false }
+nu-command = { version = "0.41.0", path="./crates/nu-command" }
+nu-completion = { version = "0.41.0", path="./crates/nu-completion" }
+nu-data = { version = "0.41.0", path="./crates/nu-data" }
+nu-engine = { version = "0.41.0", path="./crates/nu-engine" }
+nu-errors = { version = "0.41.0", path="./crates/nu-errors" }
+nu-parser = { version = "0.41.0", path="./crates/nu-parser" }
+nu-path = { version = "0.41.0", path="./crates/nu-path" }
+nu-plugin = { version = "0.41.0", path="./crates/nu-plugin" }
+nu-protocol = { version = "0.41.0", path="./crates/nu-protocol" }
+nu-source = { version = "0.41.0", path="./crates/nu-source" }
+nu-value-ext = { version = "0.41.0", path="./crates/nu-value-ext" }
+nu_plugin_binaryview = { version = "0.41.0", path="./crates/nu_plugin_binaryview", optional=true }
+nu_plugin_chart = { version = "0.41.0", path="./crates/nu_plugin_chart", optional=true }
+nu_plugin_from_bson = { version = "0.41.0", path="./crates/nu_plugin_from_bson", optional=true }
+nu_plugin_from_sqlite = { version = "0.41.0", path="./crates/nu_plugin_from_sqlite", optional=true }
+nu_plugin_inc = { version = "0.41.0", path="./crates/nu_plugin_inc", optional=true }
+nu_plugin_match = { version = "0.41.0", path="./crates/nu_plugin_match", optional=true }
+nu_plugin_query_json = { version = "0.41.0", path="./crates/nu_plugin_query_json", optional=true }
+nu_plugin_s3 = { version = "0.41.0", path="./crates/nu_plugin_s3", optional=true }
+nu_plugin_selector = { version = "0.41.0", path="./crates/nu_plugin_selector", optional=true }
+nu_plugin_start = { version = "0.41.0", path="./crates/nu_plugin_start", optional=true }
+nu_plugin_textview = { version = "0.41.0", path="./crates/nu_plugin_textview", optional=true }
+nu_plugin_to_bson = { version = "0.41.0", path="./crates/nu_plugin_to_bson", optional=true }
+nu_plugin_to_sqlite = { version = "0.41.0", path="./crates/nu_plugin_to_sqlite", optional=true }
+nu_plugin_tree = { version = "0.41.0", path="./crates/nu_plugin_tree", optional=true }
+nu_plugin_xpath = { version = "0.41.0", path="./crates/nu_plugin_xpath", optional=true }

 # Required to bootstrap the main binary
 ctrlc = { version="3.1.7", optional=true }
@@ -53,7 +53,7 @@ futures = { version="0.3.12", features=["compat", "io-compat"] }
 itertools = "0.10.0"

 [dev-dependencies]
-nu-test-support = { version = "0.37.0", path="./crates/nu-test-support" }
+nu-test-support = { version = "0.41.0", path="./crates/nu-test-support" }
 serial_test = "0.5.1"
 hamcrest2 = "0.3.0"
 rstest = "0.10.0"
@@ -89,7 +89,6 @@ extra = [
     "inc",
     "tree",
     "textview",
-    "clipboard-cli",
     "trash-support",
     "uuid-support",
     "start",
@@ -113,7 +112,6 @@ textview = ["nu_plugin_textview"]
 binaryview = ["nu_plugin_binaryview"]
 bson = ["nu_plugin_from_bson", "nu_plugin_to_bson"]
 chart = ["nu_plugin_chart"]
-clipboard-cli = ["nu-command/clipboard-cli"]
 query-json = ["nu_plugin_query_json"]
 s3 = ["nu_plugin_s3"]
 selector = ["nu_plugin_selector"]
@@ -127,9 +125,6 @@ tree = ["nu_plugin_tree"]
 xpath = ["nu_plugin_xpath"]
 zip-support = ["nu-command/zip"]

-#This is disabled in extra for now
-table-pager = ["nu-command/table-pager"]
-
 #dataframe feature for nushell
 dataframe = [
     "nu-engine/dataframe",
@@ -141,7 +136,7 @@ dataframe = [
 ]

 [profile.release]
-opt-level = "z" # Optimize for size.
+opt-level = "s" # Optimize for size.

 # Core plugins that ship with `cargo install nu` by default
 # Currently, Cargo limits us to installing only one binary

LICENSE

@@ -1,6 +1,6 @@
 MIT License

-Copyright (c) 2019 - 2021 Yehuda Katz, Jonathan Turner
+Copyright (c) 2019 - 2021 Nushell Project

 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal

@@ -68,7 +68,7 @@ cargo install nu
To install Nu via the [Windows Package Manager](https://aka.ms/winget-cli): To install Nu via the [Windows Package Manager](https://aka.ms/winget-cli):
```shell ```shell
winget install nu winget install nushell
``` ```
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/installation.html#dependencies) for your platform), once you have checked out this repo with git: You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/installation.html#dependencies) for your platform), once you have checked out this repo with git:
@@ -76,53 +76,6 @@ You can also build Nu yourself with all the bells and whistles (be sure to have
```shell ```shell
cargo build --workspace --features=extra cargo build --workspace --features=extra
``` ```
### Docker
#### Quickstart
Want to try Nu right away? Execute the following to get started.
```shell
docker run -it quay.io/nushell/nu:latest
```
#### Guide
If you want to pull a pre-built container, you can browse tags for the [nushell organization](https://quay.io/organization/nushell)
on Quay.io. Pulling a container would come down to:
```shell
docker pull quay.io/nushell/nu
docker pull quay.io/nushell/nu-base
```
Both "nu-base" and "nu" provide the nu binary, however, nu-base also includes the source code at `/code`
in the container and all dependencies.
Optionally, you can also build the containers locally using the [dockerfiles provided](docker):
To build the base image:
```shell
docker build -f docker/Dockerfile.nu-base -t nushell/nu-base .
```
And then to build the smaller container (using a Multistage build):
```shell
docker build -f docker/Dockerfile -t nushell/nu .
```
Either way, you can run either container as follows:
```shell
docker run -it nushell/nu-base
docker run -it nushell/nu
/> exit
```
The second container is a bit smaller if the size is important to you.
### Packaging status ### Packaging status
[![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions) [![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions)

@@ -10,4 +10,4 @@ Foundational libraries are split into two kinds of crates:
Plugins are likewise also split into two types: Plugins are likewise also split into two types:
* Core plugins - plugins that provide part of the default experience of Nu, including access to the system properties, processes, and web-connectivity features. * Core plugins - plugins that provide part of the default experience of Nu, including access to the system properties, processes, and web-connectivity features.
* Extra plugins - these plugins run a wide range of differnt capabilities like working with different file types, charting, viewing binary data, and more. * Extra plugins - these plugins run a wide range of different capabilities like working with different file types, charting, viewing binary data, and more.

@@ -9,7 +9,7 @@ description = "Library for ANSI terminal colors and styles (bold, underline)"
edition = "2018" edition = "2018"
license = "MIT" license = "MIT"
name = "nu-ansi-term" name = "nu-ansi-term"
version = "0.37.0" version = "0.41.0"
[lib] [lib]
doctest = false doctest = false
@@ -21,7 +21,6 @@ derive_serde_style = ["serde"]
[dependencies] [dependencies]
overload = "0.1.1" overload = "0.1.1"
serde = { version="1.0.90", features=["derive"], optional=true } serde = { version="1.0.90", features=["derive"], optional=true }
itertools = "0.10.0"
# [dependencies.serde] # [dependencies.serde]
# version = "1.0.90" # version = "1.0.90"

@@ -613,7 +613,7 @@ mod serde_json_tests {
let serialized = serde_json::to_string(&color).unwrap(); let serialized = serde_json::to_string(&color).unwrap();
let deserialized: Color = serde_json::from_str(&serialized).unwrap(); let deserialized: Color = serde_json::from_str(&serialized).unwrap();
assert_eq!(color, &deserialized); assert_eq!(color, deserialized);
} }
} }

@@ -4,23 +4,24 @@ description = "CLI for nushell"
edition = "2018" edition = "2018"
license = "MIT" license = "MIT"
name = "nu-cli" name = "nu-cli"
version = "0.37.0" version = "0.41.0"
build = "build.rs" build = "build.rs"
[lib] [lib]
doctest = false doctest = false
[dependencies] [dependencies]
nu-completion = { version = "0.37.0", path="../nu-completion" } nu-completion = { version = "0.41.0", path="../nu-completion" }
nu-command = { version = "0.37.0", path="../nu-command" } nu-command = { version = "0.41.0", path="../nu-command" }
nu-data = { version = "0.37.0", path="../nu-data" } nu-data = { version = "0.41.0", path="../nu-data" }
nu-engine = { version = "0.37.0", path="../nu-engine" } nu-engine = { version = "0.41.0", path="../nu-engine" }
nu-errors = { version = "0.37.0", path="../nu-errors" } nu-errors = { version = "0.41.0", path="../nu-errors" }
nu-parser = { version = "0.37.0", path="../nu-parser" } nu-parser = { version = "0.41.0", path="../nu-parser" }
nu-protocol = { version = "0.37.0", path="../nu-protocol" } nu-protocol = { version = "0.41.0", path="../nu-protocol" }
nu-source = { version = "0.37.0", path="../nu-source" } nu-source = { version = "0.41.0", path="../nu-source" }
nu-stream = { version = "0.37.0", path="../nu-stream" } nu-stream = { version = "0.41.0", path="../nu-stream" }
nu-ansi-term = { version = "0.37.0", path="../nu-ansi-term" } nu-ansi-term = { version = "0.41.0", path="../nu-ansi-term" }
nu-path = { version = "0.41.0", path="../nu-path" }
indexmap ="1.6.1" indexmap ="1.6.1"
log = "0.4.14" log = "0.4.14"
@@ -28,13 +29,13 @@ pretty_env_logger = "0.4.0"
strip-ansi-escapes = "0.1.0" strip-ansi-escapes = "0.1.0"
rustyline = { version="9.0.0", optional=true } rustyline = { version="9.0.0", optional=true }
ctrlc = { version="3.1.7", optional=true } ctrlc = { version="3.1.7", optional=true }
shadow-rs = { version="0.6", default-features=false, optional=true } shadow-rs = { version = "0.8.1", default-features = false, optional = true }
serde = { version="1.0.123", features=["derive"] } serde = { version="1.0.123", features=["derive"] }
serde_yaml = "0.8.16" serde_yaml = "0.8.16"
lazy_static = "1.4.0" lazy_static = "1.4.0"
[build-dependencies] [build-dependencies]
shadow-rs = "0.6" shadow-rs = "0.8.1"
[features] [features]
default = ["shadow-rs"] default = ["shadow-rs"]

@@ -513,11 +513,6 @@ mod tests {
let args = format!("nu --loglevel={}", level); let args = format!("nu --loglevel={}", level);
ui.parse(&args)?; ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string())); assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
let ui = cli_app();
let args = format!("nu -l {}", level);
ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
} }
let ui = cli_app(); let ui = cli_app();
@@ -530,6 +525,17 @@ mod tests {
Ok(()) Ok(())
} }
#[test]
fn can_be_login() -> Result<(), ShellError> {
let ui = cli_app();
ui.parse("nu -l")?;
let ui = cli_app();
ui.parse("nu --login")?;
Ok(())
}
#[test] #[test]
fn can_be_passed_nu_scripts() -> Result<(), ShellError> { fn can_be_passed_nu_scripts() -> Result<(), ShellError> {
let ui = cli_app(); let ui = cli_app();

@@ -24,6 +24,7 @@ use rustyline::{self, error::ReadlineError};
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_parser::ParserScope; use nu_parser::ParserScope;
use nu_path::expand_tilde;
use nu_protocol::{hir::ExternalRedirection, ConfigPath, UntaggedValue, Value}; use nu_protocol::{hir::ExternalRedirection, ConfigPath, UntaggedValue, Value};
use log::trace; use log::trace;
@@ -54,7 +55,7 @@ pub fn search_paths() -> Vec<std::path::PathBuf> {
{ {
for pipeline in pipelines { for pipeline in pipelines {
if let Ok(plugin_dir) = pipeline.as_string() { if let Ok(plugin_dir) = pipeline.as_string() {
search_paths.push(PathBuf::from(plugin_dir)); search_paths.push(expand_tilde(plugin_dir));
} }
} }
} }
@@ -371,7 +372,7 @@ pub fn cli(
LineResult::ClearHistory => { LineResult::ClearHistory => {
if options.save_history { if options.save_history {
rl.clear_history(); rl.clear_history();
let _ = rl.append_history(&history_path); std::fs::remove_file(&history_path)?;
} }
} }

@@ -5,54 +5,47 @@ description = "CLI for nushell"
edition = "2018" edition = "2018"
license = "MIT" license = "MIT"
name = "nu-command" name = "nu-command"
version = "0.37.0" version = "0.41.0"
[lib] [lib]
doctest = false doctest = false
[dependencies] [dependencies]
nu-data = { version = "0.37.0", path="../nu-data" } nu-data = { version = "0.41.0", path="../nu-data" }
nu-engine = { version = "0.37.0", path="../nu-engine" } nu-engine = { version = "0.41.0", path="../nu-engine" }
nu-errors = { version = "0.37.0", path="../nu-errors" } nu-errors = { version = "0.41.0", path="../nu-errors" }
nu-json = { version = "0.37.0", path="../nu-json" } nu-json = { version = "0.41.0", path="../nu-json" }
nu-path = { version = "0.37.0", path="../nu-path" } nu-path = { version = "0.41.0", path="../nu-path" }
nu-parser = { version = "0.37.0", path="../nu-parser" } nu-parser = { version = "0.41.0", path="../nu-parser" }
nu-plugin = { version = "0.37.0", path="../nu-plugin" } nu-plugin = { version = "0.41.0", path="../nu-plugin" }
nu-protocol = { version = "0.37.0", path="../nu-protocol" } nu-protocol = { version = "0.41.0", path="../nu-protocol" }
nu-serde = { version = "0.37.0", path="../nu-serde" } nu-serde = { version = "0.41.0", path="../nu-serde" }
nu-source = { version = "0.37.0", path="../nu-source" } nu-source = { version = "0.41.0", path="../nu-source" }
nu-stream = { version = "0.37.0", path="../nu-stream" } nu-stream = { version = "0.41.0", path="../nu-stream" }
nu-table = { version = "0.37.0", path="../nu-table" } nu-table = { version = "0.41.0", path="../nu-table" }
nu-test-support = { version = "0.37.0", path="../nu-test-support" } nu-test-support = { version = "0.41.0", path="../nu-test-support" }
nu-value-ext = { version = "0.37.0", path="../nu-value-ext" } nu-value-ext = { version = "0.41.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.37.0", path="../nu-ansi-term" } nu-ansi-term = { version = "0.41.0", path="../nu-ansi-term" }
nu-pretty-hex = { version = "0.37.0", path="../nu-pretty-hex" } nu-pretty-hex = { version = "0.41.0", path="../nu-pretty-hex" }
url = "2.2.1" url = "2.2.1"
mime = "0.3.16" mime = "0.3.16"
Inflector = "0.11" Inflector = "0.11"
arboard = { version="1.1.0", optional=true }
base64 = "0.13.0" base64 = "0.13.0"
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] } bigdecimal = { version = "0.3.0", features = ["serde"] }
byte-unit = "4.0.9"
bytes = "1.0.1"
calamine = "0.18.0" calamine = "0.18.0"
chrono = { version="0.4.19", features=["serde"] } chrono = { version="0.4.19", features=["serde"] }
chrono-tz = "0.5.3" chrono-tz = "0.5.3"
codespan-reporting = "0.11.0"
crossterm = { version="0.19.0", optional=true } crossterm = { version="0.19.0", optional=true }
csv = "1.1.3" csv = "1.1.3"
ctrlc = { version="3.1.7", optional=true } ctrlc = { version="3.1.7", optional=true }
derive-new = "0.5.8" derive-new = "0.5.8"
directories-next = "2.0.0"
dirs-next = "2.0.0" dirs-next = "2.0.0"
dtparse = "1.2.0" dtparse = "1.2.0"
eml-parser = "0.1.0" eml-parser = "0.1.0"
encoding_rs = "0.8.28" encoding_rs = "0.8.28"
filesize = "0.2.0" filesize = "0.2.0"
fs_extra = "1.2.0"
futures = { version="0.3.12", features=["compat", "io-compat"] } futures = { version="0.3.12", features=["compat", "io-compat"] }
getset = "0.1.1"
glob = "0.3.0" glob = "0.3.0"
htmlescape = "0.3.1" htmlescape = "0.3.1"
ical = "0.7.0" ical = "0.7.0"
@@ -62,41 +55,32 @@ lazy_static = "1.*"
log = "0.4.14" log = "0.4.14"
md-5 = "0.9.1" md-5 = "0.9.1"
meval = "0.2.0" meval = "0.2.0"
minus = { version="3.4.0", optional=true, features=["async_std_lib", "search"] } num-bigint = { version="0.4.3", features=["serde"] }
num-bigint = { version="0.3.1", features=["serde"] }
num-format = { version="0.4.0", features=["with-num-bigint"] } num-format = { version="0.4.0", features=["with-num-bigint"] }
num-traits = "0.2.14" num-traits = "0.2.14"
parking_lot = "0.11.1" parking_lot = "0.11.1"
pin-utils = "0.1.0"
query_interface = "0.3.5"
quick-xml = "0.22" quick-xml = "0.22"
rand = "0.8" rand = "0.8"
rayon = "1.5.0"
regex = "1.4.3" regex = "1.4.3"
reqwest = {version = "0.11", optional = true } reqwest = {version = "0.11", optional = true }
roxmltree = "0.14.0" roxmltree = "0.14.0"
rust-embed = "5.9.0" rust-embed = "5.9.0"
rustyline = { version="9.0.0", optional=true } rustyline = { version="9.0.0", optional=true }
serde = { version="1.0.123", features=["derive"] } serde = { version="1.0.123", features=["derive"] }
serde_bytes = "0.11.5"
serde_ini = "0.2.0" serde_ini = "0.2.0"
serde_json = "1.0.61" serde_json = "1.0.61"
serde_urlencoded = "0.7.0" serde_urlencoded = "0.7.0"
serde_yaml = "0.8.16" serde_yaml = "0.8.16"
sha2 = "0.9.3" sha2 = "0.9.3"
strip-ansi-escapes = "0.1.0" strip-ansi-escapes = "0.1.0"
sxd-document = "0.3.2" sysinfo = { version = "0.21.1", optional = true }
sxd-xpath = "0.4.2"
sysinfo = { version = "0.20.2", optional = true }
thiserror = "1.0.26" thiserror = "1.0.26"
tempfile = "3.2.0"
term = { version="0.7.0", optional=true } term = { version="0.7.0", optional=true }
term_size = "0.3.2" term_size = "0.3.2"
termcolor = "1.1.2"
titlecase = "1.1.0" titlecase = "1.1.0"
tokio = { version = "1", features = ["rt-multi-thread"], optional = true } tokio = { version = "1", features = ["rt-multi-thread"], optional = true }
toml = "0.5.8" toml = "0.5.8"
trash = { version="1.3.0", optional=true } trash = { version = "2.0.2", optional = true }
unicode-segmentation = "1.8" unicode-segmentation = "1.8"
uuid_crate = { package="uuid", version="0.8.2", features=["v4"], optional=true } uuid_crate = { package="uuid", version="0.8.2", features=["v4"], optional=true }
which = { version="4.1.0", optional=true } which = { version="4.1.0", optional=true }
@@ -104,9 +88,10 @@ zip = { version="0.5.9", optional=true }
digest = "0.9.0" digest = "0.9.0"
[dependencies.polars] [dependencies.polars]
version = "0.15.1" version = "0.17.0"
optional = true optional = true
features = ["parquet", "json", "random", "pivot", "strings", "is_in", "temporal"] default-features = false
features = ["docs", "zip_with", "csv-file", "temporal", "performant", "pretty_fmt", "dtype-slim", "parquet", "json", "random", "pivot", "strings", "is_in", "cum_agg", "rolling_window"]
[target.'cfg(unix)'.dependencies] [target.'cfg(unix)'.dependencies]
umask = "1.0.0" umask = "1.0.0"
@@ -115,16 +100,11 @@ users = "0.11.0"
# TODO this will be possible with new dependency resolver # TODO this will be possible with new dependency resolver
# (currently on nightly behind -Zfeatures=itarget): # (currently on nightly behind -Zfeatures=itarget):
# https://github.com/rust-lang/cargo/issues/7914 # https://github.com/rust-lang/cargo/issues/7914
#[target.'cfg(not(windows))'.dependencies] # [target.'cfg(not(windows))'.dependencies]
#num-format = {version = "0.4", features = ["with-system-locale"]} # num-format = { version = "0.4", features = ["with-system-locale"] }
[dependencies.rusqlite]
features = ["bundled", "blob"]
optional = true
version = "0.25.3"
[build-dependencies] [build-dependencies]
shadow-rs = "0.6" shadow-rs = "0.8.1"
[dev-dependencies] [dev-dependencies]
quickcheck = "1.0.3" quickcheck = "1.0.3"
@@ -132,11 +112,9 @@ quickcheck_macros = "1.0.0"
hamcrest2 = "0.3.0" hamcrest2 = "0.3.0"
[features] [features]
clipboard-cli = ["arboard"]
rustyline-support = ["rustyline"] rustyline-support = ["rustyline"]
stable = [] stable = []
trash-support = ["trash"] trash-support = ["trash"]
table-pager = ["minus", "crossterm"]
dataframe = ["nu-protocol/dataframe", "polars"] dataframe = ["nu-protocol/dataframe", "polars"]
fetch = ["reqwest", "tokio"] fetch = ["reqwest", "tokio"]
post = ["reqwest", "tokio"] post = ["reqwest", "tokio"]

@@ -1,7 +0,0 @@
use derive_new::new;
use nu_protocol::hir;
#[derive(new, Debug)]
pub(crate) struct Command {
pub(crate) args: hir::Call,
}

@@ -1,8 +1,10 @@
use crate::prelude::*; use crate::prelude::*;
use lazy_static::lazy_static;
use nu_engine::{evaluate_baseline_expr, BufCodecReader}; use nu_engine::{evaluate_baseline_expr, BufCodecReader};
use nu_engine::{MaybeTextCodec, StringOrBinary}; use nu_engine::{MaybeTextCodec, StringOrBinary};
use nu_test_support::NATIVE_PATH_ENV_VAR; use nu_test_support::NATIVE_PATH_ENV_VAR;
use parking_lot::Mutex; use parking_lot::Mutex;
use regex::Regex;
#[allow(unused)] #[allow(unused)]
use std::env; use std::env;
@@ -44,20 +46,16 @@ pub(crate) fn run_external_command(
} }
#[allow(unused)] #[allow(unused)]
fn trim_double_quotes(input: &str) -> String { fn trim_enclosing_quotes(input: &str) -> String {
let mut chars = input.chars(); let mut chars = input.chars();
match (chars.next(), chars.next_back()) { match (chars.next(), chars.next_back()) {
(Some('"'), Some('"')) => chars.collect(), (Some('"'), Some('"')) => chars.collect(),
(Some('\''), Some('\'')) => chars.collect(),
_ => input.to_string(), _ => input.to_string(),
} }
} }
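A self-contained sketch of the renamed helper can be exercised outside the Nushell codebase; this mirrors the logic above (both the double- and single-quote arms) and is illustrative rather than the crate's actual code:

```rust
// Strip one matching pair of enclosing double or single quotes;
// anything else is returned unchanged.
fn trim_enclosing_quotes(input: &str) -> String {
    let mut chars = input.chars();
    match (chars.next(), chars.next_back()) {
        (Some('"'), Some('"')) | (Some('\''), Some('\'')) => chars.collect(),
        _ => input.to_string(),
    }
}

fn main() {
    assert_eq!(trim_enclosing_quotes("\"hello\""), "hello");
    assert_eq!(trim_enclosing_quotes("'hello'"), "hello");
    // Mismatched or absent quotes are left alone.
    assert_eq!(trim_enclosing_quotes("\"hello'"), "\"hello'");
    assert_eq!(trim_enclosing_quotes("hello"), "hello");
    println!("ok");
}
```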
#[allow(unused)]
fn escape_where_needed(input: &str) -> String {
input.split(' ').join("\\ ").split('\'').join("\\'")
}
fn run_with_stdin( fn run_with_stdin(
command: ExternalCommand, command: ExternalCommand,
context: &mut EvaluationContext, context: &mut EvaluationContext,
@@ -115,15 +113,9 @@ fn run_with_stdin(
#[cfg(not(windows))] #[cfg(not(windows))]
{ {
if !_is_literal { if !_is_literal {
let escaped = escape_double_quotes(&arg); arg
add_double_quotes(&escaped)
} else { } else {
let trimmed = trim_double_quotes(&arg); trim_enclosing_quotes(&arg)
if trimmed != arg {
escape_where_needed(&trimmed)
} else {
trimmed
}
} }
} }
#[cfg(windows)] #[cfg(windows)]
@@ -131,7 +123,7 @@ fn run_with_stdin(
if let Some(unquoted) = remove_quotes(&arg) { if let Some(unquoted) = remove_quotes(&arg) {
unquoted.to_string() unquoted.to_string()
} else { } else {
arg.to_string() arg
} }
} }
}) })
@@ -172,9 +164,29 @@ fn spawn_cmd_command(command: &ExternalCommand, args: &[String]) -> Command {
process process
} }
fn has_unsafe_shell_characters(arg: &str) -> bool {
lazy_static! {
static ref RE: Regex = Regex::new(r"[^\w@%+=:,./-]").expect("regex to be valid");
}
RE.is_match(arg)
}
fn shell_arg_escape(arg: &str) -> String {
match arg {
"" => String::from("''"),
s if !has_unsafe_shell_characters(s) => String::from(s),
_ => {
let single_quotes_escaped = arg.split('\'').join("'\"'\"'");
format!("'{}'", single_quotes_escaped)
}
}
}
/// Spawn a sh command with `sh -c args...` /// Spawn a sh command with `sh -c args...`
fn spawn_sh_command(command: &ExternalCommand, args: &[String]) -> Command { fn spawn_sh_command(command: &ExternalCommand, args: &[String]) -> Command {
let cmd_with_args = vec![command.name.clone(), args.join(" ")].join(" "); let joined_and_escaped_arguments = args.iter().map(|arg| shell_arg_escape(arg)).join(" ");
let cmd_with_args = vec![command.name.clone(), joined_and_escaped_arguments].join(" ");
let mut process = Command::new("sh"); let mut process = Command::new("sh");
process.arg("-c").arg(cmd_with_args); process.arg("-c").arg(cmd_with_args);
process process
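The quoting rule introduced here is the one Python's `shlex.quote` uses: pass arguments made only of safe characters through untouched, otherwise wrap them in single quotes, splicing any embedded single quote as `'"'"'`. A dependency-free sketch of the technique (the explicit character class stands in for the regex `[^\w@%+=:,./-]`; names are illustrative, not the Nushell internals above):

```rust
// True when the argument contains only characters that need no quoting
// in a POSIX shell (roughly the complement of [^\w@%+=:,./-]).
fn is_shell_safe(arg: &str) -> bool {
    !arg.is_empty()
        && arg.chars().all(|c| {
            c.is_ascii_alphanumeric()
                || matches!(c, '_' | '@' | '%' | '+' | '=' | ':' | ',' | '.' | '/' | '-')
        })
}

// Quote a single argument for `sh -c`, shlex.quote-style.
fn shell_arg_escape(arg: &str) -> String {
    if is_shell_safe(arg) {
        arg.to_string()
    } else {
        format!("'{}'", arg.replace('\'', "'\"'\"'"))
    }
}

fn main() {
    assert_eq!(shell_arg_escape("Cargo.toml"), "Cargo.toml");
    assert_eq!(shell_arg_escape("$(ls)"), "'$(ls)'"); // subshells are neutralized
    assert_eq!(shell_arg_escape("a;b && c"), "'a;b && c'"); // ; and & lose their meaning
    assert_eq!(shell_arg_escape("it's"), "'it'\"'\"'s'");
    assert_eq!(shell_arg_escape(""), "''");
    println!("ok");
}
```

Single quotes disable all shell metacharacters except the quote itself, which is why the one splice rule is enough to make arbitrary arguments inert.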

@@ -1,5 +1 @@
mod dynamic;
pub(crate) mod external; pub(crate) mod external;
#[allow(unused_imports)]
pub(crate) use dynamic::Command as DynamicCommand;

@@ -38,7 +38,7 @@ impl WholeStreamCommand for SubCommand {
}, },
Example { Example {
description: "Set coloring options", description: "Set coloring options",
example: "config set color_config [[header_align header_bold]; [left $true]]", example: "config set color_config [[header_align header_color]; [left white_bold]]",
result: None, result: None,
}, },
Example { Example {

@@ -0,0 +1,118 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, Primitive, Signature, SyntaxShape, UntaggedValue, Value};
pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"into column_path"
}
fn signature(&self) -> Signature {
Signature::build("into column_path").rest(
"rest",
SyntaxShape::ColumnPath,
"values to convert to column_path",
)
}
fn usage(&self) -> &str {
"Convert value to column path"
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
into_filepath(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Convert string to column_path in table",
example: "echo [[name]; ['/dev/null'] ['C:\\Program Files'] ['../../Cargo.toml']] | into column_path name",
result: Some(vec![
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("/dev/null", Span::unknown()).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("C:\\Program Files", Span::unknown()).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("../../Cargo.toml", Span::unknown()).into(),
})
.into(),
]),
},
Example {
description: "Convert string to column_path",
example: "echo 'Cargo.toml' | into column_path",
result: Some(vec![UntaggedValue::column_path("Cargo.toml", Span::unknown()).into()]),
},
]
}
}
fn into_filepath(args: CommandArgs) -> Result<OutputStream, ShellError> {
let column_paths: Vec<ColumnPath> = args.rest(0)?;
Ok(args
.input
.map(move |v| {
if column_paths.is_empty() {
action(&v, v.tag())
} else {
let mut ret = v;
for path in &column_paths {
ret = ret.swap_data_by_column_path(
path,
Box::new(move |old| action(old, old.tag())),
)?;
}
Ok(ret)
}
})
.into_input_stream())
}
pub fn action(input: &Value, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let tag = tag.into();
match &input.value {
UntaggedValue::Primitive(prim) => Ok(UntaggedValue::column_path(
match prim {
Primitive::String(a_string) => a_string,
_ => {
return Err(ShellError::unimplemented(
"'into column_path' for non-string primitives",
))
}
},
Span::unknown(),
)
.into_value(&tag)),
UntaggedValue::Row(_) => Err(ShellError::labeled_error(
"specify column name to use, with 'into column_path COLUMN'",
"found table",
tag,
)),
_ => Err(ShellError::unimplemented(
"'into column_path' for unsupported type",
)),
}
}
#[cfg(test)]
mod tests {
use super::ShellError;
use super::SubCommand;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}

@@ -1,4 +1,5 @@
mod binary; mod binary;
mod column_path;
mod command; mod command;
mod filepath; mod filepath;
mod filesize; mod filesize;
@@ -7,6 +8,7 @@ pub mod string;
pub use self::filesize::SubCommand as IntoFilesize; pub use self::filesize::SubCommand as IntoFilesize;
pub use binary::SubCommand as IntoBinary; pub use binary::SubCommand as IntoBinary;
pub use column_path::SubCommand as IntoColumnPath;
pub use command::Command as Into; pub use command::Command as Into;
pub use filepath::SubCommand as IntoFilepath; pub use filepath::SubCommand as IntoFilepath;
pub use int::SubCommand as IntoInt; pub use int::SubCommand as IntoInt;

@@ -101,17 +101,14 @@ fn if_command(args: CommandArgs) -> Result<OutputStream, ShellError> {
//FIXME: should we use the scope that's brought in as well? //FIXME: should we use the scope that's brought in as well?
let condition = evaluate_baseline_expr(cond, &context); let condition = evaluate_baseline_expr(cond, &context);
match condition { let result = match condition {
Ok(condition) => match condition.as_bool() { Ok(condition) => match condition.as_bool() {
Ok(b) => { Ok(b) => {
let result = if b { if b {
run_block(&then_case.block, &context, input, external_redirection) run_block(&then_case.block, &context, input, external_redirection)
} else { } else {
run_block(&else_case.block, &context, input, external_redirection) run_block(&else_case.block, &context, input, external_redirection)
}; }
context.scope.exit_scope();
result
} }
Err(e) => Ok(OutputStream::from_stream( Err(e) => Ok(OutputStream::from_stream(
vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(), vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(),
@@ -120,13 +117,16 @@ fn if_command(args: CommandArgs) -> Result<OutputStream, ShellError> {
Err(e) => Ok(OutputStream::from_stream( Err(e) => Ok(OutputStream::from_stream(
vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(), vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(),
)), )),
} };
context.scope.exit_scope();
result
} }
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::If; use super::If;
use super::ShellError; use super::ShellError;
use nu_test_support::nu;
#[test] #[test]
fn examples_work_as_expected() -> Result<(), ShellError> { fn examples_work_as_expected() -> Result<(), ShellError> {
@@ -134,4 +134,21 @@ mod tests {
test_examples(If {}) test_examples(If {})
} }
#[test]
fn if_doesnt_leak_on_error() {
let actual = nu!(
".",
r#"
def test-leak [] {
let var = "hello"
if 0 == "" {echo ok} {echo not}
}
test-leak
echo $var
"#
);
assert!(actual.err.contains("unknown variable"));
}
} }
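The fix works because `exit_scope` now runs after the `match`, so every arm, including both error arms, unwinds the scope that was entered. The shape of the bug, and of the repair, can be sketched with toy types (illustrative names, not Nushell's actual `Scope` API): wrapping the branching logic in a helper guarantees a balanced enter/exit no matter which branch runs.

```rust
// Toy scope stack: `with_scope` guarantees a balanced enter/exit around
// the closure, whichever branch the closure's logic takes.
struct Scope {
    depth: usize,
}

impl Scope {
    fn enter_scope(&mut self) { self.depth += 1; }
    fn exit_scope(&mut self) { self.depth -= 1; }
}

fn with_scope<T>(scope: &mut Scope, body: impl FnOnce(&mut Scope) -> T) -> T {
    scope.enter_scope();
    let result = body(scope); // ok or error, we still fall through
    scope.exit_scope();       // mirrors moving exit_scope after the match
    result
}

fn main() {
    let mut scope = Scope { depth: 0 };
    // An "error" branch: the result is an Err, but the scope still unwinds.
    let r: Result<(), &str> = with_scope(&mut scope, |_s| Err("condition failed"));
    assert!(r.is_err());
    assert_eq!(scope.depth, 0); // no leaked scope, so $var is not visible later
    println!("ok");
}
```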

@@ -15,6 +15,7 @@ impl WholeStreamCommand for Command {
.switch("skip-plugins", "do not load plugins", None) .switch("skip-plugins", "do not load plugins", None)
.switch("no-history", "don't save history", None) .switch("no-history", "don't save history", None)
.switch("perf", "show startup performance metrics", None) .switch("perf", "show startup performance metrics", None)
.switch("login", "start Nu as if it was a login shell", Some('l'))
.named( .named(
"commands", "commands",
SyntaxShape::String, SyntaxShape::String,
@@ -33,7 +34,7 @@ impl WholeStreamCommand for Command {
"loglevel", "loglevel",
SyntaxShape::String, SyntaxShape::String,
"LEVEL: error, warn, info, debug, trace", "LEVEL: error, warn, info, debug, trace",
Some('l'), None,
) )
.named( .named(
"config-file", "config-file",

@@ -69,13 +69,11 @@ pub fn source(args: CommandArgs) -> Result<OutputStream, ShellError> {
for lib_path in dir { for lib_path in dir {
match lib_path { match lib_path {
Ok(name) => { Ok(name) => {
let path = canonicalize_with(&source_file, name).map_err(|e| { let path = if let Ok(p) = canonicalize_with(&source_file, name) {
ShellError::labeled_error( p
format!("Can't load source file. Reason: {}", e.to_string()), } else {
"Can't load this file", continue;
filename.span(), };
)
})?;
if let Ok(contents) = std::fs::read_to_string(path) { if let Ok(contents) = std::fs::read_to_string(path) {
let result = script::run_script_standalone(contents, true, ctx, false); let result = script::run_script_standalone(contents, true, ctx, false);

@@ -166,7 +166,7 @@ This will get the 3rd (note that `nth` is zero-based) row in the table created
by the `ls` command. You can use `nth` on any table created by other commands by the `ls` command. You can use `nth` on any table created by other commands
as well. as well.
You can also access the column of data in one of two ways. If you want to want You can also access the column of data in one of two ways. If you want
to keep the column as part of a new table, you can use `select`. to keep the column as part of a new table, you can use `select`.
``` ```
ls | select name ls | select name
@@ -274,7 +274,7 @@ This can be helpful if you want to later processes these values.
The `echo` command can pair well with the `each` command which can run The `echo` command can pair well with the `each` command which can run
code on each row, or item, of input. code on each row, or item, of input.
You can continue to learn more about the `echo` command by running: You can continue to learn more about the `each` command by running:
``` ```
tutor each tutor each
``` ```

@@ -221,11 +221,6 @@ fn features_enabled() -> Vec<String> {
names.push("zip".to_string()); names.push("zip".to_string());
} }
#[cfg(feature = "clipboard-cli")]
{
names.push("clipboard-cli".to_string());
}
#[cfg(feature = "trash-support")] #[cfg(feature = "trash-support")]
{ {
names.push("trash".to_string()); names.push("trash".to_string());

@@ -121,7 +121,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tail = df.as_ref().get_columns().iter().map(|col| { let tail = df.as_ref().get_columns().iter().map(|col| {
let count = col.len() as f64; let count = col.len() as f64;
let sum = match col.sum_as_series().cast_with_dtype(&DataType::Float64) { let sum = match col.sum_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) { Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v), AnyValue::Float64(v) => Some(v),
_ => None, _ => None,
@@ -144,7 +144,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
_ => None, _ => None,
}; };
let min = match col.min_as_series().cast_with_dtype(&DataType::Float64) { let min = match col.min_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) { Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v), AnyValue::Float64(v) => Some(v),
_ => None, _ => None,
@@ -153,7 +153,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
}; };
let q_25 = match col.quantile_as_series(0.25) { let q_25 = match col.quantile_as_series(0.25) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) { Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) { Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v), AnyValue::Float64(v) => Some(v),
_ => None, _ => None,
@@ -164,7 +164,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
}; };
let q_50 = match col.quantile_as_series(0.50) { let q_50 = match col.quantile_as_series(0.50) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) { Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) { Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v), AnyValue::Float64(v) => Some(v),
_ => None, _ => None,
@@ -175,7 +175,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
}; };
let q_75 = match col.quantile_as_series(0.75) { let q_75 = match col.quantile_as_series(0.75) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) { Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) { Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v), AnyValue::Float64(v) => Some(v),
_ => None, _ => None,
@@ -185,7 +185,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
Err(_) => None, Err(_) => None,
}; };
let max = match col.max_as_series().cast_with_dtype(&DataType::Float64) { let max = match col.max_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) { Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v), AnyValue::Float64(v) => Some(v),
_ => None, _ => None,

@@ -44,6 +44,12 @@ impl WholeStreamCommand for DataFrame {
"type of join. Inner by default", "type of join. Inner by default",
Some('t'), Some('t'),
) )
.named(
"suffix",
SyntaxShape::String,
"suffix for the columns of the right dataframe",
Some('s'),
)
} }
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> { fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@@ -104,6 +110,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let r_df: Value = args.req(0)?; let r_df: Value = args.req(0)?;
let l_col: Vec<Value> = args.req_named("left")?; let l_col: Vec<Value> = args.req_named("left")?;
let r_col: Vec<Value> = args.req_named("right")?; let r_col: Vec<Value> = args.req_named("right")?;
let r_suffix: Option<Tagged<String>> = args.get_flag("suffix")?;
let join_type_op: Option<Tagged<String>> = args.get_flag("type")?; let join_type_op: Option<Tagged<String>> = args.get_flag("type")?;
let join_type = match join_type_op { let join_type = match join_type_op {
@ -124,6 +131,8 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
}, },
}; };
let suffix = r_suffix.map(|s| s.item);
let (l_col_string, l_col_span) = convert_columns(&l_col, &tag)?; let (l_col_string, l_col_span) = convert_columns(&l_col, &tag)?;
let (r_col_string, r_col_span) = convert_columns(&r_col, &tag)?; let (r_col_string, r_col_span) = convert_columns(&r_col, &tag)?;
@ -142,7 +151,13 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
)?; )?;
df.as_ref() df.as_ref()
.join(r_df.as_ref(), &l_col_string, &r_col_string, join_type) .join(
r_df.as_ref(),
&l_col_string,
&r_col_string,
join_type,
suffix,
)
.map_err(|e| parse_polars_error::<&str>(&e, &l_col_span, None)) .map_err(|e| parse_polars_error::<&str>(&e, &l_col_span, None))
} }
_ => Err(ShellError::labeled_error( _ => Err(ShellError::labeled_error(
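The new `suffix` flag exists because a join can leave both frames contributing a column with the same name; the right-hand copy must be renamed to keep names unique. A sketch of that renaming in plain std Rust — `resolve_right_names` and the `"_right"` default are assumptions for illustration (this simplification also ignores that real joins do not suffix the key columns themselves):

```rust
use std::collections::BTreeSet;

// Rename right-hand columns that collide with left-hand ones by appending
// a suffix; columns with unique names pass through unchanged.
fn resolve_right_names(
    left_cols: &[&str],
    right_cols: &[&str],
    suffix: Option<&str>,
) -> Vec<String> {
    let left: BTreeSet<&str> = left_cols.iter().copied().collect();
    let suffix = suffix.unwrap_or("_right"); // default chosen here, not polars'
    right_cols
        .iter()
        .map(|c| {
            if left.contains(c) {
                format!("{}{}", c, suffix)
            } else {
                (*c).to_string()
            }
        })
        .collect()
}
```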


@@ -8,7 +8,7 @@ use nu_protocol::{
 };
 use nu_source::Tagged;
-use polars::prelude::{CsvEncoding, CsvReader, JsonReader, ParquetReader, PolarsError, SerReader};
+use polars::prelude::{CsvEncoding, CsvReader, JsonReader, ParquetReader, SerReader};
 use std::fs::File;
 pub struct DataFrame;
@@ -206,15 +206,6 @@ fn from_csv(args: CommandArgs) -> Result<polars::prelude::DataFrame, ShellError>
 match csv_reader.finish() {
 Ok(df) => Ok(df),
-Err(e) => match e {
-PolarsError::Other(_) => Err(ShellError::labeled_error_with_secondary(
-"Schema error",
-"Error with the inferred schema",
-&file.tag.span,
-"You can use the argument 'infer_schema' with a number of rows large enough to better infer the schema",
-&file.tag.span,
-)),
-_ => Err(parse_polars_error::<&str>(&e, &file.tag.span, None)),
-},
+Err(e) => Err(parse_polars_error::<&str>(&e, &file.tag.span, None)),
 }
 }


@@ -101,9 +101,9 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let cum_type = CumType::from_str(&cum_type.item, &cum_type.tag.span)?;
 let mut res = match cum_type {
-CumType::Max => series.cum_max(reverse),
-CumType::Min => series.cum_min(reverse),
-CumType::Sum => series.cum_sum(reverse),
+CumType::Max => series.cummax(reverse),
+CumType::Min => series.cummin(reverse),
+CumType::Sum => series.cumsum(reverse),
 };
 let name = format!("{}_{}", series.name(), cum_type.to_str());
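The cumulative family used here can be sketched over a plain slice with `Iterator::scan`, carrying the running total; the `reverse` flag processes right-to-left and flips the result back, which matches the usual cumulative-sum-with-reverse semantics. This is a std-only illustration, not polars' implementation:

```rust
// Cumulative sum over a slice. reverse=true accumulates from the right,
// so each position holds the sum of itself and everything after it.
fn cumsum(xs: &[f64], reverse: bool) -> Vec<f64> {
    let run = |it: &mut dyn Iterator<Item = &f64>| -> Vec<f64> {
        it.scan(0.0, |acc, &x| {
            *acc += x;
            Some(*acc)
        })
        .collect()
    };
    if reverse {
        let mut v = run(&mut xs.iter().rev());
        v.reverse();
        v
    } else {
        run(&mut xs.iter())
    }
}
```

Cumulative max and min follow the same shape with `f64::max` / `f64::min` in place of addition.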


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.day().into_series();


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.hour().into_series();


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.minute().into_series();


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.month().into_series();


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.nanosecond().into_series();


@@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.ordinal().into_series();


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.second().into_series();


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.week().into_series();


@@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.weekday().into_series();


@@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.year().into_series();


@@ -6,7 +6,7 @@ use nu_protocol::{
 Signature, SyntaxShape, UntaggedValue,
 };
 use nu_source::Tagged;
-use polars::prelude::DataType;
+use polars::prelude::{DataType, RollingOptions};
 enum RollType {
 Min,
@@ -57,7 +57,6 @@ impl WholeStreamCommand for DataFrame {
 Signature::build("dataframe rolling")
 .required("type", SyntaxShape::String, "rolling operation")
 .required("window", SyntaxShape::Int, "Window size for rolling")
-.switch("ignore_nulls", "Ignore nulls in column", Some('i'))
 }
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@@ -112,7 +111,6 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let roll_type: Tagged<String> = args.req(0)?;
 let window_size: Tagged<i64> = args.req(1)?;
-let ignore_nulls = args.has_flag("ignore_nulls");
 let (df, df_tag) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
 let series = df.as_series(&df_tag.span)?;
@@ -126,31 +124,17 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 }
 let roll_type = RollType::from_str(&roll_type.item, &roll_type.tag.span)?;
+let rolling_opts = RollingOptions {
+window_size: window_size.item as usize,
+min_periods: window_size.item as usize,
+weights: None,
+center: false,
+};
 let res = match roll_type {
-RollType::Max => series.rolling_max(
-window_size.item as u32,
-None,
-ignore_nulls,
-window_size.item as u32,
-),
-RollType::Min => series.rolling_min(
-window_size.item as u32,
-None,
-ignore_nulls,
-window_size.item as u32,
-),
-RollType::Sum => series.rolling_sum(
-window_size.item as u32,
-None,
-ignore_nulls,
-window_size.item as u32,
-),
-RollType::Mean => series.rolling_mean(
-window_size.item as u32,
-None,
-ignore_nulls,
-window_size.item as u32,
-),
+RollType::Max => series.rolling_max(rolling_opts),
+RollType::Min => series.rolling_min(rolling_opts),
+RollType::Sum => series.rolling_sum(rolling_opts),
+RollType::Mean => series.rolling_mean(rolling_opts),
 };
 let mut res = res.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
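The diff above sets `min_periods` equal to `window_size`, which means a rolling aggregate produces no value until a full window is available. That behavior can be sketched with std's `slice::windows`; `rolling_sum` here is an illustrative stand-in, not polars' signature:

```rust
// Rolling sum with min_periods == window: the first window-1 positions
// have no full window behind them, so they yield None.
fn rolling_sum(xs: &[f64], window: usize) -> Vec<Option<f64>> {
    let mut out = vec![None; xs.len().min(window.saturating_sub(1))];
    if window == 0 || window > xs.len() {
        // Degenerate cases: no position ever sees a full window.
        out.resize(xs.len(), None);
        return out;
    }
    out.extend(xs.windows(window).map(|w| Some(w.iter().sum::<f64>())));
    out
}
```

Swapping `sum` for `fold` with `f64::max`/`f64::min`, or dividing by `window`, gives the max, min, and mean variants.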


@@ -78,7 +78,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let casted = match indices.dtype() {
 DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => indices
 .as_ref()
-.cast_with_dtype(&DataType::UInt32)
+.cast(&DataType::UInt32)
 .map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
 _ => Err(ShellError::labeled_error_with_secondary(
 "Incorrect type",


@@ -58,7 +58,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let series = df.as_series(&df_tag.span)?;
 let casted = series
-.date64()
+.datetime()
 .map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
 let res = casted.strftime(&fmt.item).into_series();


@@ -92,7 +92,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let casted = match series.dtype() {
 DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => series
 .as_ref()
-.cast_with_dtype(&DataType::UInt32)
+.cast(&DataType::UInt32)
 .map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
 _ => Err(ShellError::labeled_error_with_secondary(
 "Incorrect type",


@@ -73,9 +73,9 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let writer = CsvWriter::new(&mut file);
 let writer = if no_header {
-writer.has_headers(false)
+writer.has_header(false)
 } else {
-writer.has_headers(true)
+writer.has_header(true)
 };
 let writer = match delimiter {


@@ -53,13 +53,12 @@ pub(crate) fn parse_polars_error<T: AsRef<str>>(
 PolarsError::DataTypeMisMatch(_) => "Data Type Mismatch",
 PolarsError::NotFound(_) => "Not Found",
 PolarsError::ShapeMisMatch(_) => "Shape Mismatch",
-PolarsError::Other(_) => "Other",
+PolarsError::ComputeError(_) => "Computer error",
 PolarsError::OutOfBounds(_) => "Out Of Bounds",
 PolarsError::NoSlice => "No Slice",
 PolarsError::NoData(_) => "No Data",
 PolarsError::ValueError(_) => "Value Error",
 PolarsError::MemoryNotAligned => "Memory Not Aligned",
-PolarsError::ParquetError(_) => "Parquet Error",
 PolarsError::RandError(_) => "Rand Error",
 PolarsError::HasNullValues(_) => "Has Null Values",
 PolarsError::UnknownSchema(_) => "Unknown Schema",


@@ -11,12 +11,6 @@ use nu_protocol::{
 pub struct WithEnv;
-#[derive(Deserialize, Debug)]
-struct WithEnvArgs {
-variable: Value,
-block: CapturedBlock,
-}
 impl WholeStreamCommand for WithEnv {
 fn name(&self) -> &str {
 "with-env"


@@ -36,6 +36,7 @@ mod skip;
 pub(crate) mod sort_by;
 mod uniq;
 mod update;
+mod update_cells;
 mod where_;
 mod wrap;
 mod zip_;
@@ -78,6 +79,7 @@ pub use skip::{Skip, SkipUntil, SkipWhile};
 pub use sort_by::SortBy;
 pub use uniq::Uniq;
 pub use update::Command as Update;
+pub use update_cells::SubCommand as UpdateCells;
 pub use where_::Command as Where;
 pub use wrap::Wrap;
 pub use zip_::Command as Zip;


@@ -15,11 +15,18 @@ impl WholeStreamCommand for Command {
 }
 fn signature(&self) -> Signature {
-Signature::build("select").rest(
-"rest",
-SyntaxShape::ColumnPath,
-"the columns to select from the table",
-)
+Signature::build("select")
+.named(
+"columns",
+SyntaxShape::Table,
+"Optionally operate by column path",
+Some('c'),
+)
+.rest(
+"rest",
+SyntaxShape::ColumnPath,
+"the columns to select from the table",
+)
 }
 fn usage(&self) -> &str {
@@ -27,10 +34,10 @@ impl WholeStreamCommand for Command {
 }
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
-let columns: Vec<ColumnPath> = args.rest(0)?;
+let mut columns = args.rest(0)?;
+columns.extend(column_paths_from_args(&args)?);
 let input = args.input;
 let name = args.call_info.name_tag;
 select(name, columns, input)
 }
@@ -46,10 +53,51 @@ impl WholeStreamCommand for Command {
 example: "ls | select name size",
 result: None,
 },
+Example {
+description: "Select columns dynamically",
+example: "[[a b]; [1 2]] | select -c [a]",
+result: Some(vec![UntaggedValue::row(indexmap! {
+"a".to_string() => UntaggedValue::int(1).into(),
+})
+.into()]),
+},
 ]
 }
 }
+fn column_paths_from_args(args: &CommandArgs) -> Result<Vec<ColumnPath>, ShellError> {
+let column_paths: Option<Vec<Value>> = args.get_flag("columns")?;
+let has_columns = column_paths.is_some();
+let column_paths = match column_paths {
+Some(cols) => {
+let mut c = Vec::new();
+for col in cols {
+let colpath = ColumnPath::build(&col.convert_to_string().spanned_unknown());
+if !colpath.is_empty() {
+c.push(colpath)
+}
+}
+c
+}
+None => Vec::new(),
+};
+if has_columns && column_paths.is_empty() {
+let colval: Option<Value> = args.get_flag("columns")?;
+let colspan = match colval {
+Some(v) => v.tag.span,
+None => Span::unknown(),
+};
+return Err(ShellError::labeled_error(
+"Requires a list of columns",
+"must be a list of columns",
+colspan,
+));
+}
+Ok(column_paths)
+}
 fn select(
 name: Tag,
 columns: Vec<ColumnPath>,
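The change above merges two sources of column names (positional arguments and the new `-c` flag) into one list before selecting. The selection step itself can be sketched over rows modeled as plain maps — `select_columns` is an illustrative stand-in, not nushell's `select` implementation, and it silently drops columns a row does not have:

```rust
use std::collections::BTreeMap;

// Keep only the requested columns from each row, in the requested order
// of lookup; missing columns are simply absent from the output row.
fn select_columns(
    rows: &[BTreeMap<String, i64>],
    columns: &[&str],
) -> Vec<BTreeMap<String, i64>> {
    rows.iter()
        .map(|row| {
            columns
                .iter()
                .filter_map(|c| row.get(*c).map(|v| ((*c).to_string(), *v)))
                .collect()
        })
        .collect()
}
```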


@@ -0,0 +1,211 @@
use crate::prelude::*;
use nu_engine::run_block;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{
hir::{CapturedBlock, ExternalRedirection},
Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value,
};
use std::collections::HashSet;
use std::iter::FromIterator;
pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"update cells"
}
fn signature(&self) -> Signature {
Signature::build("update cells")
.required(
"block",
SyntaxShape::Block,
"the block to run an update for each cell",
)
.named(
"columns",
SyntaxShape::Table,
"list of columns to update",
Some('c'),
)
}
fn usage(&self) -> &str {
"Update the table cells."
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
update_cells(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Update the zero value cells to empty strings.",
example: r#"[
[2021-04-16, 2021-06-10, 2021-09-18, 2021-10-15, 2021-11-16, 2021-11-17, 2021-11-18];
[ 37, 0, 0, 0, 37, 0, 0]
] | update cells {|value|
if ($value | into int) == 0 {
""
} {
$value
}
}"#,
result: Some(vec![UntaggedValue::row(indexmap! {
"2021-04-16".to_string() => UntaggedValue::int(37).into(),
"2021-06-10".to_string() => Value::from(""),
"2021-09-18".to_string() => Value::from(""),
"2021-10-15".to_string() => Value::from(""),
"2021-11-16".to_string() => UntaggedValue::int(37).into(),
"2021-11-17".to_string() => Value::from(""),
"2021-11-18".to_string() => Value::from(""),
})
.into()]),
},
Example {
description: "Update the zero value cells to empty strings in 2 last columns.",
example: r#"[
[2021-04-16, 2021-06-10, 2021-09-18, 2021-10-15, 2021-11-16, 2021-11-17, 2021-11-18];
[ 37, 0, 0, 0, 37, 0, 0]
] | update cells -c ["2021-11-18", "2021-11-17"] {|value|
if ($value | into int) == 0 {
""
} {
$value
}
}"#,
result: Some(vec![UntaggedValue::row(indexmap! {
"2021-04-16".to_string() => UntaggedValue::int(37).into(),
"2021-06-10".to_string() => UntaggedValue::int(0).into(),
"2021-09-18".to_string() => UntaggedValue::int(0).into(),
"2021-10-15".to_string() => UntaggedValue::int(0).into(),
"2021-11-16".to_string() => UntaggedValue::int(37).into(),
"2021-11-17".to_string() => Value::from(""),
"2021-11-18".to_string() => Value::from(""),
})
.into()]),
},
]
}
}
fn update_cells(args: CommandArgs) -> Result<OutputStream, ShellError> {
let context = Arc::new(args.context.clone());
let external_redirection = args.call_info.args.external_redirection;
let block: CapturedBlock = args.req(0)?;
let block = Arc::new(block);
let columns = args
.get_flag("columns")?
.map(|x: Value| HashSet::from_iter(x.table_entries().map(|val| val.convert_to_string())));
let columns = Arc::new(columns);
Ok(args
.input
.flat_map(move |input| {
let block = block.clone();
let context = context.clone();
if input.is_row() {
OutputStream::one(process_cells(
block,
columns.clone(),
context,
input,
external_redirection,
))
} else {
match process_input(block, context, input, external_redirection) {
Ok(s) => s,
Err(e) => OutputStream::one(Value::error(e)),
}
}
})
.into_output_stream())
}
pub fn process_input(
captured_block: Arc<CapturedBlock>,
context: Arc<EvaluationContext>,
input: Value,
external_redirection: ExternalRedirection,
) -> Result<OutputStream, ShellError> {
let input_clone = input.clone();
// When we process a row, we need to know whether the block wants to have the contents of the row as
// a parameter to the block (so it gets assigned to a variable that can be used inside the block) or
// if it wants the contents as an input stream
let input_stream = if !captured_block.block.params.positional.is_empty() {
InputStream::empty()
} else {
vec![Ok(input_clone)].into_iter().into_input_stream()
};
context.scope.enter_scope();
context.scope.add_vars(&captured_block.captured.entries);
if let Some((arg, _)) = captured_block.block.params.positional.first() {
context.scope.add_var(arg.name(), input);
} else {
context.scope.add_var("$it", input);
}
let result = run_block(
&captured_block.block,
&context,
input_stream,
external_redirection,
);
context.scope.exit_scope();
result
}
pub fn process_cells(
captured_block: Arc<CapturedBlock>,
columns: Arc<Option<HashSet<String>>>,
context: Arc<EvaluationContext>,
input: Value,
external_redirection: ExternalRedirection,
) -> Value {
TaggedDictBuilder::build(input.tag(), |row| {
input.row_entries().for_each(|(column, cell_value)| {
match &*columns {
Some(col) if !col.contains(column) => {
row.insert_value(column, cell_value.clone());
return;
}
_ => {}
};
let cell_processed = process_input(
captured_block.clone(),
context.clone(),
cell_value.clone(),
external_redirection,
)
.map(|it| it.into_vec())
.map_err(Value::error);
match cell_processed {
Ok(value) => {
match value.get(0) {
Some(one) => {
row.insert_value(column, one.clone());
}
None => {
row.insert_untagged(column, UntaggedValue::nothing());
}
};
}
Err(reason) => {
row.insert_value(column, reason);
}
}
});
})
}
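`process_cells` above applies the block to every cell, unless a column set was given, in which case non-listed columns are copied through untouched. The same control flow in plain std types — `update_cells` here is a simplified mirror for illustration, not the nushell function:

```rust
use std::collections::{BTreeMap, HashSet};

// Apply `f` to each cell of a row; when `columns` is Some, only cells
// whose column name is in the set are transformed, the rest pass through.
fn update_cells(
    row: &BTreeMap<String, i64>,
    columns: &Option<HashSet<String>>,
    f: impl Fn(i64) -> i64,
) -> BTreeMap<String, i64> {
    row.iter()
        .map(|(col, val)| {
            // Same guard as process_cells: a set that excludes this column
            // means "copy the value unchanged".
            let skip = matches!(columns, Some(set) if !set.contains(col));
            let new = if skip { *val } else { f(*val) };
            (col.clone(), new)
        })
        .collect()
}
```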


@@ -136,7 +136,8 @@ fn to_string_tagged_value(v: &Value) -> Result<String, ShellError> {
 | Primitive::Boolean(_)
 | Primitive::Decimal(_)
 | Primitive::FilePath(_)
-| Primitive::Int(_),
+| Primitive::Int(_)
+| Primitive::BigInt(_),
 ) => as_string(v),
 UntaggedValue::Primitive(Primitive::Date(d)) => Ok(d.to_string()),
 UntaggedValue::Primitive(Primitive::Nothing) => Ok(String::new()),


@@ -163,7 +163,7 @@ fn get_current_date() -> (i32, u32, u32) {
 fn add_months_of_year_to_table(
 args: &CommandArgs,
-mut calendar_vec_deque: &mut VecDeque<Value>,
+calendar_vec_deque: &mut VecDeque<Value>,
 tag: &Tag,
 selected_year: i32,
 (start_month, end_month): (u32, u32),
@@ -181,7 +181,7 @@ fn add_months_of_year_to_table(
 let add_month_to_table_result = add_month_to_table(
 args,
-&mut calendar_vec_deque,
+calendar_vec_deque,
 tag,
 selected_year,
 month_number,


@@ -112,7 +112,11 @@ mod tests {
 fn only_examples() -> Vec<Command> {
 let mut commands = full_tests();
-commands.extend([whole_stream_command(Zip), whole_stream_command(Flatten)]);
+commands.extend([
+whole_stream_command(UpdateCells),
+whole_stream_command(Zip),
+whole_stream_command(Flatten),
+]);
 commands
 }


@@ -1,4 +1,4 @@
-use super::{operate, PathSubcommandArguments};
+use super::{column_paths_from_args, operate, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -9,13 +9,13 @@ use std::path::Path;
 pub struct PathBasename;
 struct PathBasenameArguments {
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 replace: Option<Tagged<String>>,
 }
 impl PathSubcommandArguments for PathBasenameArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -26,10 +26,11 @@ impl WholeStreamCommand for PathBasename {
 fn signature(&self) -> Signature {
 Signature::build("path basename")
-.rest(
-"rest",
-SyntaxShape::ColumnPath,
+.named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 .named(
 "replace",
@@ -46,7 +47,7 @@ impl WholeStreamCommand for PathBasename {
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathBasenameArguments {
-rest: args.rest(0)?,
+columns: column_paths_from_args(&args)?,
 replace: args.get_flag("replace")?,
 });
@@ -58,12 +59,17 @@ impl WholeStreamCommand for PathBasename {
 vec![
 Example {
 description: "Get basename of a path",
-example: "echo 'C:\\Users\\joe\\test.txt' | path basename",
+example: "'C:\\Users\\joe\\test.txt' | path basename",
 result: Some(vec![Value::from("test.txt")]),
 },
+Example {
+description: "Get basename of a path in a column",
+example: "ls .. | path basename -c [ name ]",
+result: None,
+},
 Example {
 description: "Replace basename of a path",
-example: "echo 'C:\\Users\\joe\\test.txt' | path basename -r 'spam.png'",
+example: "'C:\\Users\\joe\\test.txt' | path basename -r 'spam.png'",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 "C:\\Users\\joe\\spam.png",
 ))]),
@@ -76,12 +82,17 @@ impl WholeStreamCommand for PathBasename {
 vec![
 Example {
 description: "Get basename of a path",
-example: "echo '/home/joe/test.txt' | path basename",
+example: "'/home/joe/test.txt' | path basename",
 result: Some(vec![Value::from("test.txt")]),
 },
+Example {
+description: "Get basename of a path in a column",
+example: "ls .. | path basename -c [ name ]",
+result: None,
+},
 Example {
 description: "Replace basename of a path",
-example: "echo '/home/joe/test.txt' | path basename -r 'spam.png'",
+example: "'/home/joe/test.txt' | path basename -r 'spam.png'",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 "/home/joe/spam.png",
 ))]),


@@ -1,4 +1,4 @@
-use super::{operate, PathSubcommandArguments};
+use super::{column_paths_from_args, operate, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -9,14 +9,14 @@ use std::path::Path;
 pub struct PathDirname;
 struct PathDirnameArguments {
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 replace: Option<Tagged<String>>,
 num_levels: Option<Tagged<u32>>,
 }
 impl PathSubcommandArguments for PathDirnameArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -27,10 +27,11 @@ impl WholeStreamCommand for PathDirname {
 fn signature(&self) -> Signature {
 Signature::build("path dirname")
-.rest(
-"rest",
-SyntaxShape::ColumnPath,
+.named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 .named(
 "replace",
@@ -53,7 +54,7 @@ impl WholeStreamCommand for PathDirname {
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathDirnameArguments {
-rest: args.rest(0)?,
+columns: column_paths_from_args(&args)?,
 replace: args.get_flag("replace")?,
 num_levels: args.get_flag("num-levels")?,
 });
@@ -66,20 +67,25 @@ impl WholeStreamCommand for PathDirname {
 vec![
 Example {
 description: "Get dirname of a path",
-example: "echo 'C:\\Users\\joe\\code\\test.txt' | path dirname",
+example: "'C:\\Users\\joe\\code\\test.txt' | path dirname",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 "C:\\Users\\joe\\code",
 ))]),
 },
+Example {
+description: "Get dirname of a path in a column",
+example: "ls ('.' | path expand) | path dirname -c [ name ]",
+result: None,
+},
 Example {
 description: "Walk up two levels",
-example: "echo 'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2",
+example: "'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2",
 result: Some(vec![Value::from(UntaggedValue::filepath("C:\\Users\\joe"))]),
 },
 Example {
 description: "Replace the part that would be returned with a custom path",
 example:
-"echo 'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2 -r C:\\Users\\viking",
+"'C:\\Users\\joe\\code\\test.txt' | path dirname -n 2 -r C:\\Users\\viking",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 "C:\\Users\\viking\\code\\test.txt",
 ))]),
@@ -92,17 +98,22 @@ impl WholeStreamCommand for PathDirname {
 vec![
 Example {
 description: "Get dirname of a path",
-example: "echo '/home/joe/code/test.txt' | path dirname",
+example: "'/home/joe/code/test.txt' | path dirname",
 result: Some(vec![Value::from(UntaggedValue::filepath("/home/joe/code"))]),
 },
+Example {
+description: "Get dirname of a path in a column",
+example: "ls ('.' | path expand) | path dirname -c [ name ]",
+result: None,
+},
 Example {
 description: "Walk up two levels",
-example: "echo '/home/joe/code/test.txt' | path dirname -n 2",
+example: "'/home/joe/code/test.txt' | path dirname -n 2",
 result: Some(vec![Value::from(UntaggedValue::filepath("/home/joe"))]),
 },
 Example {
 description: "Replace the part that would be returned with a custom path",
-example: "echo '/home/joe/code/test.txt' | path dirname -n 2 -r /home/viking",
+example: "'/home/joe/code/test.txt' | path dirname -n 2 -r /home/viking",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 "/home/viking/code/test.txt",
 ))]),


@@ -1,4 +1,4 @@
-use super::{operate, PathSubcommandArguments};
+use super::{column_paths_from_args, operate, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -8,12 +8,12 @@ use std::path::Path;
 pub struct PathExists;
 struct PathExistsArguments {
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 }
 impl PathSubcommandArguments for PathExistsArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -23,10 +23,11 @@ impl WholeStreamCommand for PathExists {
 }
 fn signature(&self) -> Signature {
-Signature::build("path exists").rest(
-"rest",
-SyntaxShape::ColumnPath,
+Signature::build("path exists").named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 }
@@ -37,7 +38,7 @@ impl WholeStreamCommand for PathExists {
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathExistsArguments {
-rest: args.rest(0)?,
+columns: column_paths_from_args(&args)?,
 });
 Ok(operate(args.input, &action, tag.span, cmd_args))
@@ -45,20 +46,34 @@ impl WholeStreamCommand for PathExists {
 #[cfg(windows)]
 fn examples(&self) -> Vec<Example> {
-vec![Example {
-description: "Check if a file exists",
-example: "echo 'C:\\Users\\joe\\todo.txt' | path exists",
-result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
-}]
+vec![
+Example {
+description: "Check if a file exists",
+example: "'C:\\Users\\joe\\todo.txt' | path exists",
+result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
+},
+Example {
+description: "Check if a file exists in a column",
+example: "ls | path exists -c [ name ]",
+result: None,
+},
+]
 }
 #[cfg(not(windows))]
 fn examples(&self) -> Vec<Example> {
-vec![Example {
-description: "Check if a file exists",
-example: "echo '/home/joe/todo.txt' | path exists",
-result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
-}]
+vec![
+Example {
+description: "Check if a file exists",
+example: "'/home/joe/todo.txt' | path exists",
+result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
+},
+Example {
+description: "Check if a file exists in a column",
+example: "ls | path exists -c [ name ]",
+result: None,
+},
+]
 }
 }

View File

@@ -1,4 +1,4 @@
-use super::{operate, PathSubcommandArguments};
+use super::{column_paths_from_args, operate, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -11,12 +11,12 @@ pub struct PathExpand;
 struct PathExpandArguments {
 strict: bool,
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 }
 impl PathSubcommandArguments for PathExpandArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -32,10 +32,11 @@ impl WholeStreamCommand for PathExpand {
 "Throw an error if the path could not be expanded",
 Some('s'),
 )
-.rest(
-"rest",
-SyntaxShape::ColumnPath,
+.named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 }
@@ -47,7 +48,7 @@ impl WholeStreamCommand for PathExpand {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathExpandArguments {
 strict: args.has_flag("strict"),
-rest: args.rest(0)?,
+columns: column_paths_from_args(&args)?,
 });
 Ok(operate(args.input, &action, tag.span, cmd_args))
@@ -63,6 +64,11 @@ impl WholeStreamCommand for PathExpand {
 UntaggedValue::filepath(r"C:\Users\joe\bar").into_value(Span::new(0, 25))
 ]),
 },
+Example {
+description: "Expand a path in a column",
+example: "ls | path expand -c [ name ]",
+result: None,
+},
 Example {
 description: "Expand a relative path",
 example: r"'foo\..\bar' | path expand",
@@ -83,6 +89,11 @@ impl WholeStreamCommand for PathExpand {
 UntaggedValue::filepath("/home/joe/bar").into_value(Span::new(0, 22))
 ]),
 },
+Example {
+description: "Expand a path in a column",
+example: "ls | path expand -c [ name ]",
+result: None,
+},
 Example {
 description: "Expand a relative path",
 example: "'foo/../bar' | path expand",

View File

@@ -1,4 +1,6 @@
-use super::{handle_value, join_path, operate_column_paths, PathSubcommandArguments};
+use super::{
+    column_paths_from_args, handle_value, join_path, operate_column_paths, PathSubcommandArguments,
+};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -9,13 +11,13 @@ use std::path::{Path, PathBuf};
 pub struct PathJoin;
 struct PathJoinArguments {
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 append: Option<Tagged<PathBuf>>,
 }
 impl PathSubcommandArguments for PathJoinArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -26,16 +28,16 @@ impl WholeStreamCommand for PathJoin {
 fn signature(&self) -> Signature {
 Signature::build("path join")
-.rest(
-"rest",
-SyntaxShape::ColumnPath,
-"Optionally operate by column path",
-)
-.named(
+.named(
+"columns",
+SyntaxShape::Table,
+"Optionally operate by column path",
+Some('c'),
+)
+.optional(
 "append",
 SyntaxShape::FilePath,
 "Path to append to the input",
-Some('a'),
 )
 }
@@ -50,9 +52,10 @@ the output of 'path parse' and 'path split' subcommands."#
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathJoinArguments {
-rest: args.rest(0)?,
-append: args.get_flag("append")?,
+columns: column_paths_from_args(&args)?,
+append: args.opt(0)?,
 });
 Ok(operate_join(args.input, &action, tag, cmd_args))
@@ -63,21 +66,26 @@ the output of 'path parse' and 'path split' subcommands."#
 vec![
 Example {
 description: "Append a filename to a path",
-example: r"echo 'C:\Users\viking' | path join -a spam.txt",
+example: r"'C:\Users\viking' | path join spam.txt",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 r"C:\Users\viking\spam.txt",
 ))]),
 },
+Example {
+description: "Append a filename to a path inside a column",
+example: r"ls | path join spam.txt -c [ name ]",
+result: None,
+},
 Example {
 description: "Join a list of parts into a path",
-example: r"echo [ 'C:' '\' 'Users' 'viking' 'spam.txt' ] | path join",
+example: r"[ 'C:' '\' 'Users' 'viking' 'spam.txt' ] | path join",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 r"C:\Users\viking\spam.txt",
 ))]),
 },
 Example {
 description: "Join a structured path into a path",
-example: r"echo [ [parent stem extension]; ['C:\Users\viking' 'spam' 'txt']] | path join",
+example: r"[ [parent stem extension]; ['C:\Users\viking' 'spam' 'txt']] | path join",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 r"C:\Users\viking\spam.txt",
 ))]),
@@ -90,21 +98,26 @@ the output of 'path parse' and 'path split' subcommands."#
 vec![
 Example {
 description: "Append a filename to a path",
-example: r"echo '/home/viking' | path join -a spam.txt",
+example: r"'/home/viking' | path join spam.txt",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 r"/home/viking/spam.txt",
 ))]),
 },
+Example {
+description: "Append a filename to a path inside a column",
+example: r"ls | path join spam.txt -c [ name ]",
+result: None,
+},
 Example {
 description: "Join a list of parts into a path",
-example: r"echo [ '/' 'home' 'viking' 'spam.txt' ] | path join",
+example: r"[ '/' 'home' 'viking' 'spam.txt' ] | path join",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 r"/home/viking/spam.txt",
 ))]),
 },
 Example {
 description: "Join a structured path into a path",
-example: r"echo [[ parent stem extension ]; [ '/home/viking' 'spam' 'txt' ]] | path join",
+example: r"[[ parent stem extension ]; [ '/home/viking' 'spam' 'txt' ]] | path join",
 result: Some(vec![Value::from(UntaggedValue::filepath(
 r"/home/viking/spam.txt",
 ))]),

View File

@@ -61,7 +61,7 @@ fn encode_path(
 ALLOWED_COLUMNS.join(", ")
 );
 return Err(ShellError::labeled_error_with_secondary(
-"Invalid column name",
+"Expected structured path table",
 msg,
 new_span,
 "originates from here",
@@ -216,3 +216,36 @@ where
 operate_column_paths(input, action, span, args)
 }
 }
+fn column_paths_from_args(args: &CommandArgs) -> Result<Vec<ColumnPath>, ShellError> {
+    let column_paths: Option<Vec<Value>> = args.get_flag("columns")?;
+    let has_columns = column_paths.is_some();
+    let column_paths = match column_paths {
+        Some(cols) => {
+            let mut c = Vec::new();
+            for col in cols {
+                let colpath = ColumnPath::build(&col.convert_to_string().spanned_unknown());
+                if !colpath.is_empty() {
+                    c.push(colpath)
+                }
+            }
+            c
+        }
+        None => Vec::new(),
+    };
+    if has_columns && column_paths.is_empty() {
+        let colval: Option<Value> = args.get_flag("columns")?;
+        let colspan = match colval {
+            Some(v) => v.tag.span,
+            None => Span::unknown(),
+        };
+        return Err(ShellError::labeled_error(
+            "Requires a list of columns",
+            "must be a list of columns",
+            colspan,
+        ));
+    }
+    Ok(column_paths)
+}
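The `--columns` handling above distinguishes a flag that was never passed (no column paths, operate on the whole input) from a flag passed with an empty or all-empty list (an error). A minimal standalone sketch of that rule, using plain strings instead of the nu-protocol types (the helper name `columns_from_flag` is hypothetical, for illustration only):

```rust
// Illustrative sketch, not the nu-engine API: mirrors the validation in
// column_paths_from_args. An absent flag yields no column paths; a flag
// that is present but produces no usable paths is an error.
fn columns_from_flag(flag: Option<Vec<&str>>) -> Result<Vec<String>, String> {
    match flag {
        None => Ok(Vec::new()),
        Some(cols) => {
            // Keep only non-empty column names, as the real code keeps
            // only non-empty ColumnPaths.
            let paths: Vec<String> = cols
                .into_iter()
                .filter(|c| !c.is_empty())
                .map(|c| c.to_string())
                .collect();
            if paths.is_empty() {
                Err("Requires a list of columns".to_string())
            } else {
                Ok(paths)
            }
        }
    }
}
```

This is why `path exists -c [ ]` would be rejected while a bare `path exists` still works on the piped value directly.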

View File

@@ -1,4 +1,4 @@
-use super::{operate, PathSubcommandArguments};
+use super::{column_paths_from_args, operate, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -11,13 +11,13 @@ use std::path::Path;
 pub struct PathParse;
 struct PathParseArguments {
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 extension: Option<Tagged<String>>,
 }
 impl PathSubcommandArguments for PathParseArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -28,10 +28,11 @@ impl WholeStreamCommand for PathParse {
 fn signature(&self) -> Signature {
 Signature::build("path parse")
-.rest(
-"rest",
-SyntaxShape::ColumnPath,
+.named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 .named(
 "extension",
@@ -53,7 +54,7 @@ On Windows, an extra 'prefix' column is added."#
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathParseArguments {
-rest: args.rest(0)?,
+columns: column_paths_from_args(&args)?,
 extension: args.get_flag("extension")?,
 });
@@ -65,22 +66,22 @@ On Windows, an extra 'prefix' column is added."#
 vec![
 Example {
 description: "Parse a single path",
-example: r"echo 'C:\Users\viking\spam.txt' | path parse",
+example: r"'C:\Users\viking\spam.txt' | path parse",
 result: None,
 },
 Example {
 description: "Replace a complex extension",
-example: r"echo 'C:\Users\viking\spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
+example: r"'C:\Users\viking\spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
 result: None,
 },
 Example {
 description: "Ignore the extension",
-example: r"echo 'C:\Users\viking.d' | path parse -e ''",
+example: r"'C:\Users\viking.d' | path parse -e ''",
 result: None,
 },
 Example {
 description: "Parse all paths under the 'name' column",
-example: r"ls | path parse name",
+example: r"ls | path parse -c [ name ]",
 result: None,
 },
 ]
@@ -91,22 +92,22 @@ On Windows, an extra 'prefix' column is added."#
 vec![
 Example {
 description: "Parse a path",
-example: r"echo '/home/viking/spam.txt' | path parse",
+example: r"'/home/viking/spam.txt' | path parse",
 result: None,
 },
 Example {
 description: "Replace a complex extension",
-example: r"echo '/home/viking/spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
+example: r"'/home/viking/spam.tar.gz' | path parse -e tar.gz | update extension { 'txt' }",
 result: None,
 },
 Example {
 description: "Ignore the extension",
-example: r"echo '/etc/conf.d' | path parse -e ''",
+example: r"'/etc/conf.d' | path parse -e ''",
 result: None,
 },
 Example {
 description: "Parse all paths under the 'name' column",
-example: r"ls | path parse name",
+example: r"ls | path parse -c [ name ]",
 result: None,
 },
 ]

View File

@@ -1,4 +1,4 @@
-use super::{operate, PathSubcommandArguments};
+use super::{column_paths_from_args, operate, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -10,12 +10,12 @@ pub struct PathRelativeTo;
 struct PathRelativeToArguments {
 path: Tagged<PathBuf>,
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 }
 impl PathSubcommandArguments for PathRelativeToArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -31,10 +31,11 @@ impl WholeStreamCommand for PathRelativeTo {
 SyntaxShape::FilePath,
 "Parent shared with the input path",
 )
-.rest(
-"rest",
-SyntaxShape::ColumnPath,
+.named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 }
@@ -52,7 +53,7 @@ path."#
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathRelativeToArguments {
 path: args.req(0)?,
-rest: args.rest(1)?,
+columns: column_paths_from_args(&args)?,
 });
 Ok(operate(args.input, &action, tag.span, cmd_args))
@@ -66,6 +67,11 @@ path."#
 example: r"'C:\Users\viking' | path relative-to 'C:\Users'",
 result: Some(vec![Value::from(UntaggedValue::filepath(r"viking"))]),
 },
+Example {
+description: "Find a relative path from two absolute paths in a column",
+example: "ls ~ | path relative-to ~ -c [ name ]",
+result: None,
+},
 Example {
 description: "Find a relative path from two relative paths",
 example: r"'eggs\bacon\sausage\spam' | path relative-to 'eggs\bacon\sausage'",
@@ -82,6 +88,11 @@ path."#
 example: r"'/home/viking' | path relative-to '/home'",
 result: Some(vec![Value::from(UntaggedValue::filepath(r"viking"))]),
 },
+Example {
+description: "Find a relative path from two absolute paths in a column",
+example: "ls ~ | path relative-to ~ -c [ name ]",
+result: None,
+},
 Example {
 description: "Find a relative path from two relative paths",
 example: r"'eggs/bacon/sausage/spam' | path relative-to 'eggs/bacon/sausage'",

View File

@@ -1,4 +1,4 @@
-use super::{handle_value, operate_column_paths, PathSubcommandArguments};
+use super::{column_paths_from_args, handle_value, operate_column_paths, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::WholeStreamCommand;
 use nu_errors::ShellError;
@@ -8,12 +8,12 @@ use std::path::Path;
 pub struct PathSplit;
 struct PathSplitArguments {
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 }
 impl PathSubcommandArguments for PathSplitArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -23,10 +23,11 @@ impl WholeStreamCommand for PathSplit {
 }
 fn signature(&self) -> Signature {
-Signature::build("path split").rest(
-"rest",
-SyntaxShape::ColumnPath,
+Signature::build("path split").named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 }
@@ -37,7 +38,7 @@ impl WholeStreamCommand for PathSplit {
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathSplitArguments {
-rest: args.rest(0)?,
+columns: column_paths_from_args(&args)?,
 });
 Ok(operate_split(args.input, &action, tag.span, cmd_args))
@@ -48,7 +49,7 @@ impl WholeStreamCommand for PathSplit {
 vec![
 Example {
 description: "Split a path into parts",
-example: r"echo 'C:\Users\viking\spam.txt' | path split",
+example: r"'C:\Users\viking\spam.txt' | path split",
 result: Some(vec![
 Value::from(UntaggedValue::string("C:")),
 Value::from(UntaggedValue::string(r"\")),
@@ -59,7 +60,7 @@ impl WholeStreamCommand for PathSplit {
 },
 Example {
 description: "Split all paths under the 'name' column",
-example: r"ls | path split name",
+example: r"ls ('.' | path expand) | path split -c [ name ]",
 result: None,
 },
 ]
@@ -70,7 +71,7 @@ impl WholeStreamCommand for PathSplit {
 vec![
 Example {
 description: "Split a path into parts",
-example: r"echo '/home/viking/spam.txt' | path split",
+example: r"'/home/viking/spam.txt' | path split",
 result: Some(vec![
 Value::from(UntaggedValue::string("/")),
 Value::from(UntaggedValue::string("home")),
@@ -80,7 +81,7 @@ impl WholeStreamCommand for PathSplit {
 },
 Example {
 description: "Split all paths under the 'name' column",
-example: r"ls | path split name",
+example: r"ls ('.' | path expand) | path split -c [ name ]",
 result: None,
 },
 ]

View File

@@ -1,4 +1,4 @@
-use super::{operate, PathSubcommandArguments};
+use super::{column_paths_from_args, operate, PathSubcommandArguments};
 use crate::prelude::*;
 use nu_engine::filesystem::filesystem_shell::get_file_type;
 use nu_engine::WholeStreamCommand;
@@ -9,12 +9,12 @@ use std::path::Path;
 pub struct PathType;
 struct PathTypeArguments {
-rest: Vec<ColumnPath>,
+columns: Vec<ColumnPath>,
 }
 impl PathSubcommandArguments for PathTypeArguments {
 fn get_column_paths(&self) -> &Vec<ColumnPath> {
-&self.rest
+&self.columns
 }
 }
@@ -24,10 +24,11 @@ impl WholeStreamCommand for PathType {
 }
 fn signature(&self) -> Signature {
-Signature::build("path type").rest(
-"rest",
-SyntaxShape::ColumnPath,
+Signature::build("path type").named(
+"columns",
+SyntaxShape::Table,
 "Optionally operate by column path",
+Some('c'),
 )
 }
@@ -38,18 +39,25 @@ impl WholeStreamCommand for PathType {
 fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
 let tag = args.call_info.name_tag.clone();
 let cmd_args = Arc::new(PathTypeArguments {
-rest: args.rest(0)?,
+columns: column_paths_from_args(&args)?,
 });
 Ok(operate(args.input, &action, tag.span, cmd_args))
 }
 fn examples(&self) -> Vec<Example> {
-vec![Example {
-description: "Show type of a filepath",
-example: "echo '.' | path type",
-result: Some(vec![Value::from("Dir")]),
-}]
+vec![
+Example {
+description: "Show type of a filepath",
+example: "'.' | path type",
+result: Some(vec![Value::from("Dir")]),
+},
+Example {
+description: "Show type of a filepath in a column",
+example: "ls | path type -c [ name ]",
+result: None,
+},
+]
 }
 }

View File

@@ -170,7 +170,7 @@ fn action(
 Ok(UntaggedValue::string(gradient_string).into_value(tag))
 }
 (None, Some(fg_end), None, Some(bg_end)) => {
-// missin fg_start and bg_start, so assume black
+// missing fg_start and bg_start, so assume black
 let fg_start = Rgb::new(0, 0, 0);
 let bg_start = Rgb::new(0, 0, 0);
 let fg_gradient = Gradient::new(fg_start, fg_end);
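When only the end colors are supplied, the branch above defaults both gradient starts to black. A minimal sketch of the linear color interpolation a gradient like this performs (the `lerp` helper is hypothetical, not the `Gradient` type used by nu-command):

```rust
// Illustrative sketch: linearly interpolate between two RGB colors at
// parameter t in [0.0, 1.0], rounding each channel to the nearest u8.
fn lerp(start: (u8, u8, u8), end: (u8, u8, u8), t: f32) -> (u8, u8, u8) {
    let mix = |a: u8, b: u8| (a as f32 + (b as f32 - a as f32) * t).round() as u8;
    (mix(start.0, end.0), mix(start.1, end.1), mix(start.2, end.2))
}
```

With a black start, t = 0.0 yields black and t = 1.0 yields the requested end color, which is the fallback behavior the comment describes.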

View File

@@ -1,105 +0,0 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, Value};
use arboard::Clipboard;
pub struct Clip;
impl WholeStreamCommand for Clip {
fn name(&self) -> &str {
"clip"
}
fn signature(&self) -> Signature {
Signature::build("clip")
}
fn usage(&self) -> &str {
"Copy the contents of the pipeline to the copy/paste buffer."
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
clip(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Save text to the clipboard",
example: "echo 'secret value' | clip",
result: None,
},
Example {
description: "Save numbers to the clipboard",
example: "random integer 10000000..99999999 | clip",
result: None,
},
]
}
}
pub fn clip(args: CommandArgs) -> Result<ActionStream, ShellError> {
let input = args.input;
let name = args.call_info.name_tag;
let values: Vec<Value> = input.collect();
if let Ok(mut clip_context) = Clipboard::new() {
let mut new_copy_data = String::new();
if !values.is_empty() {
let mut first = true;
for i in &values {
if !first {
new_copy_data.push('\n');
} else {
first = false;
}
let string: String = i.convert_to_string();
if string.is_empty() {
return Err(ShellError::labeled_error(
"Unable to convert to string",
"Unable to convert to string",
name,
));
}
new_copy_data.push_str(&string);
}
}
match clip_context.set_text(new_copy_data) {
Ok(_) => {}
Err(_) => {
return Err(ShellError::labeled_error(
"Could not set contents of clipboard",
"could not set contents of clipboard",
name,
));
}
}
} else {
return Err(ShellError::labeled_error(
"Could not open clipboard",
"could not open clipboard",
name,
));
}
Ok(ActionStream::empty())
}
#[cfg(test)]
mod tests {
use super::Clip;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(Clip {})
}
}

View File

@@ -1,13 +1,9 @@
 mod ansi;
 mod benchmark;
 mod clear;
-#[cfg(feature = "clipboard-cli")]
-mod clip;
 mod du;
 mod exec;
 mod kill;
-#[cfg(feature = "clipboard-cli")]
-mod paste;
 mod pwd;
 mod run_external;
 mod sleep;
@@ -17,13 +13,9 @@ mod which_;
 pub use ansi::*;
 pub use benchmark::Benchmark;
 pub use clear::Clear;
-#[cfg(feature = "clipboard-cli")]
-pub use clip::Clip;
 pub use du::Du;
 pub use exec::Exec;
 pub use kill::Kill;
-#[cfg(feature = "clipboard-cli")]
-pub use paste::Paste;
 pub use pwd::Pwd;
 pub use run_external::RunExternalCommand;
 pub use sleep::Sleep;

View File

@@ -1,61 +0,0 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue};
use arboard::Clipboard;
pub struct Paste;
impl WholeStreamCommand for Paste {
fn name(&self) -> &str {
"paste"
}
fn signature(&self) -> Signature {
Signature::build("paste")
}
fn usage(&self) -> &str {
"Paste contents from the clipboard"
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
paste(args)
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Paste text from your clipboard",
example: "echo 'secret value' | clip | paste",
result: Some(vec![UntaggedValue::Primitive(Primitive::String(
"secret value".to_owned(),
))
.into_value(Tag::default())]),
}]
}
}
pub fn paste(args: CommandArgs) -> Result<ActionStream, ShellError> {
let name = args.call_info.name_tag;
if let Ok(mut clip_context) = Clipboard::new() {
match clip_context.get_text() {
Ok(out) => Ok(ActionStream::one(ReturnSuccess::value(
UntaggedValue::Primitive(Primitive::String(out)),
))),
Err(_) => Err(ShellError::labeled_error(
"Could not get contents of clipboard",
"could not get contents of clipboard",
name,
)),
}
} else {
Err(ShellError::labeled_error(
"Could not open clipboard",
"could not open clipboard",
name,
))
}
}

View File

@@ -6,12 +6,6 @@ use nu_protocol::{Dictionary, Signature, UntaggedValue};
 pub struct TermSize;
-#[derive(Deserialize, Clone)]
-pub struct TermSizeArgs {
-    wide: bool,
-    tall: bool,
-}
 impl WholeStreamCommand for TermSize {
 fn name(&self) -> &str {
 "term size"

View File

@@ -0,0 +1,51 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{CommandAction, ReturnSuccess, Signature, SyntaxShape};
pub struct Goto;
impl WholeStreamCommand for Goto {
fn name(&self) -> &str {
"g"
}
fn signature(&self) -> Signature {
Signature::build("g").required("index", SyntaxShape::Int, "the shell's index to go to")
}
fn usage(&self) -> &str {
"Go to specified shell."
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
goto(args)
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Enter the first shell",
example: "g 0",
result: None,
}]
}
}
fn goto(args: CommandArgs) -> Result<ActionStream, ShellError> {
Ok(ActionStream::one(ReturnSuccess::action(
CommandAction::GotoShell(args.req(0)?),
)))
}
#[cfg(test)]
mod tests {
use super::Goto;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(Goto {})
}
}

View File

@@ -1,11 +1,13 @@
 mod command;
 mod enter;
 mod exit;
+mod goto;
 mod next;
 mod prev;
 pub use command::Shells;
 pub use enter::Enter;
 pub use exit::Exit;
+pub use goto::Goto;
 pub use next::Next;
 pub use prev::Previous;

View File

@@ -0,0 +1,283 @@
use std::{iter::Peekable, str::CharIndices};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue};
use nu_source::Spanned;
type Input<'t> = Peekable<CharIndices<'t>>;
pub struct DetectColumns;
impl WholeStreamCommand for DetectColumns {
fn name(&self) -> &str {
"detect columns"
}
fn signature(&self) -> Signature {
Signature::build("detect columns")
.named(
"skip",
SyntaxShape::Int,
"number of rows to skip before detecting",
Some('s'),
)
.switch("no_headers", "don't detect headers", Some('n'))
}
fn usage(&self) -> &str {
"splits contents across multiple columns via the separator."
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
detect_columns(args)
}
}
fn detect_columns(args: CommandArgs) -> Result<OutputStream, ShellError> {
let name_tag = args.name_tag();
let num_rows_to_skip: Option<usize> = args.get_flag("skip")?;
let noheader = args.has_flag("no_headers");
let input = args.input.collect_string(name_tag.clone())?;
let input: Vec<_> = input
.lines()
.skip(num_rows_to_skip.unwrap_or_default())
.map(|x| x.to_string())
.collect();
let mut input = input.into_iter();
let headers = input.next();
if let Some(orig_headers) = headers {
let headers = find_columns(&orig_headers);
Ok((if noheader {
vec![orig_headers].into_iter().chain(input)
} else {
vec![].into_iter().chain(input)
})
.map(move |x| {
let row = find_columns(&x);
let mut dict = TaggedDictBuilder::new(name_tag.clone());
if headers.len() == row.len() && !noheader {
for (header, val) in headers.iter().zip(row.iter()) {
dict.insert_untagged(&header.item, UntaggedValue::string(&val.item));
}
} else {
let mut pre_output = vec![];
// column counts don't line up, so see if we can figure out why
for cell in row {
for header in &headers {
if cell.span.start() <= header.span.end()
&& cell.span.end() > header.span.start()
{
pre_output
.push((header.item.to_string(), UntaggedValue::string(&cell.item)));
}
}
}
for header in &headers {
let mut found = false;
for pre_o in &pre_output {
if pre_o.0 == header.item {
found = true;
break;
}
}
if !found {
pre_output.push((header.item.to_string(), UntaggedValue::nothing()));
}
}
if noheader {
for header in headers.iter().enumerate() {
for pre_o in &pre_output {
if pre_o.0 == header.1.item {
dict.insert_untagged(format!("Column{}", header.0), pre_o.1.clone())
}
}
}
} else {
for header in &headers {
for pre_o in &pre_output {
if pre_o.0 == header.item {
dict.insert_untagged(&header.item, pre_o.1.clone())
}
}
}
}
}
dict.into_value()
})
.into_output_stream())
} else {
Ok(OutputStream::empty())
}
}
pub fn find_columns(input: &str) -> Vec<Spanned<String>> {
let mut chars = input.char_indices().peekable();
let mut output = vec![];
while let Some((_, c)) = chars.peek() {
if c.is_whitespace() {
// If the next character is whitespace, skip it.
let _ = chars.next();
} else {
// Otherwise, try to consume an unclassified token.
let result = baseline(&mut chars);
output.push(result);
}
}
output
}
#[derive(Clone, Copy)]
enum BlockKind {
Paren,
CurlyBracket,
SquareBracket,
}
fn baseline(src: &mut Input) -> Spanned<String> {
let mut token_contents = String::new();
let start_offset = if let Some((pos, _)) = src.peek() {
*pos
} else {
0
};
// This variable tracks the starting character of a string literal, so that
// we remain inside the string literal lexer mode until we encounter the
// closing quote.
let mut quote_start: Option<char> = None;
// This Vec tracks paired delimiters
let mut block_level: Vec<BlockKind> = vec![];
// A baseline token is terminated if it's not nested inside of a paired
// delimiter and the next character is whitespace.
fn is_termination(block_level: &[BlockKind], c: char) -> bool {
block_level.is_empty() && (c.is_whitespace())
}
// The process of slurping up a baseline token repeats:
//
// - String literal, which begins with `'`, `"` or a backtick, and continues
//   until the same character is encountered again.
// - Delimiter pair, which begins with `[`, `(`, or `{`, and continues until
//   the matching closing delimiter is found, skipping comments and string
//   literals.
// - When not nested inside of a delimiter pair, a terminating character
//   (whitespace) ends the baseline token.
// - Otherwise, accumulate the character into the current baseline token.
while let Some((_, c)) = src.peek() {
let c = *c;
if quote_start.is_some() {
// If we encountered the closing quote character for the current
// string, we're done with the current string.
if Some(c) == quote_start {
quote_start = None;
}
} else if c == '\n' {
if is_termination(&block_level, c) {
break;
}
} else if c == '\'' || c == '"' || c == '`' {
// We encountered the opening quote of a string literal.
quote_start = Some(c);
} else if c == '[' {
// We encountered an opening `[` delimiter.
block_level.push(BlockKind::SquareBracket);
} else if c == ']' {
// We encountered a closing `]` delimiter. Pop off the opening `[`
// delimiter.
if let Some(BlockKind::SquareBracket) = block_level.last() {
let _ = block_level.pop();
}
} else if c == '{' {
// We encountered an opening `{` delimiter.
block_level.push(BlockKind::CurlyBracket);
} else if c == '}' {
// We encountered a closing `}` delimiter. Pop off the opening `{`.
if let Some(BlockKind::CurlyBracket) = block_level.last() {
let _ = block_level.pop();
}
} else if c == '(' {
// We encountered an opening `(` delimiter.
block_level.push(BlockKind::Paren);
} else if c == ')' {
// We encountered a closing `)` delimiter. Pop off the opening `(`.
if let Some(BlockKind::Paren) = block_level.last() {
let _ = block_level.pop();
}
} else if is_termination(&block_level, c) {
break;
}
// Otherwise, accumulate the character into the current token.
token_contents.push(c);
// Consume the character.
let _ = src.next();
}
let span = Span::new(start_offset, start_offset + token_contents.len());
// If there are still unclosed opening delimiters, we could close them by
// appending synthetic closing characters to the accumulated token.
if block_level.last().is_some() {
// let delim: char = (*block).closing();
// let cause = ParseError::unexpected_eof(delim.to_string(), span);
// while let Some(bk) = block_level.pop() {
// token_contents.push(bk.closing());
// }
return token_contents.spanned(span);
}
if quote_start.is_some() {
// The non-lite parse trims quotes on both sides, so we add the expected quote so that
// anyone wanting to consume this partial parse (e.g., completions) will be able to get
// correct information from the non-lite parse.
// token_contents.push(delimiter);
// return (
// token_contents.spanned(span),
// Some(ParseError::unexpected_eof(delimiter.to_string(), span)),
// );
return token_contents.spanned(span);
}
token_contents.spanned(span)
}
#[cfg(test)]
mod tests {
use super::DetectColumns;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(DetectColumns {})
}
}
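The `find_columns`/`baseline` pair above slurps whitespace-separated tokens while tracking byte offsets. A stripped-down sketch of the same idea — no quote or bracket tracking, so an illustration rather than the shipped lexer:

```rust
// Minimal sketch: split a line into (start_offset, token) pairs on
// whitespace, recording byte offsets much like find_columns records spans.
// The real lexer additionally tracks quotes and paired delimiters.
fn find_columns_sketch(input: &str) -> Vec<(usize, String)> {
    let mut chars = input.char_indices().peekable();
    let mut out = Vec::new();
    while let Some((start, c)) = chars.peek().copied() {
        if c.is_whitespace() {
            // Skip separator whitespace.
            chars.next();
            continue;
        }
        // Consume one token up to the next whitespace.
        let mut tok = String::new();
        while let Some((_, c)) = chars.peek().copied() {
            if c.is_whitespace() {
                break;
            }
            tok.push(c);
            chars.next();
        }
        out.push((start, tok));
    }
    out
}

fn main() {
    let cols = find_columns_sketch("PID  TTY  TIME");
    assert_eq!(
        cols,
        vec![(0, "PID".into()), (5, "TTY".into()), (10, "TIME".into())]
    );
    println!("{:?}", cols);
}
```

The offsets are what lets `detect columns` later decide which cell overlaps which header when column counts disagree.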

View File

@ -0,0 +1,3 @@
pub mod columns;
pub use columns::DetectColumns;

View File

@ -1,5 +1,6 @@
 mod build_string;
 mod char_;
+mod detect;
 mod format;
 mod lines;
 mod parse;
@ -10,6 +11,7 @@ mod str_;
 pub use build_string::BuildString;
 pub use char_::Char;
+pub use detect::DetectColumns;
 pub use format::*;
 pub use lines::Lines;
 pub use parse::*;

View File

@ -144,7 +144,7 @@ fn trim(s: &str, char_: Option<char>, closure_flags: &ClosureFlags) -> String {
 let re_str = format!("{}{{2,}}", reg);
 // create the regex
 let re = regex::Regex::new(&re_str).expect("Error creating regular expression");
-// replace all mutliple occurances with single occurences represented by r
+// replace all multiple occurrences with single occurrences represented by r
 let new_str = re.replace_all(&return_string, r.to_string());
 // update the return string so the next loop has the latest changes
 return_string = new_str.to_string();

View File

@ -11,13 +11,6 @@ use std::collections::HashMap;
 use std::sync::atomic::Ordering;
 use std::time::Instant;
-#[cfg(feature = "table-pager")]
-use {
-    futures::future::join,
-    minus::{ExitStrategy, Pager},
-    std::fmt::Write,
-};
 const STREAM_PAGE_SIZE: usize = 1000;
 const STREAM_TIMEOUT_CHECK_INTERVAL: usize = 100;
@ -186,28 +179,9 @@ fn table(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 let term_width = args.host().lock().width();
-#[cfg(feature = "table-pager")]
-let pager = Pager::new()
-    .set_exit_strategy(ExitStrategy::PagerQuit)
-    .set_searchable(true)
-    .set_page_if_havent_overflowed(false)
-    .set_input_handler(Box::new(input_handling::MinusInputHandler {}))
-    .finish();
 let stream_data = async {
 let finished = Arc::new(AtomicBool::new(false));
 // we are required to clone finished, for use within the callback, otherwise we get borrow errors
-#[cfg(feature = "table-pager")]
-let finished_within_callback = finished.clone();
-#[cfg(feature = "table-pager")]
-{
-    // This is called when the pager finishes, to indicate to the
-    // while loop below to finish, in case of long running InputStream consumer
-    // that doesn't finish by the time the user quits out of the pager
-    pager.lock().await.add_exit_callback(move || {
-        finished_within_callback.store(true, Ordering::Relaxed);
-    });
-}
 while !finished.clone().load(Ordering::Relaxed) {
 let mut new_input: VecDeque<Value> = VecDeque::new();
@ -263,161 +237,22 @@ fn table(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
 if !input.is_empty() {
 let t = from_list(&input, &configuration, start_number, &color_hm);
 let output = draw_table(&t, term_width, &color_hm);
-#[cfg(feature = "table-pager")]
-{
-    let mut pager = pager.lock().await;
-    writeln!(pager.lines, "{}", output).map_err(|_| {
-        ShellError::untagged_runtime_error("Error writing to pager")
-    })?;
-}
-#[cfg(not(feature = "table-pager"))]
 println!("{}", output);
 }
 start_number += input.len();
 }
-#[cfg(feature = "table-pager")]
-{
-    let mut pager_lock = pager.lock().await;
-    pager_lock.data_finished();
-}
 Result::<_, ShellError>::Ok(())
 };
-#[cfg(feature = "table-pager")]
-{
-    let (minus_result, streaming_result) =
-        block_on(join(minus::async_std_updating(pager.clone()), stream_data));
-    minus_result.map_err(|_| ShellError::untagged_runtime_error("Error paging data"))?;
-    streaming_result?;
-}
-#[cfg(not(feature = "table-pager"))]
 block_on(stream_data)
 .map_err(|_| ShellError::untagged_runtime_error("Error streaming data"))?;
 Ok(OutputStream::empty())
 }
-#[cfg(feature = "table-pager")]
-mod input_handling {
-    use crossterm::event::{Event, KeyCode, KeyEvent, KeyModifiers, MouseEvent, MouseEventKind};
-    use minus::{InputEvent, InputHandler, LineNumbers, SearchMode};
-    pub struct MinusInputHandler;
-    impl InputHandler for MinusInputHandler {
-        fn handle_input(
-            &self,
-            ev: Event,
-            upper_mark: usize,
-            search_mode: SearchMode,
-            ln: LineNumbers,
-            rows: usize,
-        ) -> Option<InputEvent> {
-            match ev {
-                // Scroll up by one.
-                Event::Key(KeyEvent {
-                    code: KeyCode::Up,
-                    modifiers: KeyModifiers::NONE,
-                }) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_sub(1))),
-                // Scroll down by one.
-                Event::Key(KeyEvent {
-                    code: KeyCode::Down,
-                    modifiers: KeyModifiers::NONE,
-                }) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_add(1))),
-                // Mouse scroll up/down
-                Event::Mouse(MouseEvent {
-                    kind: MouseEventKind::ScrollUp,
-                    ..
-                }) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_sub(5))),
-                Event::Mouse(MouseEvent {
-                    kind: MouseEventKind::ScrollDown,
-                    ..
-                }) => Some(InputEvent::UpdateUpperMark(upper_mark.saturating_add(5))),
-                // Go to top.
-                Event::Key(KeyEvent {
-                    code: KeyCode::Home,
-                    modifiers: KeyModifiers::NONE,
-                }) => Some(InputEvent::UpdateUpperMark(0)),
-                // Go to bottom.
-                Event::Key(KeyEvent {
-                    code: KeyCode::End,
-                    modifiers: KeyModifiers::NONE,
-                }) => Some(InputEvent::UpdateUpperMark(usize::MAX)),
-                // Page Up/Down
-                Event::Key(KeyEvent {
-                    code: KeyCode::PageUp,
-                    modifiers: KeyModifiers::NONE,
-                }) => Some(InputEvent::UpdateUpperMark(
-                    upper_mark.saturating_sub(rows - 1),
-                )),
-                Event::Key(KeyEvent {
-                    code: KeyCode::PageDown,
-                    modifiers: KeyModifiers::NONE,
-                }) => Some(InputEvent::UpdateUpperMark(
-                    upper_mark.saturating_add(rows - 1),
-                )),
-                // Resize event from the terminal.
-                Event::Resize(_, height) => Some(InputEvent::UpdateRows(height as usize)),
-                // Switch line number display.
-                Event::Key(KeyEvent {
-                    code: KeyCode::Char('l'),
-                    modifiers: KeyModifiers::CONTROL,
-                }) => Some(InputEvent::UpdateLineNumber(!ln)),
-                // Quit.
-                Event::Key(KeyEvent {
-                    code: KeyCode::Char('q'),
-                    modifiers: KeyModifiers::NONE,
-                })
-                | Event::Key(KeyEvent {
-                    code: KeyCode::Char('Q'),
-                    modifiers: KeyModifiers::SHIFT,
-                })
-                | Event::Key(KeyEvent {
-                    code: KeyCode::Esc,
-                    modifiers: KeyModifiers::NONE,
-                })
-                | Event::Key(KeyEvent {
-                    code: KeyCode::Char('c'),
-                    modifiers: KeyModifiers::CONTROL,
-                }) => Some(InputEvent::Exit),
-                Event::Key(KeyEvent {
-                    code: KeyCode::Char('/'),
-                    modifiers: KeyModifiers::NONE,
-                }) => Some(InputEvent::Search(SearchMode::Unknown)),
-                Event::Key(KeyEvent {
-                    code: KeyCode::Down,
-                    modifiers: KeyModifiers::CONTROL,
-                }) => {
-                    if search_mode == SearchMode::Unknown {
-                        Some(InputEvent::NextMatch)
-                    } else {
-                        None
-                    }
-                }
-                Event::Key(KeyEvent {
-                    code: KeyCode::Up,
-                    modifiers: KeyModifiers::CONTROL,
-                }) => {
-                    if search_mode == SearchMode::Unknown {
-                        Some(InputEvent::PrevMatch)
-                    } else {
-                        None
-                    }
-                }
-                _ => None,
-            }
-        }
-    }
-}
 #[cfg(test)]
 mod tests {
 use super::Command;

View File

@ -76,17 +76,14 @@ impl ConfigExtensions for NuConfig {
 fn header_style(&self) -> TextStyle {
     // FIXME: I agree, this is the long way around, please suggest and alternative.
     let head_color = get_color_from_key_and_subkey(self, "color_config", "header_color");
-    let head_color_style = match head_color {
-        Some(s) => {
-            lookup_ansi_color_style(s.as_string().unwrap_or_else(|_| "green".to_string()))
-        }
-        None => nu_ansi_term::Color::Green.normal(),
-    };
-    let head_bold = get_color_from_key_and_subkey(self, "color_config", "header_bold");
-    let head_bold_bool = match head_bold {
-        Some(b) => header_bold_from_value(Some(&b)),
-        None => true,
-    };
+    let (head_color_style, head_bold_bool) = match head_color {
+        Some(s) => (
+            lookup_ansi_color_style(s.as_string().unwrap_or_else(|_| "green".to_string())),
+            header_bold_from_value(Some(&s)),
+        ),
+        None => (nu_ansi_term::Color::Green.normal(), true),
+    };
 let head_align = get_color_from_key_and_subkey(self, "color_config", "header_align");
 let head_alignment = match head_align {
     Some(a) => header_alignment_from_value(Some(&a)),

View File

@ -77,6 +77,7 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
 // Shells
 whole_stream_command(Next),
 whole_stream_command(Previous),
+whole_stream_command(Goto),
 whole_stream_command(Shells),
 whole_stream_command(Enter),
 whole_stream_command(Exit),
@ -126,6 +127,7 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
 whole_stream_command(AnsiStrip),
 whole_stream_command(AnsiGradient),
 whole_stream_command(Char),
+whole_stream_command(DetectColumns),
 // Column manipulation
 whole_stream_command(DropColumn),
 whole_stream_command(MoveColumn),
@ -133,9 +135,11 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
 whole_stream_command(Select),
 whole_stream_command(Get),
 whole_stream_command(Update),
+whole_stream_command(UpdateCells),
 whole_stream_command(Insert),
 whole_stream_command(Into),
 whole_stream_command(IntoBinary),
+whole_stream_command(IntoColumnPath),
 whole_stream_command(IntoInt),
 whole_stream_command(IntoFilepath),
 whole_stream_command(IntoFilesize),
@ -362,14 +366,6 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
 whole_stream_command(DataFrameCumulative),
 whole_stream_command(DataFrameRename),
 ]);
-#[cfg(feature = "clipboard-cli")]
-{
-    context.add_commands(vec![
-        whole_stream_command(crate::commands::Clip),
-        whole_stream_command(crate::commands::Paste),
-    ]);
-}
 }
 Ok(context)

View File

@ -21,8 +21,8 @@ use crate::commands::{
 };
 use crate::commands::{
-    Append, BuildString, Collect, Each, Echo, First, Get, Keep, Last, Let, Math, MathMode, Nth,
-    Select, StrCollect, Wrap,
+    Append, BuildString, Collect, Each, Echo, First, Get, If, IntoInt, Keep, Last, Let, Math,
+    MathMode, Nth, Select, StrCollect, Wrap,
 };
 use nu_engine::{run_block, whole_stream_command, Command, EvaluationContext, WholeStreamCommand};
 use nu_stream::InputStream;
@ -41,6 +41,8 @@ pub fn test_examples(cmd: Command) -> Result<(), ShellError> {
 whole_stream_command(BuildString {}),
 whole_stream_command(First {}),
 whole_stream_command(Get {}),
+whole_stream_command(If {}),
+whole_stream_command(IntoInt {}),
 whole_stream_command(Keep {}),
 whole_stream_command(Each {}),
 whole_stream_command(Last {}),
@ -253,6 +255,8 @@ pub fn test_anchors(cmd: Command) -> Result<(), ShellError> {
 whole_stream_command(BuildString {}),
 whole_stream_command(First {}),
 whole_stream_command(Get {}),
+whole_stream_command(If {}),
+whole_stream_command(IntoInt {}),
 whole_stream_command(Keep {}),
 whole_stream_command(Each {}),
 whole_stream_command(Last {}),

View File

@ -8,7 +8,7 @@ fn returns_path_joined_with_column_path() {
 cwd: "tests", pipeline(
 r#"
 echo [ [name]; [eggs] ]
-| path join -a spam.txt name
+| path join spam.txt -c [ name ]
 | get name
 "#
 ));
@ -37,7 +37,7 @@ fn appends_slash_when_joined_with_empty_path() {
 cwd: "tests", pipeline(
 r#"
 echo "/some/dir"
-| path join -a ''
+| path join ''
 "#
 ));
@ -51,7 +51,7 @@ fn returns_joined_path_when_joining_empty_path() {
 cwd: "tests", pipeline(
 r#"
 echo ""
-| path join -a foo.txt
+| path join foo.txt
 "#
 ));

View File

@ -105,7 +105,7 @@ fn parses_column_path_extension() {
 cwd: "tests", pipeline(
 r#"
 echo [[home, barn]; ['home/viking/spam.txt', 'barn/cow/moo.png']]
-| path parse home barn
+| path parse -c [ home barn ]
 | get barn
 | get extension
 "#

View File

@ -37,7 +37,7 @@ fn splits_correctly_with_column_path() {
 ['home/viking/spam.txt', 'barn/cow/moo.png']
 ['home/viking/eggs.txt', 'barn/goat/cheese.png']
 ]
-| path split home barn
+| path split -c [ home barn ]
 | get barn
 | length
 "#

View File

@ -306,3 +306,21 @@ fn rm_wildcard_leading_dot_deletes_dotfiles() {
 assert!(!files_exist_at(vec![".bar"], dirs.test()));
 })
 }
+#[test]
+fn removes_files_with_case_sensitive_glob_matches_by_default() {
+    Playground::setup("glob_test", |dirs, sandbox| {
+        sandbox.with_files(vec![EmptyFile("A0"), EmptyFile("a1")]);
+        nu!(
+            cwd: dirs.root(),
+            "rm glob_test/A*"
+        );
+        let deleted_path = dirs.test().join("A0");
+        let skipped_path = dirs.test().join("a1");
+        assert!(!deleted_path.exists());
+        assert!(skipped_path.exists());
+    })
+}
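The test above relies on glob matching being case-sensitive by default, so `rm glob_test/A*` deletes `A0` but leaves `a1`. A hypothetical single-`*` matcher makes the byte-level comparison explicit (the real command uses a glob library, not this helper):

```rust
// Hypothetical helper: does `name` match a pattern containing at most one `*`?
// Comparison is byte-for-byte, hence case-sensitive: "a1" does not match "A*".
fn matches_star_pattern(pattern: &str, name: &str) -> bool {
    match pattern.split_once('*') {
        Some((prefix, suffix)) => {
            name.len() >= prefix.len() + suffix.len()
                && name.starts_with(prefix)
                && name.ends_with(suffix)
        }
        // No wildcard: require an exact match.
        None => pattern == name,
    }
}

fn main() {
    assert!(matches_star_pattern("A*", "A0"));
    assert!(!matches_star_pattern("A*", "a1")); // case-sensitive, so skipped
    assert!(matches_star_pattern("*.txt", "notes.txt"));
    println!("ok");
}
```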

View File

@ -47,3 +47,21 @@ fn writes_out_csv() {
 assert!(actual.contains("nu,0.14,A new type of shell,MIT,2018"));
 })
 }
+#[test]
+fn save_append_will_create_file_if_not_exists() {
+    Playground::setup("save_test_3", |dirs, sandbox| {
+        sandbox.with_files(vec![]);
+        let expected_file = dirs.test().join("new-file.txt");
+        nu!(
+            cwd: dirs.root(),
+            r#"echo hello | save --raw --append save_test_3/new-file.txt"#,
+        );
+        let actual = file_contents(expected_file);
+        println!("{}", actual);
+        assert!(actual == "hello");
+    })
+}
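The behavior being tested — `save --append` creating a missing file — maps onto opening the target with both append and create flags. A standalone sketch using only `std` (the file name is arbitrary):

```rust
use std::fs::OpenOptions;
use std::io::Write;

// Open `path` for appending, creating it first if it does not exist.
// `create(true)` is what lets the open succeed on a missing file,
// which is the gap the `save --append` fix closed.
fn append_or_create(path: &str, data: &[u8]) -> std::io::Result<()> {
    let mut file = OpenOptions::new().append(true).create(true).open(path)?;
    file.write_all(data)
}

fn main() -> std::io::Result<()> {
    let path = std::env::temp_dir().join("save_append_sketch.txt");
    let path = path.to_str().unwrap().to_string();
    let _ = std::fs::remove_file(&path);

    append_or_create(&path, b"hello")?; // file created here
    assert_eq!(std::fs::read_to_string(&path)?, "hello");

    append_or_create(&path, b" world")?; // second call appends
    assert_eq!(std::fs::read_to_string(&path)?, "hello world");

    std::fs::remove_file(&path)?;
    println!("ok");
    Ok(())
}
```

Without `create(true)`, the first call would fail with `NotFound`, which matches the pre-fix behavior described in the commit message.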

View File

@ -4,22 +4,19 @@ description = "Completions for nushell"
 edition = "2018"
 license = "MIT"
 name = "nu-completion"
-version = "0.37.0"
+version = "0.41.0"
 [lib]
 doctest = false
 [dependencies]
-nu-engine = { version = "0.37.0", path="../nu-engine" }
-nu-data = { version = "0.37.0", path="../nu-data" }
-nu-errors = { version = "0.37.0", path="../nu-errors" }
-nu-parser = { version = "0.37.0", path="../nu-parser" }
-nu-path = { version = "0.37.0", path="../nu-path" }
-nu-protocol = { version = "0.37.0", path="../nu-protocol" }
-nu-source = { version = "0.37.0", path="../nu-source" }
-nu-test-support = { version = "0.37.0", path="../nu-test-support" }
-dirs-next = "2.0.0"
+nu-engine = { version = "0.41.0", path="../nu-engine" }
+nu-data = { version = "0.41.0", path="../nu-data" }
+nu-parser = { version = "0.41.0", path="../nu-parser" }
+nu-path = { version = "0.41.0", path="../nu-path" }
+nu-protocol = { version = "0.41.0", path="../nu-protocol" }
+nu-source = { version = "0.41.0", path="../nu-source" }
+nu-test-support = { version = "0.41.0", path="../nu-test-support" }
 indexmap = { version="1.6.1", features=["serde-1"] }
 [target.'cfg(not(target_arch = "wasm32"))'.dependencies]

View File

@ -238,11 +238,19 @@ pub fn completion_location(line: &str, block: &Block, pos: usize) -> Vec<Complet
 }
 }
-            output.push(loc.clone());
+            output.push({
+                let mut partial_loc = loc.clone();
+                partial_loc.span = Span::new(loc.span.start(), pos);
+                partial_loc
+            });
 output
 }
 }
-        _ => vec![loc.clone()],
+        _ => vec![{
+            let mut partial_loc = loc.clone();
+            partial_loc.span = Span::new(loc.span.start(), pos);
+            partial_loc
+        }],
 };
 } else if pos < loc.span.start() {
 break;
@ -339,7 +347,7 @@ mod tests {
 line: &str,
 scope: &dyn ParserScope,
 pos: usize,
-) -> Vec<LocationType> {
+) -> Vec<CompletionLocation> {
 let (tokens, _) = lex(line, 0, nu_parser::NewlineMode::Normal);
 let (lite_block, _) = parse_block(tokens);
@ -348,9 +356,6 @@ mod tests {
 scope.exit_scope();
 super::completion_location(line, &block, pos)
-    .into_iter()
-    .map(|v| v.item)
-    .collect()
 }
 #[test]
@ -362,7 +367,7 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 10),
-    vec![LocationType::Command],
+    vec![LocationType::Command.spanned(Span::new(9, 10)),],
 );
 }
@ -373,7 +378,7 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 10),
-    vec![LocationType::Command],
+    vec![LocationType::Command.spanned(Span::new(9, 10)),],
 );
 }
@ -384,7 +389,7 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 4),
-    vec![LocationType::Command],
+    vec![LocationType::Command.spanned(Span::new(0, 4)),],
 );
 }
@ -395,7 +400,7 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 13),
-    vec![LocationType::Variable],
+    vec![LocationType::Variable.spanned(Span::new(5, 13)),],
 );
 }
@ -410,7 +415,7 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 7),
-    vec![LocationType::Flag("du".to_string())],
+    vec![LocationType::Flag("du".to_string()).spanned(Span::new(3, 7)),],
 );
 }
@ -421,7 +426,7 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 8),
-    vec![LocationType::Command],
+    vec![LocationType::Command.spanned(Span::new(6, 8)),],
 );
 }
@ -433,8 +438,8 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 3),
 vec![
-    LocationType::Command,
-    LocationType::Argument(Some("cd".to_string()), None)
+    LocationType::Command.spanned(Span::new(0, 3)),
+    LocationType::Argument(Some("cd".to_string()), None).spanned(Span::new(3, 3)),
 ],
 );
 }
@ -451,8 +456,8 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 3),
 vec![
-    LocationType::Argument(Some("du".to_string()), None),
-    LocationType::Flag("du".to_string()),
+    LocationType::Argument(Some("du".to_string()), None).spanned(Span::new(3, 4)),
+    LocationType::Flag("du".to_string()).spanned(Span::new(3, 4)),
 ],
 );
 }
@ -467,8 +472,24 @@ mod tests {
 assert_eq!(
 completion_location(line, &registry, 6),
 vec![
-    LocationType::Command,
-    LocationType::Argument(Some("echo".to_string()), None)
+    LocationType::Command.spanned(Span::new(0, 6)),
+    LocationType::Argument(Some("echo".to_string()), None).spanned(Span::new(5, 6)),
+],
+);
+}
+#[test]
+fn completes_argument_when_cursor_inside_argument() {
+    let registry: VecRegistry =
+        vec![Signature::build("echo").rest("rest", SyntaxShape::Any, "the values to echo")]
+            .into();
+    let line = "echo 123";
+    assert_eq!(
+        completion_location(line, &registry, 6),
+        vec![
+            LocationType::Command.spanned(Span::new(0, 6)),
+            LocationType::Argument(Some("echo".to_string()), None).spanned(Span::new(5, 6)),
 ],
 );
 }
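The repeated `.spanned(Span::new(start, pos))` expectations in the tests above reflect the change in `completion_location`: each location's span is trimmed to end at the cursor. The pattern can be sketched with a hypothetical minimal `Span` (the clamp to `span.end` is an extra safety measure, not part of the diff):

```rust
// Hypothetical minimal Span: a half-open byte range, standing in for
// nu-source's Span type.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Span {
    start: usize,
    end: usize,
}

impl Span {
    fn new(start: usize, end: usize) -> Self {
        Span { start, end }
    }
}

// Clip a completion location's span so it ends at the cursor position,
// mirroring `partial_loc.span = Span::new(loc.span.start(), pos)`.
// The `min` clamp keeps the result inside the original span.
fn clip_to_cursor(span: Span, pos: usize) -> Span {
    Span::new(span.start, pos.min(span.end))
}

fn main() {
    let loc = Span::new(5, 13);
    assert_eq!(clip_to_cursor(loc, 9), Span::new(5, 9));
    assert_eq!(clip_to_cursor(loc, 20), Span::new(5, 13));
    println!("ok");
}
```

Trimming to the cursor is what lets a completer know exactly which slice of the line the user has typed so far.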

View File

@ -4,42 +4,37 @@ description = "CLI for nushell"
 edition = "2018"
 license = "MIT"
 name = "nu-data"
-version = "0.37.0"
+version = "0.41.0"
 [lib]
 doctest = false
 [dependencies]
-bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
+bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
 byte-unit = "4.0.9"
 chrono = "0.4.19"
 common-path = "1.0.0"
 derive-new = "0.5.8"
 directories-next = "2.0.0"
-dirs-next = "2.0.0"
 getset = "0.1.1"
 indexmap = { version="1.6.1", features=["serde-1"] }
 log = "0.4.14"
-num-bigint = { version="0.3.1", features=["serde"] }
+num-bigint = { version="0.4.3", features=["serde"] }
 num-format = "0.4.0"
 num-traits = "0.2.14"
-query_interface = "0.3.5"
 serde = { version="1.0.123", features=["derive"] }
 sha2 = "0.9.3"
 sys-locale = "0.1.0"
 toml = "0.5.8"
-nu-errors = { version = "0.37.0", path="../nu-errors" }
-nu-path = { version = "0.37.0", path="../nu-path" }
-nu-protocol = { version = "0.37.0", path="../nu-protocol" }
-nu-source = { version = "0.37.0", path="../nu-source" }
-nu-table = { version = "0.37.0", path="../nu-table" }
-nu-test-support = { version = "0.37.0", path="../nu-test-support" }
-nu-value-ext = { version = "0.37.0", path="../nu-value-ext" }
-nu-ansi-term = { version = "0.37.0", path="../nu-ansi-term" }
-[target.'cfg(unix)'.dependencies]
-users = "0.11.0"
+nu-errors = { version = "0.41.0", path="../nu-errors" }
+nu-path = { version = "0.41.0", path="../nu-path" }
+nu-protocol = { version = "0.41.0", path="../nu-protocol" }
+nu-source = { version = "0.41.0", path="../nu-source" }
+nu-table = { version = "0.41.0", path="../nu-table" }
+nu-test-support = { version = "0.41.0", path="../nu-test-support" }
+nu-value-ext = { version = "0.41.0", path="../nu-value-ext" }
+nu-ansi-term = { version = "0.41.0", path="../nu-ansi-term" }
 [features]
 dataframe = ["nu-protocol/dataframe"]

View File

@ -102,7 +102,6 @@ pub fn string_to_lookup_value(str_prim: &str) -> String {
 "separator_color" => "separator_color".to_string(),
 "header_align" => "header_align".to_string(),
 "header_color" => "header_color".to_string(),
-"header_bold" => "header_bold".to_string(),
 "header_style" => "header_style".to_string(),
 "index_color" => "index_color".to_string(),
 "leading_trailing_space_bg" => "leading_trailing_space_bg".to_string(),
@ -144,7 +143,6 @@ pub fn get_color_config(config: &NuConfig) -> HashMap<String, Style> {
 hm.insert("separator_color".to_string(), Color::White.normal());
 hm.insert("header_align".to_string(), Color::Green.bold());
 hm.insert("header_color".to_string(), Color::Green.bold());
-hm.insert("header_bold".to_string(), Color::Green.bold());
 hm.insert("header_style".to_string(), Style::default());
 hm.insert("index_color".to_string(), Color::Green.bold());
 hm.insert(
@ -204,9 +202,6 @@ pub fn get_color_config(config: &NuConfig) -> HashMap<String, Style> {
 "header_color" => {
     update_hashmap(key, value, &mut hm);
 }
-"header_bold" => {
-    update_hashmap(key, value, &mut hm);
-}
 "header_style" => {
     update_hashmap(key, value, &mut hm);
 }
@ -358,14 +353,7 @@ pub fn style_primitive(primitive: &str, color_hm: &HashMap<String, Style>) -> Te
 let style = color_hm.get("header_color");
 match style {
     Some(s) => TextStyle::with_style(Alignment::Center, *s),
-    None => TextStyle::default_header(),
-    }
-}
-"header_bold" => {
-    let style = color_hm.get("header_bold");
-    match style {
-        Some(s) => TextStyle::with_style(Alignment::Center, *s),
-        None => TextStyle::default_header(),
+    None => TextStyle::default_header().bold(Some(true)),
 }
 }
 "header_style" => {

View File

@@ -4,48 +4,39 @@ description = "Core commands for nushell"
 edition = "2018"
 license = "MIT"
 name = "nu-engine"
-version = "0.37.0"
+version = "0.41.0"
 [dependencies]
-nu-data = { version = "0.37.0", path="../nu-data" }
-nu-errors = { version = "0.37.0", path="../nu-errors" }
-nu-parser = { version = "0.37.0", path="../nu-parser" }
-nu-plugin = { version = "0.37.0", path="../nu-plugin" }
-nu-protocol = { version = "0.37.0", path="../nu-protocol" }
-nu-source = { version = "0.37.0", path="../nu-source" }
-nu-stream = { version = "0.37.0", path="../nu-stream" }
-nu-value-ext = { version = "0.37.0", path="../nu-value-ext" }
-nu-ansi-term = { version = "0.37.0", path="../nu-ansi-term" }
-nu-test-support = { version = "0.37.0", path="../nu-test-support" }
-nu-path = { version = "0.37.0", path="../nu-path" }
-trash = { version="1.3.0", optional=true }
+nu-data = { version = "0.41.0", path="../nu-data" }
+nu-errors = { version = "0.41.0", path="../nu-errors" }
+nu-parser = { version = "0.41.0", path="../nu-parser" }
+nu-plugin = { version = "0.41.0", path="../nu-plugin" }
+nu-protocol = { version = "0.41.0", path="../nu-protocol" }
+nu-source = { version = "0.41.0", path="../nu-source" }
+nu-stream = { version = "0.41.0", path="../nu-stream" }
+nu-value-ext = { version = "0.41.0", path="../nu-value-ext" }
+nu-ansi-term = { version = "0.41.0", path="../nu-ansi-term" }
+nu-test-support = { version = "0.41.0", path="../nu-test-support" }
+nu-path = { version = "0.41.0", path="../nu-path" }
+trash = { version = "2.0.2", optional = true }
 which = { version="4.0.2", optional=true }
 codespan-reporting = "0.11.0"
-dyn-clone = "1.0.4"
-ansi_term = "0.12.1"
-async-recursion = "0.3.2"
-async-trait = "0.1.42"
-bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
-bytes = "0.5.6"
+bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
+bytes = "1.1.0"
 chrono = { version="0.4.19", features=["serde"] }
 derive-new = "0.5.8"
 dirs-next = "2.0.0"
 encoding_rs = "0.8.28"
 filesize = "0.2.0"
 fs_extra = "1.2.0"
-futures = { version="0.3.12", features=["compat", "io-compat"] }
-futures-util = "0.3.12"
-futures_codec = "0.4.1"
 getset = "0.1.1"
 glob = "0.3.0"
 indexmap = { version="1.6.1", features=["serde-1"] }
 itertools = "0.10.0"
 lazy_static = "1.*"
 log = "0.4.14"
-num-bigint = { version="0.3.1", features=["serde"] }
-num-format = "0.4.0"
-num-traits = "0.2.14"
+num-bigint = { version="0.4.3", features=["serde"] }
 parking_lot = "0.11.1"
 rayon = "1.5.0"
 serde = { version="1.0.123", features=["derive"] }
@@ -59,7 +50,7 @@ umask = "1.0.0"
 users = "0.11.0"
 [dev-dependencies]
-nu-test-support = { version = "0.37.0", path="../nu-test-support" }
+nu-test-support = { version = "0.41.0", path="../nu-test-support" }
 hamcrest2 = "0.3.0"
 [features]

View File

@@ -1,8 +1,10 @@
 # Nu-Engine
 Nu-engine handles most of the core logic of nushell. For example, engine handles:
+
 - Passing of data between commands
 - Evaluating a commands return values
 - Loading of user configurations
+
 ## Top level introduction
 The following topics shall give the reader a top level understanding how various topics are handled in nushell.

View File

@@ -8,22 +8,13 @@ use std::collections::HashMap;
 const COMMANDS_DOCS_DIR: &str = "docs/commands";
+#[derive(Default)]
 pub struct DocumentationConfig {
     no_subcommands: bool,
     no_color: bool,
     brief: bool,
 }
-impl Default for DocumentationConfig {
-    fn default() -> Self {
-        DocumentationConfig {
-            no_subcommands: false,
-            no_color: false,
-            brief: false,
-        }
-    }
-}
 fn generate_doc(name: &str, scope: &Scope) -> IndexMap<String, Value> {
     let mut row_entries = IndexMap::new();
     let command = scope
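The hunk above is safe because deriving `Default` gives every field its type's default value, and `bool::default()` is `false` — exactly what the removed hand-written `impl` produced. A minimal sketch mirroring the struct from the diff:

```rust
// Mirrors DocumentationConfig from the diff; #[derive(Default)]
// yields all-false fields, matching the deleted manual impl.
#[derive(Default, Debug, PartialEq)]
struct DocumentationConfig {
    no_subcommands: bool,
    no_color: bool,
    brief: bool,
}

fn main() {
    let cfg = DocumentationConfig::default();
    assert_eq!(
        cfg,
        DocumentationConfig { no_subcommands: false, no_color: false, brief: false }
    );
    println!("{:?}", cfg);
}
```

The derive only works when every field type implements `Default`, which holds here since all three fields are `bool`.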

View File

@@ -177,6 +177,9 @@ impl Iterator for InternalIterator {
 CommandAction::NextShell => {
     self.context.shell_manager().next();
 }
+CommandAction::GotoShell(i) => {
+    self.context.shell_manager().goto(i);
+}
 CommandAction::LeaveShell(code) => {
     self.context.shell_manager().remove_at_current();
     if self.context.shell_manager().is_empty() {

View File

@@ -60,7 +60,7 @@ pub fn nu(scope: &Scope, ctx: &EvaluationContext) -> Result<Value, ShellError> {
 // std::env::vars(), rather than the case-sensitive Environment.GetEnvironmentVariables() of .NET that PowerShell
 // uses.
 //
-// For now, we work around the discrepency as best we can by merging the two into what is shown to the user as the
+// For now, we work around the discrepancy as best we can by merging the two into what is shown to the user as the
 // 'path' column of `$nu`
 let mut table = vec![];
 for v in env {

View File

@@ -28,6 +28,12 @@ use nu_errors::ShellError;
 use nu_protocol::{Primitive, ReturnSuccess, UntaggedValue};
 use nu_source::Tagged;
+const GLOB_PARAMS: glob::MatchOptions = glob::MatchOptions {
+    case_sensitive: true,
+    require_literal_separator: false,
+    require_literal_leading_dot: false,
+};
 #[derive(Eq, PartialEq, Clone, Copy)]
 pub enum FilesystemShellMode {
     Cli,
@@ -159,7 +165,7 @@ impl Shell for FilesystemShell {
 let hidden_dir_specified = is_hidden_dir(&path);
-let mut paths = glob::glob(&path.to_string_lossy())
+let mut paths = glob::glob_with(&path.to_string_lossy(), GLOB_PARAMS)
     .map_err(|e| ShellError::labeled_error(e.to_string(), "invalid pattern", &p_tag))?
     .peekable();
@@ -352,7 +358,7 @@ impl Shell for FilesystemShell {
 let source = path.join(&src.item);
 let destination = path.join(&dst.item);
-let sources: Vec<_> = match glob::glob(&source.to_string_lossy()) {
+let sources: Vec<_> = match glob::glob_with(&source.to_string_lossy(), GLOB_PARAMS) {
 Ok(files) => files.collect(),
 Err(e) => {
     return Err(ShellError::labeled_error(
@@ -521,8 +527,8 @@ impl Shell for FilesystemShell {
 let source = path.join(&src.item);
 let destination = path.join(&dst.item);
-let mut sources =
-    glob::glob(&source.to_string_lossy()).map_or_else(|_| Vec::new(), Iterator::collect);
+let mut sources = glob::glob_with(&source.to_string_lossy(), GLOB_PARAMS)
+    .map_or_else(|_| Vec::new(), Iterator::collect);
 if sources.is_empty() {
     return Err(ShellError::labeled_error(
@@ -650,7 +656,7 @@ impl Shell for FilesystemShell {
 &path.to_string_lossy(),
 glob::MatchOptions {
     require_literal_leading_dot: true,
-    ..Default::default()
+    ..GLOB_PARAMS
 },
 ) {
 Ok(files) => {
@@ -878,7 +884,7 @@ impl Shell for FilesystemShell {
 ) -> Result<OutputStream, ShellError> {
 let mut options = OpenOptions::new();
 if append {
-    options.append(true)
+    options.append(true).create(true)
 } else {
     options.write(true).create(true).truncate(true)
 };
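The last hunk above is the `save --append` fix: `OpenOptions::new().append(true)` alone fails with `NotFound` when the target file is missing, while chaining `.create(true)` opens the file or creates it first. A std-only sketch of that behavior (the temp-file name is illustrative, not from the diff):

```rust
use std::fs::{self, OpenOptions};
use std::io::Write;

fn main() -> std::io::Result<()> {
    let path = std::env::temp_dir().join("save_append_demo.txt");
    let _ = fs::remove_file(&path); // ensure the file starts out missing

    // With .create(true), an append-open succeeds even for a new file;
    // without it, the first open would fail with ErrorKind::NotFound.
    for line in ["a", "b"] {
        let mut f = OpenOptions::new().append(true).create(true).open(&path)?;
        writeln!(f, "{}", line)?;
    }

    // Both appends landed in the same file, in order.
    assert_eq!(fs::read_to_string(&path)?, "a\nb\n");
    fs::remove_file(&path)?;
    Ok(())
}
```

This matches the test added in the PR ("`save --append` will create new file"): the first append must not require the file to pre-exist.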

View File

@@ -34,7 +34,7 @@ impl FileStructure {
 self.resources
     .iter()
     .map(|f| (PathBuf::from(&f.loc), f.at))
-    .map(|f| to(f))
+    .map(to)
     .collect()
 }
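The `.map(|f| to(f))` → `.map(to)` change above is clippy's `redundant_closure` fix: a closure that only forwards its argument can be replaced by the function name itself. A tiny sketch of the same pattern (`to` here is a stand-in for the diff's conversion function):

```rust
// Stand-in for the diff's `to` conversion function.
fn to(pair: (i32, i32)) -> i32 {
    pair.0 + pair.1
}

fn main() {
    let pairs = vec![(1, 2), (3, 4)];
    // .map(|p| to(p)) and .map(to) behave identically;
    // passing the function name is the idiomatic form.
    let sums: Vec<i32> = pairs.into_iter().map(to).collect();
    assert_eq!(sums, vec![3, 7]);
    println!("{:?}", sums);
}
```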

View File

@@ -101,6 +101,7 @@ impl FromValue for i64 {
     v.as_i64()
 }
 }
+
 impl FromValue for Tagged<i64> {
 fn from_value(v: &Value) -> Result<Self, ShellError> {
     let tag = v.tag.clone();

View File

@@ -238,7 +238,7 @@ impl Default for Theme {
 variable: ThemeColor(Color::Purple),
 whitespace: ThemeColor(Color::White),
 word: ThemeColor(Color::Green),
-// These should really be Syles and not colors
+// These should really be Styles and not colors
 // leave this here for the next change to make
 // ThemeColor, ThemeStyle.
 // open_delimiter: ThemeColor(Color::White.normal()),

View File

@@ -136,6 +136,16 @@ impl ShellManager {
     self.set_path(self.path())
 }
+pub fn goto(&self, i: usize) {
+    {
+        let shell_len = self.shells.lock().len();
+        if i < shell_len {
+            self.current_shell.store(i, Ordering::SeqCst);
+        }
+    }
+    self.set_path(self.path())
+}
 pub fn homedir(&self) -> Option<PathBuf> {
     let env = self.shells.lock();
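In `goto` above, the current-shell index is stored atomically (`Ordering::SeqCst`) only after a bounds check, so an out-of-range index leaves the selection unchanged. A self-contained sketch of that pattern (`Shells` is a simplified stand-in for `ShellManager`, not the real type):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

struct Shells {
    paths: Vec<String>,
    current: AtomicUsize,
}

impl Shells {
    // Mirrors the diff: only store the new index when it is in range,
    // so an out-of-bounds goto is a no-op.
    fn goto(&self, i: usize) {
        if i < self.paths.len() {
            self.current.store(i, Ordering::SeqCst);
        }
    }

    fn current_path(&self) -> &str {
        &self.paths[self.current.load(Ordering::SeqCst)]
    }
}

fn main() {
    let shells = Shells {
        paths: vec!["/home".into(), "/tmp".into()],
        current: AtomicUsize::new(0),
    };
    shells.goto(1);
    assert_eq!(shells.current_path(), "/tmp");
    shells.goto(5); // ignored: index out of range
    assert_eq!(shells.current_path(), "/tmp");
    println!("current: {}", shells.current_path());
}
```

The guard matters because a stored out-of-range index would later panic on lookup; ignoring the request keeps the manager in a valid state.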

View File

@@ -597,7 +597,7 @@ impl VarSyntaxShapeDeductor {
 }
 Expression::Table(header, _rows) => {
     self.infer_shapes_in_table_header(header)?;
-    // Shapes within columns can be heterogenous as long as
+    // Shapes within columns can be heterogeneous as long as
     // https://github.com/nushell/rfcs/pull/3
     // didn't land
     // self.infer_shapes_in_rows(_rows)?;

View File

@@ -4,20 +4,20 @@ description = "Core error subsystem for Nushell"
 edition = "2018"
 license = "MIT"
 name = "nu-errors"
-version = "0.37.0"
+version = "0.41.0"
 [lib]
 doctest = false
 [dependencies]
-nu-source = { path="../nu-source", version = "0.37.0" }
-nu-ansi-term = { version = "0.37.0", path="../nu-ansi-term" }
-bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
+nu-source = { path="../nu-source", version = "0.41.0" }
+nu-ansi-term = { version = "0.41.0", path="../nu-ansi-term" }
+bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
 codespan-reporting = { version="0.11.0", features=["serialization"] }
 derive-new = "0.5.8"
 getset = "0.1.1"
-num-bigint = { version="0.3.1", features=["serde"] }
+num-bigint = { version="0.4.3", features=["serde"] }
 num-traits = "0.2.14"
 serde = { version="1.0.123", features=["derive"] }

View File

@@ -4,7 +4,7 @@ description = "Fork of serde-hjson"
 edition = "2018"
 license = "MIT"
 name = "nu-json"
-version = "0.37.0"
+version = "0.41.0"
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@@ -20,6 +20,6 @@ lazy_static = "1"
 linked-hash-map = { version="0.5", optional=true }
 [dev-dependencies]
-nu-path = { version = "0.37.0", path="../nu-path" }
-nu-test-support = { version = "0.37.0", path="../nu-test-support" }
+nu-path = { version = "0.41.0", path="../nu-path" }
+nu-test-support = { version = "0.41.0", path="../nu-test-support" }
 serde_json = "1.0.39"

Some files were not shown because too many files have changed in this diff.