Compare commits

...

53 Commits

Author SHA1 Message Date
JT
4e8e03867c Update release.yml 2022-01-19 04:51:21 +11:00
JT
49e8af8ea5 Bump to 0.43 (#4264) 2022-01-18 12:06:12 -05:00
JT
d5d61d14b3 Tutor eq (#4263)
* Fix clippy lints

* Fix clippy lints

* Fix clippy lints

* Add e-q tutor page
2022-01-19 03:22:23 +11:00
JT
f562a4526c Fix clippy lints (#4262)
* Fix clippy lints

* Fix clippy lints

* Fix clippy lints
2022-01-18 23:33:28 +11:00
e6c09f2dfc Update sysinfo version (#4261) 2022-01-18 22:37:52 +11:00
73a68954c4 Bump follow-redirects from 1.14.4 to 1.14.7 in /samples/wasm (#4258)
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.14.4 to 1.14.7.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.14.4...v1.14.7)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-01-18 22:37:06 +11:00
476d543dee Update descriptions for crates split out from nu-cli (#4247)
`nu-command` and `nu-data` were split out, but the descriptions still
say 'CLI'.

Signed-off-by: Michel Alexandre Salim <salimma@fedoraproject.org>
2022-01-09 06:05:50 -06:00
398502b0d6 fix docs/sample_config/config.toml: use env.PROMPT_COMMAND (#4241) 2022-01-02 17:35:07 -06:00
JT
62011b6bcc Bump to 0.42 (#4234) 2021-12-28 20:56:59 +11:00
1214cd57e8 bat: use regex-onig instead of regex-fancy (#4226)
Fixes #4224

Signed-off-by: nibon7 <nibon7@163.com>
2021-12-24 08:34:59 -06:00
6cd124ddb2 allow insecure server connections when using SSL (#4219)
Fixes #4211

Signed-off-by: nibon7 <nibon7@163.com>
2021-12-23 06:48:43 +11:00
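
A minimal sketch of what the new `--insecure`/`-k` flag does, based on the `http_client` change visible in the fetch diff further down in this compare view: the flag is threaded into reqwest's client builder via `danger_accept_invalid_certs`.

```rust
// Sketch mirroring the http_client change in the fetch diff below: when the
// user passes --insecure (-k), certificate validation is disabled on the
// reqwest client that performs the request.
fn http_client(allow_insecure: bool) -> reqwest::Client {
    reqwest::Client::builder()
        .user_agent("nushell")
        .danger_accept_invalid_certs(allow_insecure)
        .build()
        .expect("Failed to build reqwest client")
}
```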
d32aec5906 Don't panic if the other end of std{out,err} is closed (#4179)
* fix #4161

println! and friends will panic on BrokenPipe. The solution is to use
writeln! instead, and ignore the error (or do we want to do something else?)

* test that nu doesn't panic in case of BrokenPipe error

* fixup! test that nu doesn't panic in case of BrokenPipe error

* make do_not_panic_if_broken_pipe only run on UNIX systems
2021-12-21 10:08:41 +11:00
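
A minimal sketch of the approach described in the commit above (not the exact nushell code): replace `println!`, which panics when the pipe is closed, with `writeln!` to a stdout handle and ignore `BrokenPipe` errors.

```rust
use std::io::{self, ErrorKind, Write};

// Write a line to stdout without panicking when the reader has gone away
// (e.g. `nu -c '...' | head -1` closing the pipe early).
fn print_line(msg: &str) {
    let stdout = io::stdout();
    let mut handle = stdout.lock();
    if let Err(err) = writeln!(handle, "{}", msg) {
        // println! would panic here; instead, silently drop BrokenPipe and
        // only surface other I/O errors.
        if err.kind() != ErrorKind::BrokenPipe {
            eprintln!("failed to write to stdout: {}", err);
        }
    }
}
```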
e919f9a73b use heck for string casing (#4081)
I removed the Inflector dependency in favor of heck for two reasons:
- to close #3674.
- heck seems simpler and is actively maintained.

We could probably alter the structure of the `str_` module to expose the
individual casing behaviors better, but I did not feel confident enough to
change those signatures.

So I took the lazier approach of a macro in `mod.rs` that generates public
shim functions over heck's traits.
2021-12-14 09:43:48 -06:00
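
A hedged sketch of the macro-shim idea mentioned above (function names are illustrative, not the actual `str_` module code): one macro generates a thin public function per casing style that forwards to the corresponding heck trait method.

```rust
use heck::{ToKebabCase, ToSnakeCase};

// Illustrative shim: expose heck's trait methods as plain public functions.
macro_rules! casing_shim {
    ($fn_name:ident, $method:ident) => {
        pub fn $fn_name(input: &str) -> String {
            input.$method()
        }
    };
}

casing_shim!(str_kebab_case, to_kebab_case);
casing_shim!(str_snake_case, to_snake_case);

fn main() {
    assert_eq!(str_kebab_case("Nushell Is Fun"), "nushell-is-fun");
    assert_eq!(str_snake_case("Nushell Is Fun"), "nushell_is_fun");
}
```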
a3c349746f ci: update macOS agent (#4207)
10.14 has been deprecated: https://github.com/Azure/azure-sdk-for-cpp/issues/3168

This hopefully fixes recent CI failures!
2021-12-14 08:55:51 -06:00
b5f8f64d79 ci: fix macOS agent (#4203)
I noticed the agent documentation uses uppercase: https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops&tabs=yaml
2021-12-13 13:08:03 -06:00
1576b959f9 update feat template (#4201) 2021-12-12 16:15:31 -06:00
4096f52003 update templates2 (#4200) 2021-12-12 16:11:27 -06:00
7ceb668419 Revert "try out title change (#4198)" (#4199)
This reverts commit 420aee18ca.
2021-12-12 16:06:07 -06:00
420aee18ca try out title change (#4198) 2021-12-12 16:05:24 -06:00
pin
15e9c11849 Fix build on NetBSD (#4192) 2021-12-09 14:23:40 +02:00
9fd680ae2b fix: Implicit coercion of boolean false and empty value #4094 (#4120)
Signed-off-by: closetool <c299999999@qq.com>
2021-12-09 14:19:51 +02:00
ad94ed5e13 Fix Configuration section in bug report template (#4181)
* Fix Configuration section in bug report template

Change the placeholder content to actually match the `to md` output, and add `--pretty`

* Don't omit data in placeholder configuration table

* Remove blank line in bug_report.yml
2021-12-08 13:32:28 -06:00
1bdcdcca70 fix: change into column_path to into column-path (breaking change) (#4185) (#4189) 2021-12-08 11:04:55 +02:00
JT
610e3911f6 Bump to 0.41 (#4187) 2021-12-08 06:21:00 +13:00
ee9eddd851 avoid unnecessary allocation (#4178) 2021-12-06 07:38:58 +13:00
JT
c08e145501 Fix clippy warnings (#4176) 2021-12-03 07:05:38 +13:00
c00853a473 Seems like accessing $it outside each is not possible now (#4000) 2021-12-03 06:49:24 +13:00
79c7b20cfd add login shell flag (#4175) 2021-12-02 20:05:04 +13:00
JT
89cbfd758d Remove 'arboard' (#4174) 2021-12-02 08:48:03 +13:00
e6e6b730f3 Bye bye upx sorry (#4173)
* bye bye upx, let's try stripping alone

* remove all stripping - not sure it's even working
2021-11-30 13:34:16 -06:00
0fe6a7c1b5 bye bye upx, let's try stripping alone (#4172) 2021-11-30 12:11:01 -06:00
1794ad51bd Sanitize arguments to external commands a bit better (#4157)
* fix #4140

We are passing commands into a shell underneath but we were not
escaping arguments correctly. This new version of the code also takes
into consideration the ";" and "&" characters, which have special
meaning in shells.

We would probably benefit from a more robust way to join arguments to
shell programs. Python's stdlib has shlex.join, and perhaps we can
take that implementation as a reference.

* clean up escaping of posix shell args

I believe the right place to do escaping of arguments was in the
spawn_sh_command function. Note that this change prevents things like:

^echo "$(ls)"

from executing the ls command. Instead, this will just print

$(ls)

The regex has been taken from the Python stdlib implementation of shlex.quote.

* fix non-literal parameters and single quotes

* address clippy's comments

* fixup! address clippy's comments

* test that subshell commands are sanitized properly
2021-11-29 09:46:42 -06:00
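
The resulting quoting logic appears in the external-command diff further down in this compare view; here is a self-contained sketch of the shlex.quote-style approach (test values are illustrative).

```rust
use itertools::Itertools;
use regex::Regex;

// shlex.quote-style escaping: anything containing characters outside a
// conservative safe set is wrapped in single quotes, and embedded single
// quotes are escaped as '"'"'. The real code caches the regex via lazy_static.
fn shell_arg_escape(arg: &str) -> String {
    let unsafe_chars = Regex::new(r"[^\w@%+=:,./-]").expect("regex to be valid");
    match arg {
        "" => String::from("''"),
        s if !unsafe_chars.is_match(s) => String::from(s),
        _ => format!("'{}'", arg.split('\'').join("'\"'\"'")),
    }
}

fn main() {
    assert_eq!(shell_arg_escape("Cargo.toml"), "Cargo.toml");
    assert_eq!(shell_arg_escape("a; rm -rf ~"), "'a; rm -rf ~'");
    assert_eq!(shell_arg_escape("it's"), r#"'it'"'"'s'"#);
}
```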
fb197f562a save --append: create file if it doesn't exist (#4156)
* have save --append create file if not exists

Currently, doing:

echo a | save --raw --append file.txt

will fail if file.txt does not exist. This PR changes that.

* test that `save --append` will create new file
2021-11-26 12:27:41 -06:00
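
A minimal sketch of the new behaviour, assuming plain std::fs (the real command goes through nushell's shell/filesystem layer): open in append mode and create the file if it is missing.

```rust
use std::fs::OpenOptions;
use std::io::Write;

// Append a line to `path`, creating the file first if it does not exist —
// previously the append path errored out on a missing file.
fn append_line(path: &str, line: &str) -> std::io::Result<()> {
    let mut file = OpenOptions::new()
        .append(true)
        .create(true) // the piece `save --append` was missing
        .open(path)?;
    writeln!(file, "{}", line)
}

fn main() -> std::io::Result<()> {
    append_line("file.txt", "a")
}
```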
91c270c14a fix markup (#4155) 2021-11-26 07:37:50 -06:00
3e93ae8af4 Correct spelling (#4152) 2021-11-25 11:11:20 -06:00
e06df124ca upgrading dependencies (#4135)
* upgrade dependencies
num-bigint 0.3.1 -> 0.4.3
bigdecimal-rs 0.2.1 -> bigdecimal 0.3.0
s3handler 0.7 -> 0.7.5
bat 0.18 -> 0.18, default-features = false

* upgrade arboard 1.1.0 -> 2.0.1

* in polars use comfy-table instead of prettytable-rs
the last release of prettytable-rs was `0.8.0 Sep 27, 2018`
and it uses `term 0.5` as a dependency

* upgrade dependencies

* upgrade trash -> 2.0.1

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-20 07:11:11 -06:00
JT
2590fcbe5c Bump to 0.40 (#4129) 2021-11-16 21:53:03 +13:00
JT
09691ff866 Delete docker-publish.yml 2021-11-16 14:19:35 +13:00
16db368232 upgrade polars to 0.17 (#4122) 2021-11-16 12:01:02 +13:00
JT
df87d90b8c Add 'detect columns' command (#4127)
* Add 'detect columns' command

* Fix warnings
2021-11-16 11:29:54 +13:00
f2f01b8a4d missed from_mp4, added back (#4128) 2021-11-15 16:19:44 -06:00
6c0190cd38 added upx and strip to mac and windows (#4126) 2021-11-15 15:32:48 -06:00
b26246bf12 trying upx and strip (#4125) 2021-11-15 15:01:25 -06:00
36a4effbb2 tweaked strip ci (#4124) 2021-11-15 14:30:32 -06:00
9fca417f8c update release to allow running manually (#4123) 2021-11-15 14:04:00 -06:00
d09e1148b2 add the ability to strip the debug symbols for smaller binaries on mac and linux 2021-11-15 13:47:46 -06:00
493bc2b1c9 Update README (#4118)
`winget install nu` fails because there are now other options matching "nu".
Using the full name `nushell` solved it for me.

[Imgur](https://imgur.com/aqz2qNp)
2021-11-14 19:34:57 +13:00
74b812228c upgrade dependencies (#4116)
* remove unused dependencies

* upgrade dependency bytes 0.5.6 -> 1.1.0

* upgrade dependency heapless 0.6.1 -> 0.7.8

* upgrade dependency image 0.22.4 -> 0.23.14

* upgrade dependency mp4 0.8.2 -> 0.9.0

* upgrade dependency bson 0.14.1 -> 2.0.1

Bson::Undefined, Bson::MaxKey, Bson::MinKey and Bson::DbPointer
weren't present in the previous version.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-14 19:32:21 +13:00
649b3804c1 fix: panic! during parsing (#4107)
Typing `selector -qa` into nu would cause a `panic!`. This happened because the
inner loop incremented `idx`, which was only checked in the outer loop, and then
used it to index into `lite_cmd.parts[idx]`.
With the fix we now break out of the loop.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-05 21:46:46 +13:00
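
A hedged sketch of the failure mode and fix described above (function and variable names are illustrative, not the actual parser code): an inner loop that advances `idx` must either re-check the bound or break before the next index access.

```rust
// Illustrative only: the inner loop bumps `idx`, so the code must break out
// before indexing `parts[idx]` past the end of the slice.
fn scan_parts(parts: &[&str]) {
    let mut idx = 0;
    while idx < parts.len() {
        // inner loop consuming consecutive flags, e.g. `-qa`
        while idx < parts.len() && parts[idx].starts_with('-') {
            println!("flag: {}", parts[idx]);
            idx += 1;
        }
        if idx >= parts.len() {
            break; // without this, the indexing below would panic
        }
        println!("argument: {}", parts[idx]);
        idx += 1;
    }
}

fn main() {
    scan_parts(&["selector", "-qa"]); // the kind of input that used to panic
}
```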
JT
df6a53f52e Update stale.yml (#4106) 2021-11-04 21:25:44 +13:00
JT
c4af5df828 Update stale.yml (#4102) 2021-10-31 16:48:58 +13:00
f94a3e15f5 Get rid of header bold option (#4076)
* refactor(options): get rid of 'header_bold' option

* docs(config): remove 'header_bold' from docs

* fix(options): replicate logic to apply true/false in bold

* style(options): apply lint fixes
2021-10-31 06:59:19 +13:00
75782f0f50 Fix #4070: Inconsistent file matching rule for ls and rm (#4099) 2021-10-28 15:05:07 +03:00
142 changed files with 1825 additions and 1785 deletions

View File

@ -16,7 +16,7 @@ strategy:
image: ubuntu-18.04
style: 'wasm'
macos-stable:
image: macos-10.14
image: macOS-10.15
style: 'unflagged'
windows-stable:
image: windows-2019

View File

@ -1,11 +1,11 @@
name: Bug Report
description: Create a report to help us improve
body:
body:
- type: textarea
id: description
attributes:
label: Describe the bug
description: A clear and concise description of what the bug is.
description: Thank you for your bug report. We are working diligently with our community to integrate our latest code base that we call [engine-q](https://github.com/nushell/engine-q). We would like your help with this by checking to see if this bug report is still needed in engine-q. Thank you for your patience while we ready the next version of nushell.
validations:
required: true
- type: textarea
@ -38,22 +38,20 @@ body:
id: config
attributes:
label: Configuration
description: "Please run `> version | pivot key value | to md` and paste the output to show OS, features, etc"
description: "Please run `version | pivot key value | to md --pretty` and paste the output to show OS, features, etc."
placeholder: |
> version | pivot key value | to md
╭───┬────────────────────┬───────────────────────────────────────────────────────────────────────╮
│ # │ key │ value │
├───┼────────────────────┼───────────────────────────────────────────────────────────────────────┤
│ 0 │ version │ 0.24.1
│ 1 │ build_os │ macos-x86_64 │
│ 2 │ rust_version │ rustc 1.48.0
│ 3 │ cargo_version │ cargo 1.48.0
│ 4 │ pkg_version │ 0.24.1
│ 5 │ build_time │ 2020-12-18 09:54:09 │
│ 6 │ build_rust_channel │ release
│ 7 │ features │ ctrlc, default, directories, dirs, git, ichwh, rich-benchmark,
│ │ │ rustyline, term, uuid, which, zip │
╰───┴────────────────────┴───────────────────────────────────────────────────────────────────────╯
> version | pivot key value | to md --pretty
| key | value |
| ------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| version | 0.40.0 |
| build_os | linux-x86_64 |
| rust_version | rustc 1.56.1 |
| cargo_version | cargo 1.56.0 |
| pkg_version | 0.40.0 |
| build_time | 1980-01-01 00:00:00 +00:00 |
| build_rust_channel | release |
| features | clipboard-cli, ctrlc, dataframe, default, rustyline, term, trash, uuid, which, zip |
| installed_plugins | binaryview, chart bar, chart line, fetch, from bson, from sqlite, inc, match, post, ps, query json, s3, selector, start, sys, textview, to bson, to sqlite, tree, xpath |
validations:
required: false
- type: textarea

View File

@ -5,7 +5,7 @@ body:
id: problem
attributes:
label: Related problem
description: Is your feature request related to a problem? Please describe.
description: Thank you for your feature request. We are working diligently with our community to integrate our latest code base that we call [engine-q](https://github.com/nushell/engine-q). We would like your help with this by checking to see if this feature request is still needed in engine-q. Thank you for your patience while we ready the next version of nushell.
placeholder: |
A clear and concise description of what the problem is.
Example: I am trying to do [...] but [...]

View File

@ -1,118 +0,0 @@
name: Publish consumable Docker images
on:
push:
tags: ['v?[0-9]+.[0-9]+.[0-9]+*']
jobs:
compile:
runs-on: ubuntu-latest
strategy:
matrix:
arch:
- x86_64-unknown-linux-musl
- x86_64-unknown-linux-gnu
steps:
- uses: actions/checkout@v2
- name: Install rust-embedded/cross
env: { VERSION: v0.1.16 }
run: >-
wget -nv https://github.com/rust-embedded/cross/releases/download/${VERSION}/cross-${VERSION}-x86_64-unknown-linux-gnu.tar.gz
-O- | sudo tar xz -C /usr/local/bin/
- name: compile for specific target
env: { arch: '${{ matrix.arch }}' }
run: |
cross build --target ${{ matrix.arch }} --release
# leave only the executable file
rm -frd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
find . -empty -delete
- uses: actions/upload-artifact@master
with:
name: ${{ matrix.arch }}
path: target/${{ matrix.arch }}/release
docker:
name: Build and publish docker images
needs: compile
runs-on: ubuntu-latest
env:
DOCKER_REGISTRY: quay.io/nushell
DOCKER_PASSWORD: ${{ secrets.DOCKER_REGISTRY }}
DOCKER_USER: ${{ secrets.DOCKER_USER }}
strategy:
matrix:
tag:
- alpine
- slim
- debian
- glibc-busybox
- musl-busybox
- musl-distroless
- glibc-distroless
- glibc
- musl
include:
- { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true, use-patch: false}
- { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: false}
- { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
steps:
- uses: actions/checkout@v2
- uses: actions/download-artifact@master
with: { name: '${{ matrix.arch }}', path: target/release }
- name: Build and publish exact version
run: |-
export DOCKER_TAG=${GITHUB_REF##*/}-${{ matrix.tag }}
export NU_BINS=target/release/$( [ ${{ matrix.plugin }} = true ] && echo nu* || echo nu )
export PATCH=$([ ${{ matrix.use-patch }} = true ] && echo .${{ matrix.tag }} || echo '')
chmod +x $NU_BINS
echo ${DOCKER_PASSWORD} | docker login ${DOCKER_REGISTRY} -u ${DOCKER_USER} --password-stdin
docker-compose --file docker/docker-compose.package.yml build
docker-compose --file docker/docker-compose.package.yml push # exact version
env:
BASE_IMAGE: ${{ matrix.base-image }}
#region semantics tagging
- name: Retag and push with suffixed version
run: |-
VERSION=${GITHUB_REF##*/}
latest_version=${VERSION%%%.*}-${{ matrix.tag }}
latest_feature=${VERSION%%.*}-${{ matrix.tag }}
latest_patch=${VERSION%.*}-${{ matrix.tag }}
exact_version=${VERSION}-${{ matrix.tag }}
tags=( ${latest_version} ${latest_feature} ${latest_patch} ${exact_version} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
docker push ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
- name: Retag and push debian as latest
if: matrix.tag == 'debian'
run: |-
VERSION=${GITHUB_REF##*/}
# ${latest features} ${latest patch} ${exact version}
tags=( ${VERSION%%.*} ${VERSION%.*} ${VERSION} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:latest
docker push ${DOCKER_REGISTRY}/nu:latest
#endregion semantics tagging

View File

@ -1,8 +1,9 @@
name: Create Release Draft
on:
workflow_dispatch:
push:
tags: ['[0-9]+.[0-9]+.[0-9]+*']
tags: ["[0-9]+.[0-9]+.[0-9]+*"]
jobs:
linux:
@ -28,6 +29,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu)
# run: strip target/release/nu
# - name: Strip binaries (nu_plugin_inc)
# run: strip target/release/nu_plugin_inc
# - name: Strip binaries (nu_plugin_match)
# run: strip target/release/nu_plugin_match
# - name: Strip binaries (nu_plugin_textview)
# run: strip target/release/nu_plugin_textview
# - name: Strip binaries (nu_plugin_binaryview)
# run: strip target/release/nu_plugin_binaryview
# - name: Strip binaries (nu_plugin_chart_bar)
# run: strip target/release/nu_plugin_chart_bar
# - name: Strip binaries (nu_plugin_chart_line)
# run: strip target/release/nu_plugin_chart_line
# - name: Strip binaries (nu_plugin_from_bson)
# run: strip target/release/nu_plugin_from_bson
# - name: Strip binaries (nu_plugin_from_sqlite)
# run: strip target/release/nu_plugin_from_sqlite
# - name: Strip binaries (nu_plugin_from_mp4)
# run: strip target/release/nu_plugin_from_mp4
# - name: Strip binaries (nu_plugin_query_json)
# run: strip target/release/nu_plugin_query_json
# - name: Strip binaries (nu_plugin_s3)
# run: strip target/release/nu_plugin_s3
# - name: Strip binaries (nu_plugin_selector)
# run: strip target/release/nu_plugin_selector
# - name: Strip binaries (nu_plugin_start)
# run: strip target/release/nu_plugin_start
# - name: Strip binaries (nu_plugin_to_bson)
# run: strip target/release/nu_plugin_to_bson
# - name: Strip binaries (nu_plugin_to_sqlite)
# run: strip target/release/nu_plugin_to_sqlite
# - name: Strip binaries (nu_plugin_tree)
# run: strip target/release/nu_plugin_tree
# - name: Strip binaries (nu_plugin_xpath)
# run: strip target/release/nu_plugin_xpath
- name: Create output directory
run: mkdir output
@ -70,6 +125,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu)
# run: strip target/release/nu
# - name: Strip binaries (nu_plugin_inc)
# run: strip target/release/nu_plugin_inc
# - name: Strip binaries (nu_plugin_match)
# run: strip target/release/nu_plugin_match
# - name: Strip binaries (nu_plugin_textview)
# run: strip target/release/nu_plugin_textview
# - name: Strip binaries (nu_plugin_binaryview)
# run: strip target/release/nu_plugin_binaryview
# - name: Strip binaries (nu_plugin_chart_bar)
# run: strip target/release/nu_plugin_chart_bar
# - name: Strip binaries (nu_plugin_chart_line)
# run: strip target/release/nu_plugin_chart_line
# - name: Strip binaries (nu_plugin_from_bson)
# run: strip target/release/nu_plugin_from_bson
# - name: Strip binaries (nu_plugin_from_sqlite)
# run: strip target/release/nu_plugin_from_sqlite
# - name: Strip binaries (nu_plugin_from_mp4)
# run: strip target/release/nu_plugin_from_mp4
# - name: Strip binaries (nu_plugin_query_json)
# run: strip target/release/nu_plugin_query_json
# - name: Strip binaries (nu_plugin_s3)
# run: strip target/release/nu_plugin_s3
# - name: Strip binaries (nu_plugin_selector)
# run: strip target/release/nu_plugin_selector
# - name: Strip binaries (nu_plugin_start)
# run: strip target/release/nu_plugin_start
# - name: Strip binaries (nu_plugin_to_bson)
# run: strip target/release/nu_plugin_to_bson
# - name: Strip binaries (nu_plugin_to_sqlite)
# run: strip target/release/nu_plugin_to_sqlite
# - name: Strip binaries (nu_plugin_tree)
# run: strip target/release/nu_plugin_tree
# - name: Strip binaries (nu_plugin_xpath)
# run: strip target/release/nu_plugin_xpath
- name: Create output directory
run: mkdir output
@ -106,7 +215,7 @@ jobs:
uses: actions-rs/cargo@v1
with:
command: install
args: cargo-wix
args: cargo-wix --version 0.3.1
- name: Build
uses: actions-rs/cargo@v1
@ -114,6 +223,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu.exe)
# run: strip target/release/nu.exe
# - name: Strip binaries (nu_plugin_inc.exe)
# run: strip target/release/nu_plugin_inc.exe
# - name: Strip binaries (nu_plugin_match.exe)
# run: strip target/release/nu_plugin_match.exe
# - name: Strip binaries (nu_plugin_textview.exe)
# run: strip target/release/nu_plugin_textview.exe
# - name: Strip binaries (nu_plugin_binaryview.exe)
# run: strip target/release/nu_plugin_binaryview.exe
# - name: Strip binaries (nu_plugin_chart_bar.exe)
# run: strip target/release/nu_plugin_chart_bar.exe
# - name: Strip binaries (nu_plugin_chart_line.exe)
# run: strip target/release/nu_plugin_chart_line.exe
# - name: Strip binaries (nu_plugin_from_bson.exe)
# run: strip target/release/nu_plugin_from_bson.exe
# - name: Strip binaries (nu_plugin_from_sqlite.exe)
# run: strip target/release/nu_plugin_from_sqlite.exe
# - name: Strip binaries (nu_plugin_from_mp4.exe)
# run: strip target/release/nu_plugin_from_mp4.exe
# - name: Strip binaries (nu_plugin_query_json.exe)
# run: strip target/release/nu_plugin_query_json.exe
# - name: Strip binaries (nu_plugin_s3.exe)
# run: strip target/release/nu_plugin_s3.exe
# - name: Strip binaries (nu_plugin_selector.exe)
# run: strip target/release/nu_plugin_selector.exe
# - name: Strip binaries (nu_plugin_start.exe)
# run: strip target/release/nu_plugin_start.exe
# - name: Strip binaries (nu_plugin_to_bson.exe)
# run: strip target/release/nu_plugin_to_bson.exe
# - name: Strip binaries (nu_plugin_to_sqlite.exe)
# run: strip target/release/nu_plugin_to_sqlite.exe
# - name: Strip binaries (nu_plugin_tree.exe)
# run: strip target/release/nu_plugin_tree.exe
# - name: Strip binaries (nu_plugin_xpath.exe)
# run: strip target/release/nu_plugin_xpath.exe
- name: Create output directory
run: mkdir output
@ -274,7 +437,7 @@ jobs:
with:
name: windows-installer
path: ./
- name: Upload Windows installer
uses: actions/upload-release-asset@v1
env:

View File

@ -19,12 +19,10 @@ jobs:
operations-per-run: 520
enable-statistics: true
repo-token: ${{ secrets.GITHUB_TOKEN }}
stale-issue-message: 'This issue is being marked stale because it has been open for 90 days without activity. If you feel that this is in error, please comment below and we will keep it marked as active. Otherwise, it will be closed in 10 days.'
stale-pr-message: 'This PR is being marked stale because it has been open for 45 days without activity. If this PR is still active, please comment below and we will keep it marked as active. Otherwise, it will be closed in 10 days.'
close-issue-message: 'This issue has been marked stale for more than 10 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
close-pr-message: 'This PR has been marked stale for more than 10 days without activity. Closing this PR, but if you are still working on it, please reopen.'
close-issue-message: 'This issue has been marked stale for more than 100000 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
close-pr-message: 'This PR has been marked stale for more than 100 days without activity. Closing this PR, but if you are still working on it, please reopen.'
days-before-issue-stale: 90
days-before-pr-stale: 45
days-before-issue-close: 10
days-before-pr-close: 10
days-before-issue-close: 100000
days-before-pr-close: 100
exempt-issue-labels: 'exempt,keep'

Cargo.lock (generated, 1366 changed lines)

File diff suppressed because it is too large.

View File

@ -10,7 +10,7 @@ license = "MIT"
name = "nu"
readme = "README.md"
repository = "https://github.com/nushell/nushell"
version = "0.39.0"
version = "0.43.0"
[workspace]
members = ["crates/*/"]
@ -18,34 +18,34 @@ members = ["crates/*/"]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
nu-cli = { version = "0.39.0", path="./crates/nu-cli", default-features=false }
nu-command = { version = "0.39.0", path="./crates/nu-command" }
nu-completion = { version = "0.39.0", path="./crates/nu-completion" }
nu-data = { version = "0.39.0", path="./crates/nu-data" }
nu-engine = { version = "0.39.0", path="./crates/nu-engine" }
nu-errors = { version = "0.39.0", path="./crates/nu-errors" }
nu-parser = { version = "0.39.0", path="./crates/nu-parser" }
nu-path = { version = "0.39.0", path="./crates/nu-path" }
nu-plugin = { version = "0.39.0", path="./crates/nu-plugin" }
nu-protocol = { version = "0.39.0", path="./crates/nu-protocol" }
nu-source = { version = "0.39.0", path="./crates/nu-source" }
nu-value-ext = { version = "0.39.0", path="./crates/nu-value-ext" }
nu-cli = { version = "0.43.0", path="./crates/nu-cli", default-features=false }
nu-command = { version = "0.43.0", path="./crates/nu-command" }
nu-completion = { version = "0.43.0", path="./crates/nu-completion" }
nu-data = { version = "0.43.0", path="./crates/nu-data" }
nu-engine = { version = "0.43.0", path="./crates/nu-engine" }
nu-errors = { version = "0.43.0", path="./crates/nu-errors" }
nu-parser = { version = "0.43.0", path="./crates/nu-parser" }
nu-path = { version = "0.43.0", path="./crates/nu-path" }
nu-plugin = { version = "0.43.0", path="./crates/nu-plugin" }
nu-protocol = { version = "0.43.0", path="./crates/nu-protocol" }
nu-source = { version = "0.43.0", path="./crates/nu-source" }
nu-value-ext = { version = "0.43.0", path="./crates/nu-value-ext" }
nu_plugin_binaryview = { version = "0.39.0", path="./crates/nu_plugin_binaryview", optional=true }
nu_plugin_chart = { version = "0.39.0", path="./crates/nu_plugin_chart", optional=true }
nu_plugin_from_bson = { version = "0.39.0", path="./crates/nu_plugin_from_bson", optional=true }
nu_plugin_from_sqlite = { version = "0.39.0", path="./crates/nu_plugin_from_sqlite", optional=true }
nu_plugin_inc = { version = "0.39.0", path="./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.39.0", path="./crates/nu_plugin_match", optional=true }
nu_plugin_query_json = { version = "0.39.0", path="./crates/nu_plugin_query_json", optional=true }
nu_plugin_s3 = { version = "0.39.0", path="./crates/nu_plugin_s3", optional=true }
nu_plugin_selector = { version = "0.39.0", path="./crates/nu_plugin_selector", optional=true }
nu_plugin_start = { version = "0.39.0", path="./crates/nu_plugin_start", optional=true }
nu_plugin_textview = { version = "0.39.0", path="./crates/nu_plugin_textview", optional=true }
nu_plugin_to_bson = { version = "0.39.0", path="./crates/nu_plugin_to_bson", optional=true }
nu_plugin_to_sqlite = { version = "0.39.0", path="./crates/nu_plugin_to_sqlite", optional=true }
nu_plugin_tree = { version = "0.39.0", path="./crates/nu_plugin_tree", optional=true }
nu_plugin_xpath = { version = "0.39.0", path="./crates/nu_plugin_xpath", optional=true }
nu_plugin_binaryview = { version = "0.43.0", path="./crates/nu_plugin_binaryview", optional=true }
nu_plugin_chart = { version = "0.43.0", path="./crates/nu_plugin_chart", optional=true }
nu_plugin_from_bson = { version = "0.43.0", path="./crates/nu_plugin_from_bson", optional=true }
nu_plugin_from_sqlite = { version = "0.43.0", path="./crates/nu_plugin_from_sqlite", optional=true }
nu_plugin_inc = { version = "0.43.0", path="./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.43.0", path="./crates/nu_plugin_match", optional=true }
nu_plugin_query_json = { version = "0.43.0", path="./crates/nu_plugin_query_json", optional=true }
nu_plugin_s3 = { version = "0.43.0", path="./crates/nu_plugin_s3", optional=true }
nu_plugin_selector = { version = "0.43.0", path="./crates/nu_plugin_selector", optional=true }
nu_plugin_start = { version = "0.43.0", path="./crates/nu_plugin_start", optional=true }
nu_plugin_textview = { version = "0.43.0", path="./crates/nu_plugin_textview", optional=true }
nu_plugin_to_bson = { version = "0.43.0", path="./crates/nu_plugin_to_bson", optional=true }
nu_plugin_to_sqlite = { version = "0.43.0", path="./crates/nu_plugin_to_sqlite", optional=true }
nu_plugin_tree = { version = "0.43.0", path="./crates/nu_plugin_tree", optional=true }
nu_plugin_xpath = { version = "0.43.0", path="./crates/nu_plugin_xpath", optional=true }
# Required to bootstrap the main binary
ctrlc = { version="3.1.7", optional=true }
@ -53,7 +53,7 @@ futures = { version="0.3.12", features=["compat", "io-compat"] }
itertools = "0.10.0"
[dev-dependencies]
nu-test-support = { version = "0.39.0", path="./crates/nu-test-support" }
nu-test-support = { version = "0.43.0", path="./crates/nu-test-support" }
serial_test = "0.5.1"
hamcrest2 = "0.3.0"
rstest = "0.10.0"
@ -89,7 +89,6 @@ extra = [
"inc",
"tree",
"textview",
"clipboard-cli",
"trash-support",
"uuid-support",
"start",
@ -113,7 +112,6 @@ textview = ["nu_plugin_textview"]
binaryview = ["nu_plugin_binaryview"]
bson = ["nu_plugin_from_bson", "nu_plugin_to_bson"]
chart = ["nu_plugin_chart"]
clipboard-cli = ["nu-command/clipboard-cli"]
query-json = ["nu_plugin_query_json"]
s3 = ["nu_plugin_s3"]
selector = ["nu_plugin_selector"]

View File

@ -68,7 +68,7 @@ cargo install nu
To install Nu via the [Windows Package Manager](https://aka.ms/winget-cli):
```shell
winget install nu
winget install nushell
```
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/installation.html#dependencies) for your platform), once you have checked out this repo with git:

View File

@ -10,4 +10,4 @@ Foundational libraries are split into two kinds of crates:
Plugins are likewise also split into two types:
* Core plugins - plugins that provide part of the default experience of Nu, including access to the system properties, processes, and web-connectivity features.
* Extra plugins - these plugins run a wide range of differnt capabilities like working with different file types, charting, viewing binary data, and more.
* Extra plugins - these plugins run a wide range of different capabilities like working with different file types, charting, viewing binary data, and more.

View File

@ -9,7 +9,7 @@ description = "Library for ANSI terminal colors and styles (bold, underline)"
edition = "2018"
license = "MIT"
name = "nu-ansi-term"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false

View File

@ -231,7 +231,6 @@
#![crate_name = "nu_ansi_term"]
#![crate_type = "rlib"]
#![crate_type = "dylib"]
#![warn(missing_copy_implementations)]
// #![warn(missing_docs)]
#![warn(trivial_casts, trivial_numeric_casts)]

View File

@ -613,7 +613,7 @@ mod serde_json_tests {
let serialized = serde_json::to_string(&color).unwrap();
let deserialized: Color = serde_json::from_str(&serialized).unwrap();
assert_eq!(color, &deserialized);
assert_eq!(color, deserialized);
}
}

View File

@ -4,24 +4,24 @@ description = "CLI for nushell"
edition = "2018"
license = "MIT"
name = "nu-cli"
version = "0.39.0"
version = "0.43.0"
build = "build.rs"
[lib]
doctest = false
[dependencies]
nu-completion = { version = "0.39.0", path="../nu-completion" }
nu-command = { version = "0.39.0", path="../nu-command" }
nu-data = { version = "0.39.0", path="../nu-data" }
nu-engine = { version = "0.39.0", path="../nu-engine" }
nu-errors = { version = "0.39.0", path="../nu-errors" }
nu-parser = { version = "0.39.0", path="../nu-parser" }
nu-protocol = { version = "0.39.0", path="../nu-protocol" }
nu-source = { version = "0.39.0", path="../nu-source" }
nu-stream = { version = "0.39.0", path="../nu-stream" }
nu-ansi-term = { version = "0.39.0", path="../nu-ansi-term" }
nu-path = { version = "0.39.0", path="../nu-path" }
nu-completion = { version = "0.43.0", path="../nu-completion" }
nu-command = { version = "0.43.0", path="../nu-command" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-engine = { version = "0.43.0", path="../nu-engine" }
nu-errors = { version = "0.43.0", path="../nu-errors" }
nu-parser = { version = "0.43.0", path="../nu-parser" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-stream = { version = "0.43.0", path="../nu-stream" }
nu-ansi-term = { version = "0.43.0", path="../nu-ansi-term" }
nu-path = { version = "0.43.0", path="../nu-path" }
indexmap ="1.6.1"
log = "0.4.14"
@ -29,13 +29,13 @@ pretty_env_logger = "0.4.0"
strip-ansi-escapes = "0.1.0"
rustyline = { version="9.0.0", optional=true }
ctrlc = { version="3.1.7", optional=true }
shadow-rs = { version="0.6", default-features=false, optional=true }
shadow-rs = { version = "0.8.1", default-features = false, optional = true }
serde = { version="1.0.123", features=["derive"] }
serde_yaml = "0.8.16"
lazy_static = "1.4.0"
[build-dependencies]
shadow-rs = "0.6"
shadow-rs = "0.8.1"
[features]
default = ["shadow-rs"]

View File

@ -513,11 +513,6 @@ mod tests {
let args = format!("nu --loglevel={}", level);
ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
let ui = cli_app();
let args = format!("nu -l {}", level);
ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
}
let ui = cli_app();
@ -530,6 +525,17 @@ mod tests {
Ok(())
}
#[test]
fn can_be_login() -> Result<(), ShellError> {
let ui = cli_app();
ui.parse("nu -l")?;
let ui = cli_app();
ui.parse("nu --login")?;
Ok(())
}
#[test]
fn can_be_passed_nu_scripts() -> Result<(), ShellError> {
let ui = cli_app();

View File

@ -91,10 +91,10 @@ pub fn run_script_file(
fn default_prompt_string(cwd: &str) -> String {
format!(
"{}{}{}{}{}{}> ",
Color::Green.bold().prefix().to_string(),
Color::Green.bold().prefix(),
cwd,
nu_ansi_term::ansi::RESET,
Color::Cyan.bold().prefix().to_string(),
Color::Cyan.bold().prefix(),
current_branch(),
nu_ansi_term::ansi::RESET
)

View File

@ -1,39 +1,38 @@
[package]
authors = ["The Nu Project Contributors"]
build = "build.rs"
description = "CLI for nushell"
description = "Commands for Nushell"
edition = "2018"
license = "MIT"
name = "nu-command"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
nu-data = { version = "0.39.0", path="../nu-data" }
nu-engine = { version = "0.39.0", path="../nu-engine" }
nu-errors = { version = "0.39.0", path="../nu-errors" }
nu-json = { version = "0.39.0", path="../nu-json" }
nu-path = { version = "0.39.0", path="../nu-path" }
nu-parser = { version = "0.39.0", path="../nu-parser" }
nu-plugin = { version = "0.39.0", path="../nu-plugin" }
nu-protocol = { version = "0.39.0", path="../nu-protocol" }
nu-serde = { version = "0.39.0", path="../nu-serde" }
nu-source = { version = "0.39.0", path="../nu-source" }
nu-stream = { version = "0.39.0", path="../nu-stream" }
nu-table = { version = "0.39.0", path="../nu-table" }
nu-test-support = { version = "0.39.0", path="../nu-test-support" }
nu-value-ext = { version = "0.39.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.39.0", path="../nu-ansi-term" }
nu-pretty-hex = { version = "0.39.0", path="../nu-pretty-hex" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-engine = { version = "0.43.0", path="../nu-engine" }
nu-errors = { version = "0.43.0", path="../nu-errors" }
nu-json = { version = "0.43.0", path="../nu-json" }
nu-path = { version = "0.43.0", path="../nu-path" }
nu-parser = { version = "0.43.0", path="../nu-parser" }
nu-plugin = { version = "0.43.0", path="../nu-plugin" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-serde = { version = "0.43.0", path="../nu-serde" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-stream = { version = "0.43.0", path="../nu-stream" }
nu-table = { version = "0.43.0", path="../nu-table" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
nu-value-ext = { version = "0.43.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.43.0", path="../nu-ansi-term" }
nu-pretty-hex = { version = "0.43.0", path="../nu-pretty-hex" }
url = "2.2.1"
mime = "0.3.16"
Inflector = "0.11"
arboard = { version="1.1.0", optional=true }
heck = "0.4.0"
base64 = "0.13.0"
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
bigdecimal = { version = "0.3.0", features = ["serde"] }
calamine = "0.18.0"
chrono = { version="0.4.19", features=["serde"] }
chrono-tz = "0.5.3"
@ -56,7 +55,7 @@ lazy_static = "1.*"
log = "0.4.14"
md-5 = "0.9.1"
meval = "0.2.0"
num-bigint = { version="0.3.1", features=["serde"] }
num-bigint = { version="0.4.3", features=["serde"] }
num-format = { version="0.4.0", features=["with-num-bigint"] }
num-traits = "0.2.14"
parking_lot = "0.11.1"
@ -74,14 +73,14 @@ serde_urlencoded = "0.7.0"
serde_yaml = "0.8.16"
sha2 = "0.9.3"
strip-ansi-escapes = "0.1.0"
sysinfo = { version = "0.20.2", optional = true }
sysinfo = { version = "0.23.0", optional = true }
thiserror = "1.0.26"
term = { version="0.7.0", optional=true }
term_size = "0.3.2"
titlecase = "1.1.0"
tokio = { version = "1", features = ["rt-multi-thread"], optional = true }
toml = "0.5.8"
trash = { version="1.3.0", optional=true }
trash = { version = "2.0.2", optional = true }
unicode-segmentation = "1.8"
uuid_crate = { package="uuid", version="0.8.2", features=["v4"], optional=true }
which = { version="4.1.0", optional=true }
@ -89,9 +88,10 @@ zip = { version="0.5.9", optional=true }
digest = "0.9.0"
[dependencies.polars]
version = "0.16.0"
version = "0.17.0"
optional = true
features = ["parquet", "json", "random", "pivot", "strings", "is_in", "temporal", "cum_agg", "rolling_window"]
default-features = false
features = ["docs", "zip_with", "csv-file", "temporal", "performant", "pretty_fmt", "dtype-slim", "parquet", "json", "random", "pivot", "strings", "is_in", "cum_agg", "rolling_window"]
[target.'cfg(unix)'.dependencies]
umask = "1.0.0"
@ -104,7 +104,7 @@ users = "0.11.0"
# num-format = { version = "0.4", features = ["with-system-locale"] }
[build-dependencies]
shadow-rs = "0.6"
shadow-rs = "0.8.1"
[dev-dependencies]
quickcheck = "1.0.3"
@ -112,7 +112,6 @@ quickcheck_macros = "1.0.0"
hamcrest2 = "0.3.0"
[features]
clipboard-cli = ["arboard"]
rustyline-support = ["rustyline"]
stable = []
trash-support = ["trash"]

View File

@ -1,7 +0,0 @@
use derive_new::new;
use nu_protocol::hir;
#[derive(new, Debug)]
pub(crate) struct Command {
pub(crate) args: hir::Call,
}

View File

@ -1,8 +1,10 @@
use crate::prelude::*;
use lazy_static::lazy_static;
use nu_engine::{evaluate_baseline_expr, BufCodecReader};
use nu_engine::{MaybeTextCodec, StringOrBinary};
use nu_test_support::NATIVE_PATH_ENV_VAR;
use parking_lot::Mutex;
use regex::Regex;
#[allow(unused)]
use std::env;
@ -44,20 +46,16 @@ pub(crate) fn run_external_command(
}
#[allow(unused)]
fn trim_double_quotes(input: &str) -> String {
fn trim_enclosing_quotes(input: &str) -> String {
let mut chars = input.chars();
match (chars.next(), chars.next_back()) {
(Some('"'), Some('"')) => chars.collect(),
(Some('\''), Some('\'')) => chars.collect(),
_ => input.to_string(),
}
}
#[allow(unused)]
fn escape_where_needed(input: &str) -> String {
input.split(' ').join("\\ ").split('\'').join("\\'")
}
fn run_with_stdin(
command: ExternalCommand,
context: &mut EvaluationContext,
@ -115,15 +113,9 @@ fn run_with_stdin(
#[cfg(not(windows))]
{
if !_is_literal {
let escaped = escape_double_quotes(&arg);
add_double_quotes(&escaped)
arg
} else {
let trimmed = trim_double_quotes(&arg);
if trimmed != arg {
escape_where_needed(&trimmed)
} else {
trimmed
}
trim_enclosing_quotes(&arg)
}
}
#[cfg(windows)]
@ -131,7 +123,7 @@ fn run_with_stdin(
if let Some(unquoted) = remove_quotes(&arg) {
unquoted.to_string()
} else {
arg.to_string()
arg
}
}
})
@ -172,9 +164,29 @@ fn spawn_cmd_command(command: &ExternalCommand, args: &[String]) -> Command {
process
}
fn has_unsafe_shell_characters(arg: &str) -> bool {
lazy_static! {
static ref RE: Regex = Regex::new(r"[^\w@%+=:,./-]").expect("regex to be valid");
}
RE.is_match(arg)
}
fn shell_arg_escape(arg: &str) -> String {
match arg {
"" => String::from("''"),
s if !has_unsafe_shell_characters(s) => String::from(s),
_ => {
let single_quotes_escaped = arg.split('\'').join("'\"'\"'");
format!("'{}'", single_quotes_escaped)
}
}
}
/// Spawn a sh command with `sh -c args...`
fn spawn_sh_command(command: &ExternalCommand, args: &[String]) -> Command {
let cmd_with_args = vec![command.name.clone(), args.join(" ")].join(" ");
let joined_and_escaped_arguments = args.iter().map(|arg| shell_arg_escape(arg)).join(" ");
let cmd_with_args = vec![command.name.clone(), joined_and_escaped_arguments].join(" ");
let mut process = Command::new("sh");
process.arg("-c").arg(cmd_with_args);
process

View File

@ -1,5 +1 @@
mod dynamic;
pub(crate) mod external;
#[allow(unused_imports)]
pub(crate) use dynamic::Command as DynamicCommand;

View File

@ -38,7 +38,7 @@ impl WholeStreamCommand for SubCommand {
},
Example {
description: "Set coloring options",
example: "config set color_config [[header_align header_bold]; [left $true]]",
example: "config set color_config [[header_align header_color]; [left white_bold]]",
result: None,
},
Example {

View File

@ -7,14 +7,14 @@ pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"into column_path"
"into column-path"
}
fn signature(&self) -> Signature {
Signature::build("into column_path").rest(
Signature::build("into column-path").rest(
"rest",
SyntaxShape::ColumnPath,
"values to convert to column_path",
"values to convert to column path",
)
}
@ -29,8 +29,8 @@ impl WholeStreamCommand for SubCommand {
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Convert string to column_path in table",
example: "echo [[name]; ['/dev/null'] ['C:\\Program Files'] ['../../Cargo.toml']] | into column_path name",
description: "Convert string to column path in table",
example: "echo [[name]; ['/dev/null'] ['C:\\Program Files'] ['../../Cargo.toml']] | into column-path name",
result: Some(vec![
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("/dev/null", Span::unknown()).into(),
@ -47,8 +47,8 @@ impl WholeStreamCommand for SubCommand {
]),
},
Example {
description: "Convert string to column_path",
example: "echo 'Cargo.toml' | into column_path",
description: "Convert string to column path",
example: "echo 'Cargo.toml' | into column-path",
result: Some(vec![UntaggedValue::column_path("Cargo.toml", Span::unknown()).into()]),
},
]
@ -86,7 +86,7 @@ pub fn action(input: &Value, tag: impl Into<Tag>) -> Result<Value, ShellError> {
Primitive::String(a_string) => a_string,
_ => {
return Err(ShellError::unimplemented(
"'into column_path' for non-string primitives",
"'into column-path' for non-string primitives",
))
}
},
@ -94,12 +94,12 @@ pub fn action(input: &Value, tag: impl Into<Tag>) -> Result<Value, ShellError> {
)
.into_value(&tag)),
UntaggedValue::Row(_) => Err(ShellError::labeled_error(
"specify column name to use, with 'into column_path COLUMN'",
"specify column name to use, with 'into column-path COLUMN'",
"found table",
tag,
)),
_ => Err(ShellError::unimplemented(
"'into column_path' for unsupported type",
"'into column-path' for unsupported type",
)),
}
}

View File

@ -15,6 +15,7 @@ impl WholeStreamCommand for Command {
.switch("skip-plugins", "do not load plugins", None)
.switch("no-history", "don't save history", None)
.switch("perf", "show startup performance metrics", None)
.switch("login", "start Nu as if it was a login shell", Some('l'))
.named(
"commands",
SyntaxShape::String,
@ -33,7 +34,7 @@ impl WholeStreamCommand for Command {
"loglevel",
SyntaxShape::String,
"LEVEL: error, warn, info, debug, trace",
Some('l'),
None,
)
.named(
"config-file",

View File

@ -93,7 +93,7 @@ pub fn source(args: CommandArgs) -> Result<OutputStream, ShellError> {
let path = canonicalize(source_file).map_err(|e| {
ShellError::labeled_error(
format!("Can't load source file. Reason: {}", e.to_string()),
format!("Can't load source file. Reason: {}", e),
"Can't load this file",
filename.span(),
)
@ -112,7 +112,7 @@ pub fn source(args: CommandArgs) -> Result<OutputStream, ShellError> {
}
Err(e) => {
ctx.error(ShellError::labeled_error(
format!("Can't load source file. Reason: {}", e.to_string()),
format!("Can't load source file. Reason: {}", e),
"Can't load this file",
filename.span(),
));

View File

@ -81,6 +81,7 @@ fn tutor(args: CommandArgs) -> Result<OutputStream, ShellError> {
vec!["var", "vars", "variable", "variables"],
variable_tutor(),
),
(vec!["engine-q", "e-q"], engineq_tutor()),
(vec!["block", "blocks"], block_tutor()),
(vec!["shorthand", "shorthands"], shorthand_tutor()),
];
@ -370,6 +371,29 @@ same value using:
"#
}
fn engineq_tutor() -> &'static str {
r#"
Engine-q is the upcoming engine for Nushell. Build for speed and correctness,
it also comes with a set of changes from Nushell versions prior to 0.60. To
get ready for engine-q look for some of these changes that might impact your
current scripts:
* Engine-q now uses a few new data structures, including a record syntax
that allows you to model key-value pairs similar to JSON objects.
* Environment variables can now contain more than just strings. Structured
values are converted to strings for external commands using converters.
* `if` will now use an `else` keyword before the else block.
* We're moving from "config.toml" to "config.nu". This means startup will
now be a script file.
* `config` and its subcommands are being replaced by a record that you can
update in the shell which contains all the settings under the variable
`$config`.
* bigint/bigdecimal values are now machine i64 and f64 values
* And more, you can read more about upcoming changes in the up-to-date list
at: https://github.com/nushell/engine-q/issues/522
"#
}
fn display(tag: Tag, scope: &Scope, help: &str) -> OutputStream {
let help = help.split('`');

View File

@ -221,11 +221,6 @@ fn features_enabled() -> Vec<String> {
names.push("zip".to_string());
}
#[cfg(feature = "clipboard-cli")]
{
names.push("clipboard-cli".to_string());
}
#[cfg(feature = "trash-support")]
{
names.push("trash".to_string());

View File

@ -121,7 +121,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tail = df.as_ref().get_columns().iter().map(|col| {
let count = col.len() as f64;
let sum = match col.sum_as_series().cast_with_dtype(&DataType::Float64) {
let sum = match col.sum_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -144,7 +144,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
_ => None,
};
let min = match col.min_as_series().cast_with_dtype(&DataType::Float64) {
let min = match col.min_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -153,7 +153,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_25 = match col.quantile_as_series(0.25) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -164,7 +164,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_50 = match col.quantile_as_series(0.50) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -175,7 +175,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_75 = match col.quantile_as_series(0.75) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -185,7 +185,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
Err(_) => None,
};
let max = match col.max_as_series().cast_with_dtype(&DataType::Float64) {
let max = match col.max_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,

View File

@ -44,6 +44,12 @@ impl WholeStreamCommand for DataFrame {
"type of join. Inner by default",
Some('t'),
)
.named(
"suffix",
SyntaxShape::String,
"suffix for the columns of the right dataframe",
Some('s'),
)
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@ -104,6 +110,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let r_df: Value = args.req(0)?;
let l_col: Vec<Value> = args.req_named("left")?;
let r_col: Vec<Value> = args.req_named("right")?;
let r_suffix: Option<Tagged<String>> = args.get_flag("suffix")?;
let join_type_op: Option<Tagged<String>> = args.get_flag("type")?;
let join_type = match join_type_op {
@ -124,6 +131,8 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
},
};
let suffix = r_suffix.map(|s| s.item);
let (l_col_string, l_col_span) = convert_columns(&l_col, &tag)?;
let (r_col_string, r_col_span) = convert_columns(&r_col, &tag)?;
@ -142,7 +151,13 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
)?;
df.as_ref()
.join(r_df.as_ref(), &l_col_string, &r_col_string, join_type)
.join(
r_df.as_ref(),
&l_col_string,
&r_col_string,
join_type,
suffix,
)
.map_err(|e| parse_polars_error::<&str>(&e, &l_col_span, None))
}
_ => Err(ShellError::labeled_error(

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.day().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.hour().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.minute().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.month().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.nanosecond().into_series();

View File

@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.ordinal().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.second().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.week().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.weekday().into_series();

View File

@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.year().into_series();

View File

@ -6,7 +6,7 @@ use nu_protocol::{
Signature, SyntaxShape, UntaggedValue,
};
use nu_source::Tagged;
use polars::prelude::DataType;
use polars::prelude::{DataType, RollingOptions};
enum RollType {
Min,
@ -57,7 +57,6 @@ impl WholeStreamCommand for DataFrame {
Signature::build("dataframe rolling")
.required("type", SyntaxShape::String, "rolling operation")
.required("window", SyntaxShape::Int, "Window size for rolling")
.switch("ignore_nulls", "Ignore nulls in column", Some('i'))
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@ -112,7 +111,6 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let roll_type: Tagged<String> = args.req(0)?;
let window_size: Tagged<i64> = args.req(1)?;
let ignore_nulls = args.has_flag("ignore_nulls");
let (df, df_tag) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
let series = df.as_series(&df_tag.span)?;
@ -126,31 +124,17 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
}
let roll_type = RollType::from_str(&roll_type.item, &roll_type.tag.span)?;
let rolling_opts = RollingOptions {
window_size: window_size.item as usize,
min_periods: window_size.item as usize,
weights: None,
center: false,
};
let res = match roll_type {
RollType::Max => series.rolling_max(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Min => series.rolling_min(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Sum => series.rolling_sum(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Mean => series.rolling_mean(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Max => series.rolling_max(rolling_opts),
RollType::Min => series.rolling_min(rolling_opts),
RollType::Sum => series.rolling_sum(rolling_opts),
RollType::Mean => series.rolling_mean(rolling_opts),
};
let mut res = res.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;

View File

@ -78,7 +78,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let casted = match indices.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => indices
.as_ref()
.cast_with_dtype(&DataType::UInt32)
.cast(&DataType::UInt32)
.map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
_ => Err(ShellError::labeled_error_with_secondary(
"Incorrect type",

View File

@ -58,7 +58,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.strftime(&fmt.item).into_series();

View File

@ -92,7 +92,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let casted = match series.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => series
.as_ref()
.cast_with_dtype(&DataType::UInt32)
.cast(&DataType::UInt32)
.map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
_ => Err(ShellError::labeled_error_with_secondary(
"Incorrect type",

View File

@ -73,9 +73,9 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let writer = CsvWriter::new(&mut file);
let writer = if no_header {
writer.has_headers(false)
writer.has_header(false)
} else {
writer.has_headers(true)
writer.has_header(true)
};
let writer = match delimiter {

View File

@ -11,12 +11,6 @@ use nu_protocol::{
pub struct WithEnv;
#[derive(Deserialize, Debug)]
struct WithEnvArgs {
variable: Value,
block: CapturedBlock,
}
impl WholeStreamCommand for WithEnv {
fn name(&self) -> &str {
"with-env"

View File

@ -197,7 +197,7 @@ fn process_row(
} else {
let mut obj = input.clone();
for column in column_paths.clone() {
for column in column_paths {
let path = UntaggedValue::Primitive(Primitive::ColumnPath(column.clone()))
.into_value(tag);
let data = r.get_data(&as_string(&path)?).borrow().clone();

View File

@ -163,7 +163,7 @@ fn get_current_date() -> (i32, u32, u32) {
fn add_months_of_year_to_table(
args: &CommandArgs,
mut calendar_vec_deque: &mut VecDeque<Value>,
calendar_vec_deque: &mut VecDeque<Value>,
tag: &Tag,
selected_year: i32,
(start_month, end_month): (u32, u32),
@ -181,7 +181,7 @@ fn add_months_of_year_to_table(
let add_month_to_table_result = add_month_to_table(
args,
&mut calendar_vec_deque,
calendar_vec_deque,
tag,
selected_year,
month_number,


@ -36,6 +36,7 @@ impl WholeStreamCommand for Command {
Some('p'),
)
.switch("raw", "fetch contents as text rather than a table", Some('r'))
.switch("insecure", "allow insecure server connections when using SSL", Some('k'))
.filter()
}
@ -78,6 +79,7 @@ fn run_fetch(args: CommandArgs) -> Result<ActionStream, ShellError> {
)
})?,
fetch_helper.has_raw,
fetch_helper.has_insecure,
fetch_helper.user.clone(),
fetch_helper.password,
))]
@ -92,6 +94,7 @@ pub struct Fetch {
pub path: Option<Value>,
pub tag: Tag,
pub has_raw: bool,
pub has_insecure: bool,
pub user: Option<String>,
pub password: Option<String>,
}
@ -102,6 +105,7 @@ impl Fetch {
path: None,
tag: Tag::unknown(),
has_raw: false,
has_insecure: false,
user: None,
password: None,
}
@ -121,6 +125,8 @@ impl Fetch {
self.has_raw = args.has_flag("raw");
self.has_insecure = args.has_flag("insecure");
self.user = args.get_flag("user")?;
self.password = args.get_flag("password")?;
@ -132,13 +138,14 @@ impl Fetch {
pub async fn fetch(
path: &Value,
has_raw: bool,
has_insecure: bool,
user: Option<String>,
password: Option<String>,
) -> ReturnValue {
let path_str = path.as_string()?;
let path_span = path.tag.span;
let result = helper(&path_str, path_span, has_raw, user, password).await;
let result = helper(&path_str, path_span, has_raw, has_insecure, user, password).await;
if let Err(e) = result {
return Err(e);
@ -168,6 +175,7 @@ async fn helper(
location: &str,
span: Span,
has_raw: bool,
has_insecure: bool,
user: Option<String>,
password: Option<String>,
) -> std::result::Result<(Option<String>, Value), ShellError> {
@ -188,7 +196,7 @@ async fn helper(
_ => None,
};
let client = http_client();
let client = http_client(has_insecure);
let mut request = client.get(url);
if let Some(login) = login {
@ -360,10 +368,10 @@ async fn helper(
// Only panics if the user agent is invalid but we define it statically so either
// it always or never fails
#[allow(clippy::unwrap_used)]
fn http_client() -> reqwest::Client {
fn http_client(allow_insecure: bool) -> reqwest::Client {
reqwest::Client::builder()
.user_agent("nushell")
.danger_accept_invalid_certs(allow_insecure)
.build()
.unwrap()
.expect("Failed to build reqwest client")
}
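
For readers tracing the new --insecure (-k) flag: it is a single boolean handed to reqwest's builder, whose danger_accept_invalid_certs switch turns off certificate verification. A minimal sketch of the builder plus an async caller (assuming an async runtime such as tokio, which the async fetch path already implies; names are illustrative):

// Build a reqwest client that optionally skips TLS certificate verification.
fn insecure_capable_client(allow_insecure: bool) -> reqwest::Client {
    reqwest::Client::builder()
        .user_agent("nushell")
        .danger_accept_invalid_certs(allow_insecure)
        .build()
        .expect("Failed to build reqwest client")
}

// Fetch a URL as text; pass `true` only when the user explicitly asked for -k.
async fn fetch_text(url: &str, allow_insecure: bool) -> Result<String, reqwest::Error> {
    insecure_capable_client(allow_insecure)
        .get(url)
        .send()
        .await?
        .text()
        .await
}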


@ -53,6 +53,11 @@ impl WholeStreamCommand for Command {
"return values as a string instead of a table",
Some('r'),
)
.switch(
"insecure",
"allow insecure server connections when using SSL",
Some('k'),
)
.filter()
}
@ -91,6 +96,7 @@ fn run_post(args: CommandArgs) -> Result<ActionStream, ShellError> {
ShellError::labeled_error("expected a 'path'", "expected a 'path'", &helper.tag)
})?,
helper.has_raw,
helper.has_insecure,
&helper.body.clone().ok_or_else(|| {
ShellError::labeled_error("expected a 'body'", "expected a 'body'", &helper.tag)
})?,
@ -114,6 +120,7 @@ pub enum HeaderKind {
pub struct Post {
pub path: Option<Value>,
pub has_raw: bool,
pub has_insecure: bool,
pub body: Option<Value>,
pub user: Option<String>,
pub password: Option<String>,
@ -126,6 +133,7 @@ impl Post {
Post {
path: None,
has_raw: false,
has_insecure: false,
body: None,
user: None,
password: None,
@ -156,6 +164,8 @@ impl Post {
self.has_raw = args.has_flag("raw");
self.has_insecure = args.has_flag("insecure");
self.user = args.get_flag("user")?;
self.password = args.get_flag("password")?;
@ -169,6 +179,7 @@ impl Post {
pub async fn post_helper(
path: &Value,
has_raw: bool,
has_insecure: bool,
body: &Value,
user: Option<String>,
password: Option<String>,
@ -177,8 +188,16 @@ pub async fn post_helper(
let path_tag = path.tag.clone();
let path_str = path.as_string()?;
let (file_extension, contents, contents_tag) =
post(&path_str, body, user, password, headers, path_tag.clone()).await?;
let (file_extension, contents, contents_tag) = post(
&path_str,
has_insecure,
body,
user,
password,
headers,
path_tag.clone(),
)
.await?;
let file_extension = if has_raw {
None
@ -202,6 +221,7 @@ pub async fn post_helper(
pub async fn post(
location: &str,
allow_insecure: bool,
body: &Value,
user: Option<String>,
password: Option<String>,
@ -219,7 +239,9 @@ pub async fn post(
value: UntaggedValue::Primitive(Primitive::String(body_str)),
..
} => {
let mut s = http_client().post(location).body(body_str.to_string());
let mut s = http_client(allow_insecure)
.post(location)
.body(body_str.to_string());
if let Some(login) = login {
s = s.header("Authorization", format!("Basic {}", login));
}
@ -237,7 +259,9 @@ pub async fn post(
value: UntaggedValue::Primitive(Primitive::Binary(b)),
..
} => {
let mut s = http_client().post(location).body(Vec::from(&b[..]));
let mut s = http_client(allow_insecure)
.post(location)
.body(Vec::from(&b[..]));
if let Some(login) = login {
s = s.header("Authorization", format!("Basic {}", login));
}
@ -247,7 +271,9 @@ pub async fn post(
match value_to_json_value(&value.clone().into_untagged_value()) {
Ok(json_value) => match serde_json::to_string(&json_value) {
Ok(result_string) => {
let mut s = http_client().post(location).body(result_string);
let mut s = http_client(allow_insecure)
.post(location)
.body(result_string);
if let Some(login) = login {
s = s.header("Authorization", format!("Basic {}", login));
@ -611,10 +637,10 @@ fn extract_header_value(args: &CommandArgs, key: &str) -> Result<Option<String>,
// Only panics if the user agent is invalid but we define it statically so either
// it always or never fails
#[allow(clippy::unwrap_used)]
fn http_client() -> reqwest::Client {
fn http_client(allow_insecure: bool) -> reqwest::Client {
reqwest::Client::builder()
.user_agent("nushell")
.danger_accept_invalid_certs(allow_insecure)
.build()
.unwrap()
.expect("Failed to build reqwest client")
}


@ -170,7 +170,7 @@ fn action(
Ok(UntaggedValue::string(gradient_string).into_value(tag))
}
(None, Some(fg_end), None, Some(bg_end)) => {
// missin fg_start and bg_start, so assume black
// missing fg_start and bg_start, so assume black
let fg_start = Rgb::new(0, 0, 0);
let bg_start = Rgb::new(0, 0, 0);
let fg_gradient = Gradient::new(fg_start, fg_end);


@ -1,105 +0,0 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, Value};
use arboard::Clipboard;
pub struct Clip;
impl WholeStreamCommand for Clip {
fn name(&self) -> &str {
"clip"
}
fn signature(&self) -> Signature {
Signature::build("clip")
}
fn usage(&self) -> &str {
"Copy the contents of the pipeline to the copy/paste buffer."
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
clip(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Save text to the clipboard",
example: "echo 'secret value' | clip",
result: None,
},
Example {
description: "Save numbers to the clipboard",
example: "random integer 10000000..99999999 | clip",
result: None,
},
]
}
}
pub fn clip(args: CommandArgs) -> Result<ActionStream, ShellError> {
let input = args.input;
let name = args.call_info.name_tag;
let values: Vec<Value> = input.collect();
if let Ok(mut clip_context) = Clipboard::new() {
let mut new_copy_data = String::new();
if !values.is_empty() {
let mut first = true;
for i in &values {
if !first {
new_copy_data.push('\n');
} else {
first = false;
}
let string: String = i.convert_to_string();
if string.is_empty() {
return Err(ShellError::labeled_error(
"Unable to convert to string",
"Unable to convert to string",
name,
));
}
new_copy_data.push_str(&string);
}
}
match clip_context.set_text(new_copy_data) {
Ok(_) => {}
Err(_) => {
return Err(ShellError::labeled_error(
"Could not set contents of clipboard",
"could not set contents of clipboard",
name,
));
}
}
} else {
return Err(ShellError::labeled_error(
"Could not open clipboard",
"could not open clipboard",
name,
));
}
Ok(ActionStream::empty())
}
#[cfg(test)]
mod tests {
use super::Clip;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(Clip {})
}
}


@ -104,7 +104,7 @@ fn kill(args: CommandArgs) -> Result<ActionStream, ShellError> {
}
cmd.arg("-9");
} else if let Some(signal_value) = signal {
cmd.arg(format!("-{}", signal_value.item().to_string()));
cmd.arg(format!("-{}", signal_value.item()));
}
cmd.arg(pid.item().to_string());


@ -1,13 +1,9 @@
mod ansi;
mod benchmark;
mod clear;
#[cfg(feature = "clipboard-cli")]
mod clip;
mod du;
mod exec;
mod kill;
#[cfg(feature = "clipboard-cli")]
mod paste;
mod pwd;
mod run_external;
mod sleep;
@ -17,13 +13,9 @@ mod which_;
pub use ansi::*;
pub use benchmark::Benchmark;
pub use clear::Clear;
#[cfg(feature = "clipboard-cli")]
pub use clip::Clip;
pub use du::Du;
pub use exec::Exec;
pub use kill::Kill;
#[cfg(feature = "clipboard-cli")]
pub use paste::Paste;
pub use pwd::Pwd;
pub use run_external::RunExternalCommand;
pub use sleep::Sleep;


@ -1,61 +0,0 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue};
use arboard::Clipboard;
pub struct Paste;
impl WholeStreamCommand for Paste {
fn name(&self) -> &str {
"paste"
}
fn signature(&self) -> Signature {
Signature::build("paste")
}
fn usage(&self) -> &str {
"Paste contents from the clipboard"
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
paste(args)
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Paste text from your clipboard",
example: "echo 'secret value' | clip | paste",
result: Some(vec![UntaggedValue::Primitive(Primitive::String(
"secret value".to_owned(),
))
.into_value(Tag::default())]),
}]
}
}
pub fn paste(args: CommandArgs) -> Result<ActionStream, ShellError> {
let name = args.call_info.name_tag;
if let Ok(mut clip_context) = Clipboard::new() {
match clip_context.get_text() {
Ok(out) => Ok(ActionStream::one(ReturnSuccess::value(
UntaggedValue::Primitive(Primitive::String(out)),
))),
Err(_) => Err(ShellError::labeled_error(
"Could not get contents of clipboard",
"could not get contents of clipboard",
name,
)),
}
} else {
Err(ShellError::labeled_error(
"Could not open clipboard",
"could not open clipboard",
name,
))
}
}


@ -6,12 +6,6 @@ use nu_protocol::{Dictionary, Signature, UntaggedValue};
pub struct TermSize;
#[derive(Deserialize, Clone)]
pub struct TermSizeArgs {
wide: bool,
tall: bool,
}
impl WholeStreamCommand for TermSize {
fn name(&self) -> &str {
"term size"


@ -0,0 +1,283 @@
use std::{iter::Peekable, str::CharIndices};
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue};
use nu_source::Spanned;
type Input<'t> = Peekable<CharIndices<'t>>;
pub struct DetectColumns;
impl WholeStreamCommand for DetectColumns {
fn name(&self) -> &str {
"detect columns"
}
fn signature(&self) -> Signature {
Signature::build("detect columns")
.named(
"skip",
SyntaxShape::Int,
"number of rows to skip before detecting",
Some('s'),
)
.switch("no_headers", "don't detect headers", Some('n'))
}
fn usage(&self) -> &str {
"splits contents across multiple columns via the separator."
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
detect_columns(args)
}
}
fn detect_columns(args: CommandArgs) -> Result<OutputStream, ShellError> {
let name_tag = args.name_tag();
let num_rows_to_skip: Option<usize> = args.get_flag("skip")?;
let noheader = args.has_flag("no_headers");
let input = args.input.collect_string(name_tag.clone())?;
let input: Vec<_> = input
.lines()
.skip(num_rows_to_skip.unwrap_or_default())
.map(|x| x.to_string())
.collect();
let mut input = input.into_iter();
let headers = input.next();
if let Some(orig_headers) = headers {
let headers = find_columns(&orig_headers);
Ok((if noheader {
vec![orig_headers].into_iter().chain(input)
} else {
vec![].into_iter().chain(input)
})
.map(move |x| {
let row = find_columns(&x);
let mut dict = TaggedDictBuilder::new(name_tag.clone());
if headers.len() == row.len() && !noheader {
for (header, val) in headers.iter().zip(row.iter()) {
dict.insert_untagged(&header.item, UntaggedValue::string(&val.item));
}
} else {
let mut pre_output = vec![];
// column counts don't line up, so see if we can figure out why
for cell in row {
for header in &headers {
if cell.span.start() <= header.span.end()
&& cell.span.end() > header.span.start()
{
pre_output
.push((header.item.to_string(), UntaggedValue::string(&cell.item)));
}
}
}
for header in &headers {
let mut found = false;
for pre_o in &pre_output {
if pre_o.0 == header.item {
found = true;
break;
}
}
if !found {
pre_output.push((header.item.to_string(), UntaggedValue::nothing()));
}
}
if noheader {
for header in headers.iter().enumerate() {
for pre_o in &pre_output {
if pre_o.0 == header.1.item {
dict.insert_untagged(format!("Column{}", header.0), pre_o.1.clone())
}
}
}
} else {
for header in &headers {
for pre_o in &pre_output {
if pre_o.0 == header.item {
dict.insert_untagged(&header.item, pre_o.1.clone())
}
}
}
}
}
dict.into_value()
})
.into_output_stream())
} else {
Ok(OutputStream::empty())
}
}
pub fn find_columns(input: &str) -> Vec<Spanned<String>> {
let mut chars = input.char_indices().peekable();
let mut output = vec![];
while let Some((_, c)) = chars.peek() {
if c.is_whitespace() {
// If the next character is non-newline whitespace, skip it.
let _ = chars.next();
} else {
// Otherwise, try to consume an unclassified token.
let result = baseline(&mut chars);
output.push(result);
}
}
output
}
#[derive(Clone, Copy)]
enum BlockKind {
Paren,
CurlyBracket,
SquareBracket,
}
fn baseline(src: &mut Input) -> Spanned<String> {
let mut token_contents = String::new();
let start_offset = if let Some((pos, _)) = src.peek() {
*pos
} else {
0
};
// This variable tracks the starting character of a string literal, so that
// we remain inside the string literal lexer mode until we encounter the
// closing quote.
let mut quote_start: Option<char> = None;
// This Vec tracks paired delimiters
let mut block_level: Vec<BlockKind> = vec![];
// A baseline token is terminated if it's not nested inside of a paired
// delimiter and the next character is one of: `|`, `;`, `#` or any
// whitespace.
fn is_termination(block_level: &[BlockKind], c: char) -> bool {
block_level.is_empty() && (c.is_whitespace())
}
// The process of slurping up a baseline token repeats:
//
// - String literal, which begins with `'`, `"` or `\``, and continues until
// the same character is encountered again.
// - Delimiter pair, which begins with `[`, `(`, or `{`, and continues until
// the matching closing delimiter is found, skipping comments and string
// literals.
// - When not nested inside of a delimiter pair, when a terminating
// character (whitespace, `|`, `;` or `#`) is encountered, the baseline
// token is done.
// - Otherwise, accumulate the character into the current baseline token.
while let Some((_, c)) = src.peek() {
let c = *c;
if quote_start.is_some() {
// If we encountered the closing quote character for the current
// string, we're done with the current string.
if Some(c) == quote_start {
quote_start = None;
}
} else if c == '\n' {
if is_termination(&block_level, c) {
break;
}
} else if c == '\'' || c == '"' || c == '`' {
// We encountered the opening quote of a string literal.
quote_start = Some(c);
} else if c == '[' {
// We encountered an opening `[` delimiter.
block_level.push(BlockKind::SquareBracket);
} else if c == ']' {
// We encountered a closing `]` delimiter. Pop off the opening `[`
// delimiter.
if let Some(BlockKind::SquareBracket) = block_level.last() {
let _ = block_level.pop();
}
} else if c == '{' {
// We encountered an opening `{` delimiter.
block_level.push(BlockKind::CurlyBracket);
} else if c == '}' {
// We encountered a closing `}` delimiter. Pop off the opening `{`.
if let Some(BlockKind::CurlyBracket) = block_level.last() {
let _ = block_level.pop();
}
} else if c == '(' {
// We enceountered an opening `(` delimiter.
block_level.push(BlockKind::Paren);
} else if c == ')' {
// We encountered a closing `)` delimiter. Pop off the opening `(`.
if let Some(BlockKind::Paren) = block_level.last() {
let _ = block_level.pop();
}
} else if is_termination(&block_level, c) {
break;
}
// Otherwise, accumulate the character into the current token.
token_contents.push(c);
// Consume the character.
let _ = src.next();
}
let span = Span::new(start_offset, start_offset + token_contents.len());
// If there is still unclosed opening delimiters, close them and add
// synthetic closing characters to the accumulated token.
if block_level.last().is_some() {
// let delim: char = (*block).closing();
// let cause = ParseError::unexpected_eof(delim.to_string(), span);
// while let Some(bk) = block_level.pop() {
// token_contents.push(bk.closing());
// }
return token_contents.spanned(span);
}
if quote_start.is_some() {
// The non-lite parse trims quotes on both sides, so we add the expected quote so that
// anyone wanting to consume this partial parse (e.g., completions) will be able to get
// correct information from the non-lite parse.
// token_contents.push(delimiter);
// return (
// token_contents.spanned(span),
// Some(ParseError::unexpected_eof(delimiter.to_string(), span)),
// );
return token_contents.spanned(span);
}
token_contents.spanned(span)
}
#[cfg(test)]
mod tests {
use super::DetectColumns;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(DetectColumns {})
}
}
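
The heart of the new detect columns command is find_columns/baseline above: walk each line, skip whitespace, and record every token together with its byte span, with extra handling for quotes and bracket pairs. A stripped-down sketch of the same span bookkeeping without the quote/bracket logic (names are illustrative); for example, whitespace_spans("PID  TTY TIME") yields (0, 3, "PID"), (5, 8, "TTY"), (9, 13, "TIME"):

// Split a line on whitespace, keeping byte offsets so that cells from different
// rows can later be matched to headers by overlapping spans.
fn whitespace_spans(input: &str) -> Vec<(usize, usize, &str)> {
    let mut out = Vec::new();
    let mut start: Option<usize> = None;
    for (i, c) in input.char_indices() {
        match (c.is_whitespace(), start) {
            (false, None) => start = Some(i), // a token begins
            (true, Some(s)) => {
                out.push((s, i, &input[s..i])); // a token ends
                start = None;
            }
            _ => {}
        }
    }
    if let Some(s) = start {
        out.push((s, input.len(), &input[s..])); // trailing token
    }
    out
}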


@ -0,0 +1,3 @@
pub mod columns;
pub use columns::DetectColumns;


@ -1,5 +1,6 @@
mod build_string;
mod char_;
mod detect;
mod format;
mod lines;
mod parse;
@ -10,6 +11,7 @@ mod str_;
pub use build_string::BuildString;
pub use char_::Char;
pub use detect::DetectColumns;
pub use format::*;
pub use lines::Lines;
pub use parse::*;


@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_lower_camel_case};
use crate::prelude::*;
use inflector::cases::camelcase::to_camel_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};
@ -25,7 +24,7 @@ impl WholeStreamCommand for SubCommand {
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
operate(args, &to_camel_case)
operate(args, &to_lower_camel_case)
}
fn examples(&self) -> Vec<Example> {
@ -40,7 +39,7 @@ impl WholeStreamCommand for SubCommand {
#[cfg(test)]
mod tests {
use super::ShellError;
use super::{to_camel_case, SubCommand};
use super::{to_lower_camel_case, SubCommand};
use crate::commands::strings::str_::case::action;
use nu_source::Tag;
use nu_test_support::value::string;
@ -57,7 +56,7 @@ mod tests {
let word = string("this-is-the-first-case");
let expected = string("thisIsTheFirstCase");
let actual = action(&word, Tag::unknown(), &to_camel_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_lower_camel_case).unwrap();
assert_eq!(actual, expected);
}
#[test]
@ -65,7 +64,7 @@ mod tests {
let word = string("this_is_the_second_case");
let expected = string("thisIsTheSecondCase");
let actual = action(&word, Tag::unknown(), &to_camel_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_lower_camel_case).unwrap();
assert_eq!(actual, expected);
}
}


@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_kebab_case};
use crate::prelude::*;
use inflector::cases::kebabcase::to_kebab_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};


@ -16,6 +16,24 @@ pub use pascal_case::SubCommand as PascalCase;
pub use screaming_snake_case::SubCommand as ScreamingSnakeCase;
pub use snake_case::SubCommand as SnakeCase;
use heck::ToKebabCase;
use heck::ToLowerCamelCase;
use heck::ToShoutySnakeCase;
use heck::ToSnakeCase;
use heck::ToUpperCamelCase;
macro_rules! create_heck_function {
($func_name:ident) => {
pub fn $func_name(a_slice: &str) -> String {
a_slice.$func_name()
}
};
}
create_heck_function!(to_upper_camel_case);
create_heck_function!(to_lower_camel_case);
create_heck_function!(to_kebab_case);
create_heck_function!(to_shouty_snake_case);
create_heck_function!(to_snake_case);
struct Arguments {
column_paths: Vec<ColumnPath>,
}
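
The create_heck_function! macro above just wraps heck's casing traits in free functions so the existing operate(args, &f) call sites keep working. A short sketch of what those methods produce; the camel, pascal and shouty expectations are taken from the tests in this diff, while the snake and kebab lines reflect heck's usual behaviour and are assumptions here:

use heck::{ToKebabCase, ToLowerCamelCase, ToShoutySnakeCase, ToSnakeCase, ToUpperCamelCase};

fn casing_demo() {
    let s = "this-is-the-first-case";
    assert_eq!(s.to_lower_camel_case(), "thisIsTheFirstCase");
    assert_eq!(s.to_upper_camel_case(), "ThisIsTheFirstCase");
    assert_eq!(s.to_shouty_snake_case(), "THIS_IS_THE_FIRST_CASE");
    assert_eq!(s.to_snake_case(), "this_is_the_second_case".replace("second", "first"));
    assert_eq!(s.to_kebab_case(), "this-is-the-first-case");
}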


@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_upper_camel_case};
use crate::prelude::*;
use inflector::cases::pascalcase::to_pascal_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};
@ -25,7 +24,7 @@ impl WholeStreamCommand for SubCommand {
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
operate(args, &to_pascal_case)
operate(args, &to_upper_camel_case)
}
fn examples(&self) -> Vec<Example> {
@ -40,7 +39,7 @@ impl WholeStreamCommand for SubCommand {
#[cfg(test)]
mod tests {
use super::ShellError;
use super::{to_pascal_case, SubCommand};
use super::{to_upper_camel_case, SubCommand};
use crate::commands::strings::str_::case::action;
use nu_source::Tag;
use nu_test_support::value::string;
@ -57,7 +56,7 @@ mod tests {
let word = string("this-is-the-first-case");
let expected = string("ThisIsTheFirstCase");
let actual = action(&word, Tag::unknown(), &to_pascal_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_upper_camel_case).unwrap();
assert_eq!(actual, expected);
}
#[test]
@ -65,7 +64,7 @@ mod tests {
let word = string("this_is_the_second_case");
let expected = string("ThisIsTheSecondCase");
let actual = action(&word, Tag::unknown(), &to_pascal_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_upper_camel_case).unwrap();
assert_eq!(actual, expected);
}
}


@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_shouty_snake_case};
use crate::prelude::*;
use inflector::cases::screamingsnakecase::to_screaming_snake_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};
@ -25,7 +24,7 @@ impl WholeStreamCommand for SubCommand {
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
operate(args, &to_screaming_snake_case)
operate(args, &to_shouty_snake_case)
}
fn examples(&self) -> Vec<Example> {
@ -40,7 +39,7 @@ impl WholeStreamCommand for SubCommand {
#[cfg(test)]
mod tests {
use super::ShellError;
use super::{to_screaming_snake_case, SubCommand};
use super::{to_shouty_snake_case, SubCommand};
use crate::commands::strings::str_::case::action;
use nu_source::Tag;
use nu_test_support::value::string;
@ -57,7 +56,7 @@ mod tests {
let word = string("this-is-the-first-case");
let expected = string("THIS_IS_THE_FIRST_CASE");
let actual = action(&word, Tag::unknown(), &to_screaming_snake_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_shouty_snake_case).unwrap();
assert_eq!(actual, expected);
}
#[test]
@ -65,7 +64,7 @@ mod tests {
let word = string("this_is_the_second_case");
let expected = string("THIS_IS_THE_SECOND_CASE");
let actual = action(&word, Tag::unknown(), &to_screaming_snake_case).unwrap();
let actual = action(&word, Tag::unknown(), &to_shouty_snake_case).unwrap();
assert_eq!(actual, expected);
}
}


@ -1,6 +1,5 @@
use super::operate;
use super::{operate, to_snake_case};
use crate::prelude::*;
use inflector::cases::snakecase::to_snake_case;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};


@ -144,7 +144,7 @@ fn trim(s: &str, char_: Option<char>, closure_flags: &ClosureFlags) -> String {
let re_str = format!("{}{{2,}}", reg);
// create the regex
let re = regex::Regex::new(&re_str).expect("Error creating regular expression");
// replace all mutliple occurances with single occurences represented by r
// replace all multiple occurrences with single occurrences represented by r
let new_str = re.replace_all(&return_string, r.to_string());
// update the return string so the next loop has the latest changes
return_string = new_str.to_string();


@ -1,7 +1,7 @@
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, TaggedDictBuilder, UntaggedValue};
use sysinfo::{ProcessExt, System, SystemExt};
use sysinfo::{PidExt, ProcessExt, System, SystemExt};
pub struct Command;
@ -50,7 +50,7 @@ fn run_ps(args: CommandArgs) -> Result<OutputStream, ShellError> {
for pid in result {
if let Some(result) = sys.process(pid) {
let mut dict = TaggedDictBuilder::new(args.name_tag());
dict.insert_untagged("pid", UntaggedValue::int(pid as i64));
dict.insert_untagged("pid", UntaggedValue::int(pid.as_u32() as i64));
dict.insert_untagged("name", UntaggedValue::string(result.name()));
dict.insert_untagged(
"status",
@ -68,7 +68,7 @@ fn run_ps(args: CommandArgs) -> Result<OutputStream, ShellError> {
if long {
if let Some(parent) = result.parent() {
dict.insert_untagged("parent", UntaggedValue::int(parent as i64));
dict.insert_untagged("parent", UntaggedValue::int(parent.as_u32() as i64));
} else {
dict.insert_untagged("parent", UntaggedValue::nothing());
}
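
The ps changes track sysinfo's move from bare integer pids to a Pid newtype; PidExt::as_u32 recovers the number. A small sketch against the post-bump API (the exact sysinfo release is whatever the "Update sysinfo version" commit pinned, so treat it as an assumption):

use sysinfo::{PidExt, ProcessExt, System, SystemExt};

// Collect (pid, process name) pairs, roughly what the `ps` command iterates over.
fn list_processes() -> Vec<(u32, String)> {
    let mut sys = System::new_all();
    sys.refresh_processes();
    sys.processes()
        .iter()
        .map(|(pid, process)| (pid.as_u32(), process.name().to_string()))
        .collect()
}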


@ -76,17 +76,14 @@ impl ConfigExtensions for NuConfig {
fn header_style(&self) -> TextStyle {
// FIXME: I agree, this is the long way around, please suggest and alternative.
let head_color = get_color_from_key_and_subkey(self, "color_config", "header_color");
let head_color_style = match head_color {
Some(s) => {
lookup_ansi_color_style(s.as_string().unwrap_or_else(|_| "green".to_string()))
}
None => nu_ansi_term::Color::Green.normal(),
};
let head_bold = get_color_from_key_and_subkey(self, "color_config", "header_bold");
let head_bold_bool = match head_bold {
Some(b) => header_bold_from_value(Some(&b)),
None => true,
let (head_color_style, head_bold_bool) = match head_color {
Some(s) => (
lookup_ansi_color_style(s.as_string().unwrap_or_else(|_| "green".to_string())),
header_bold_from_value(Some(&s)),
),
None => (nu_ansi_term::Color::Green.normal(), true),
};
let head_align = get_color_from_key_and_subkey(self, "color_config", "header_align");
let head_alignment = match head_align {
Some(a) => header_alignment_from_value(Some(&a)),


@ -127,6 +127,7 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
whole_stream_command(AnsiStrip),
whole_stream_command(AnsiGradient),
whole_stream_command(Char),
whole_stream_command(DetectColumns),
// Column manipulation
whole_stream_command(DropColumn),
whole_stream_command(MoveColumn),
@ -365,14 +366,6 @@ pub fn create_default_context(interactive: bool) -> Result<EvaluationContext, Bo
whole_stream_command(DataFrameCumulative),
whole_stream_command(DataFrameRename),
]);
#[cfg(feature = "clipboard-cli")]
{
context.add_commands(vec![
whole_stream_command(crate::commands::Clip),
whole_stream_command(crate::commands::Paste),
]);
}
}
Ok(context)


@ -306,3 +306,21 @@ fn rm_wildcard_leading_dot_deletes_dotfiles() {
assert!(!files_exist_at(vec![".bar"], dirs.test()));
})
}
#[test]
fn removes_files_with_case_sensitive_glob_matches_by_default() {
Playground::setup("glob_test", |dirs, sandbox| {
sandbox.with_files(vec![EmptyFile("A0"), EmptyFile("a1")]);
nu!(
cwd: dirs.root(),
"rm glob_test/A*"
);
let deleted_path = dirs.test().join("A0");
let skipped_path = dirs.test().join("a1");
assert!(!deleted_path.exists());
assert!(skipped_path.exists());
})
}


@ -47,3 +47,21 @@ fn writes_out_csv() {
assert!(actual.contains("nu,0.14,A new type of shell,MIT,2018"));
})
}
#[test]
fn save_append_will_create_file_if_not_exists() {
Playground::setup("save_test_3", |dirs, sandbox| {
sandbox.with_files(vec![]);
let expected_file = dirs.test().join("new-file.txt");
nu!(
cwd: dirs.root(),
r#"echo hello | save --raw --append save_test_3/new-file.txt"#,
);
let actual = file_contents(expected_file);
println!("{}", actual);
assert!(actual == "hello");
})
}
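
This test pins down the save --append fix that appears further down in the filesystem shell diff: appending now also creates the file when it is missing. In std terms the change is just create(true) next to append(true); a minimal sketch:

use std::fs::{File, OpenOptions};

// Open for appending, creating the file first if it does not exist yet.
fn open_for_append(path: &str) -> std::io::Result<File> {
    OpenOptions::new().append(true).create(true).open(path)
}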


@ -4,19 +4,19 @@ description = "Completions for nushell"
edition = "2018"
license = "MIT"
name = "nu-completion"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
nu-engine = { version = "0.39.0", path="../nu-engine" }
nu-data = { version = "0.39.0", path="../nu-data" }
nu-parser = { version = "0.39.0", path="../nu-parser" }
nu-path = { version = "0.39.0", path="../nu-path" }
nu-protocol = { version = "0.39.0", path="../nu-protocol" }
nu-source = { version = "0.39.0", path="../nu-source" }
nu-test-support = { version = "0.39.0", path="../nu-test-support" }
nu-engine = { version = "0.43.0", path="../nu-engine" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-parser = { version = "0.43.0", path="../nu-parser" }
nu-path = { version = "0.43.0", path="../nu-path" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
indexmap = { version="1.6.1", features=["serde-1"] }
[target.'cfg(not(target_arch = "wasm32"))'.dependencies]


@ -1,16 +1,16 @@
[package]
authors = ["The Nu Project Contributors"]
description = "CLI for nushell"
description = "Data for Nushell"
edition = "2018"
license = "MIT"
name = "nu-data"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
byte-unit = "4.0.9"
chrono = "0.4.19"
common-path = "1.0.0"
@ -19,7 +19,7 @@ directories-next = "2.0.0"
getset = "0.1.1"
indexmap = { version="1.6.1", features=["serde-1"] }
log = "0.4.14"
num-bigint = { version="0.3.1", features=["serde"] }
num-bigint = { version="0.4.3", features=["serde"] }
num-format = "0.4.0"
num-traits = "0.2.14"
serde = { version="1.0.123", features=["derive"] }
@ -27,17 +27,14 @@ sha2 = "0.9.3"
sys-locale = "0.1.0"
toml = "0.5.8"
nu-errors = { version = "0.39.0", path="../nu-errors" }
nu-path = { version = "0.39.0", path="../nu-path" }
nu-protocol = { version = "0.39.0", path="../nu-protocol" }
nu-source = { version = "0.39.0", path="../nu-source" }
nu-table = { version = "0.39.0", path="../nu-table" }
nu-test-support = { version = "0.39.0", path="../nu-test-support" }
nu-value-ext = { version = "0.39.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.39.0", path="../nu-ansi-term" }
[target.'cfg(unix)'.dependencies]
users = "0.11.0"
nu-errors = { version = "0.43.0", path="../nu-errors" }
nu-path = { version = "0.43.0", path="../nu-path" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-table = { version = "0.43.0", path="../nu-table" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
nu-value-ext = { version = "0.43.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.43.0", path="../nu-ansi-term" }
[features]
dataframe = ["nu-protocol/dataframe"]


@ -164,8 +164,8 @@ pub fn coerce_compare_primitive(
(Date(left), Date(right)) => CompareValues::Date(*left, *right),
(Date(left), Duration(right)) => CompareValues::DateDuration(*left, right.clone()),
(Boolean(left), Boolean(right)) => CompareValues::Booleans(*left, *right),
(Boolean(left), Nothing) => CompareValues::Booleans(*left, false),
(Nothing, Boolean(right)) => CompareValues::Booleans(false, *right),
(Boolean(left), Nothing) => CompareValues::Ints(if *left { 1 } else { 0 }, -1),
(Nothing, Boolean(right)) => CompareValues::Ints(-1, if *right { 1 } else { 0 }),
(String(left), Nothing) => CompareValues::String(left.clone(), std::string::String::new()),
(Nothing, String(right)) => {
CompareValues::String(std::string::String::new(), right.clone())

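The new arms make a boolean-versus-nothing comparison well defined but never equal: the boolean becomes 1 or 0 and nothing becomes -1, which is what the compare_nothing_and_boolean test further down asserts. The effective rule, as a tiny sketch:

// $true == $nothing and $false == $nothing both evaluate to false under this coercion.
fn bool_equals_nothing(b: bool) -> bool {
    let as_int: i64 = if b { 1 } else { 0 };
    let nothing: i64 = -1;
    as_int == nothing // always false
}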

@ -102,7 +102,6 @@ pub fn string_to_lookup_value(str_prim: &str) -> String {
"separator_color" => "separator_color".to_string(),
"header_align" => "header_align".to_string(),
"header_color" => "header_color".to_string(),
"header_bold" => "header_bold".to_string(),
"header_style" => "header_style".to_string(),
"index_color" => "index_color".to_string(),
"leading_trailing_space_bg" => "leading_trailing_space_bg".to_string(),
@ -144,7 +143,6 @@ pub fn get_color_config(config: &NuConfig) -> HashMap<String, Style> {
hm.insert("separator_color".to_string(), Color::White.normal());
hm.insert("header_align".to_string(), Color::Green.bold());
hm.insert("header_color".to_string(), Color::Green.bold());
hm.insert("header_bold".to_string(), Color::Green.bold());
hm.insert("header_style".to_string(), Style::default());
hm.insert("index_color".to_string(), Color::Green.bold());
hm.insert(
@ -204,9 +202,6 @@ pub fn get_color_config(config: &NuConfig) -> HashMap<String, Style> {
"header_color" => {
update_hashmap(key, value, &mut hm);
}
"header_bold" => {
update_hashmap(key, value, &mut hm);
}
"header_style" => {
update_hashmap(key, value, &mut hm);
}
@ -358,14 +353,7 @@ pub fn style_primitive(primitive: &str, color_hm: &HashMap<String, Style>) -> Te
let style = color_hm.get("header_color");
match style {
Some(s) => TextStyle::with_style(Alignment::Center, *s),
None => TextStyle::default_header(),
}
}
"header_bold" => {
let style = color_hm.get("header_bold");
match style {
Some(s) => TextStyle::with_style(Alignment::Center, *s),
None => TextStyle::default_header(),
None => TextStyle::default_header().bold(Some(true)),
}
}
"header_style" => {


@ -4,27 +4,26 @@ description = "Core commands for nushell"
edition = "2018"
license = "MIT"
name = "nu-engine"
version = "0.39.0"
version = "0.43.0"
[dependencies]
nu-data = { version = "0.39.0", path="../nu-data" }
nu-errors = { version = "0.39.0", path="../nu-errors" }
nu-parser = { version = "0.39.0", path="../nu-parser" }
nu-plugin = { version = "0.39.0", path="../nu-plugin" }
nu-protocol = { version = "0.39.0", path="../nu-protocol" }
nu-source = { version = "0.39.0", path="../nu-source" }
nu-stream = { version = "0.39.0", path="../nu-stream" }
nu-value-ext = { version = "0.39.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.39.0", path="../nu-ansi-term" }
nu-test-support = { version = "0.39.0", path="../nu-test-support" }
nu-path = { version = "0.39.0", path="../nu-path" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-errors = { version = "0.43.0", path="../nu-errors" }
nu-parser = { version = "0.43.0", path="../nu-parser" }
nu-plugin = { version = "0.43.0", path="../nu-plugin" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-stream = { version = "0.43.0", path="../nu-stream" }
nu-value-ext = { version = "0.43.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.43.0", path="../nu-ansi-term" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
nu-path = { version = "0.43.0", path="../nu-path" }
trash = { version="1.3.0", optional=true }
trash = { version = "2.0.2", optional = true }
which = { version="4.0.2", optional=true }
codespan-reporting = "0.11.0"
ansi_term = "0.12.1"
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
bytes = "0.5.6"
bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
bytes = "1.1.0"
chrono = { version="0.4.19", features=["serde"] }
derive-new = "0.5.8"
dirs-next = "2.0.0"
@ -37,7 +36,7 @@ indexmap = { version="1.6.1", features=["serde-1"] }
itertools = "0.10.0"
lazy_static = "1.*"
log = "0.4.14"
num-bigint = { version="0.3.1", features=["serde"] }
num-bigint = { version="0.4.3", features=["serde"] }
parking_lot = "0.11.1"
rayon = "1.5.0"
serde = { version="1.0.123", features=["derive"] }
@ -51,7 +50,7 @@ umask = "1.0.0"
users = "0.11.0"
[dev-dependencies]
nu-test-support = { version = "0.39.0", path="../nu-test-support" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
hamcrest2 = "0.3.0"
[features]


@ -1,8 +1,10 @@
# Nu-Engine
Nu-engine handles most of the core logic of nushell. For example, engine handles:
- Passing of data between commands
- Evaluating a commands return values
- Loading of user configurations
- Passing of data between commands
- Evaluating a commands return values
- Loading of user configurations
## Top level introduction
The following topics shall give the reader a top level understanding how various topics are handled in nushell.


@ -8,22 +8,13 @@ use std::collections::HashMap;
const COMMANDS_DOCS_DIR: &str = "docs/commands";
#[derive(Default)]
pub struct DocumentationConfig {
no_subcommands: bool,
no_color: bool,
brief: bool,
}
impl Default for DocumentationConfig {
fn default() -> Self {
DocumentationConfig {
no_subcommands: false,
no_color: false,
brief: false,
}
}
}
fn generate_doc(name: &str, scope: &Scope) -> IndexMap<String, Value> {
let mut row_entries = IndexMap::new();
let command = scope


@ -60,7 +60,7 @@ pub fn nu(scope: &Scope, ctx: &EvaluationContext) -> Result<Value, ShellError> {
// std::env::vars(), rather than the case-sensitive Environment.GetEnvironmentVariables() of .NET that PowerShell
// uses.
//
// For now, we work around the discrepency as best we can by merging the two into what is shown to the user as the
// For now, we work around the discrepancy as best we can by merging the two into what is shown to the user as the
// 'path' column of `$nu`
let mut table = vec![];
for v in env {


@ -28,6 +28,12 @@ use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, UntaggedValue};
use nu_source::Tagged;
const GLOB_PARAMS: glob::MatchOptions = glob::MatchOptions {
case_sensitive: true,
require_literal_separator: false,
require_literal_leading_dot: false,
};
#[derive(Eq, PartialEq, Clone, Copy)]
pub enum FilesystemShellMode {
Cli,
@ -159,7 +165,7 @@ impl Shell for FilesystemShell {
let hidden_dir_specified = is_hidden_dir(&path);
let mut paths = glob::glob(&path.to_string_lossy())
let mut paths = glob::glob_with(&path.to_string_lossy(), GLOB_PARAMS)
.map_err(|e| ShellError::labeled_error(e.to_string(), "invalid pattern", &p_tag))?
.peekable();
@ -352,7 +358,7 @@ impl Shell for FilesystemShell {
let source = path.join(&src.item);
let destination = path.join(&dst.item);
let sources: Vec<_> = match glob::glob(&source.to_string_lossy()) {
let sources: Vec<_> = match glob::glob_with(&source.to_string_lossy(), GLOB_PARAMS) {
Ok(files) => files.collect(),
Err(e) => {
return Err(ShellError::labeled_error(
@ -521,8 +527,8 @@ impl Shell for FilesystemShell {
let source = path.join(&src.item);
let destination = path.join(&dst.item);
let mut sources =
glob::glob(&source.to_string_lossy()).map_or_else(|_| Vec::new(), Iterator::collect);
let mut sources = glob::glob_with(&source.to_string_lossy(), GLOB_PARAMS)
.map_or_else(|_| Vec::new(), Iterator::collect);
if sources.is_empty() {
return Err(ShellError::labeled_error(
@ -650,7 +656,7 @@ impl Shell for FilesystemShell {
&path.to_string_lossy(),
glob::MatchOptions {
require_literal_leading_dot: true,
..Default::default()
..GLOB_PARAMS
},
) {
Ok(files) => {
@ -878,7 +884,7 @@ impl Shell for FilesystemShell {
) -> Result<OutputStream, ShellError> {
let mut options = OpenOptions::new();
if append {
options.append(true)
options.append(true).create(true)
} else {
options.write(true).create(true).truncate(true)
};
@ -964,7 +970,7 @@ fn move_item(from: &Path, from_tag: &Tag, to: &Path) -> Result<(), ShellError> {
} {
Ok(_) => Ok(()),
Err(e) => Err(ShellError::labeled_error(
format!("Could not move {:?} to {:?}. {:}", from, to, e.to_string()),
format!("Could not move {:?} to {:?}. {:}", from, to, e),
"could not move",
from_tag,
)),

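All of the shell's path expansion now goes through glob_with with one shared MatchOptions constant, which is what makes matching case sensitive by default (the rm test earlier in this diff relies on that). A small sketch using the same options (the helper name is illustrative):

use glob::{glob_with, MatchOptions};
use std::path::PathBuf;

const GLOB_PARAMS: MatchOptions = MatchOptions {
    case_sensitive: true,
    require_literal_separator: false,
    require_literal_leading_dot: false,
};

// Expand a pattern, silently dropping unreadable entries, much like the shell code above.
fn matching_paths(pattern: &str) -> Vec<PathBuf> {
    glob_with(pattern, GLOB_PARAMS)
        .map(|paths| paths.flatten().collect())
        .unwrap_or_default()
}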

@ -34,7 +34,7 @@ impl FileStructure {
self.resources
.iter()
.map(|f| (PathBuf::from(&f.loc), f.at))
.map(|f| to(f))
.map(to)
.collect()
}


@ -147,7 +147,7 @@ pub fn process_script(
{
val.to_string()
} else {
format!("{}\\", name.to_string())
format!("{}\\", name)
}
} else {
name.to_string()


@ -238,7 +238,7 @@ impl Default for Theme {
variable: ThemeColor(Color::Purple),
whitespace: ThemeColor(Color::White),
word: ThemeColor(Color::Green),
// These should really be Syles and not colors
// These should really be Styles and not colors
// leave this here for the next change to make
// ThemeColor, ThemeStyle.
// open_delimiter: ThemeColor(Color::White.normal()),


@ -597,7 +597,7 @@ impl VarSyntaxShapeDeductor {
}
Expression::Table(header, _rows) => {
self.infer_shapes_in_table_header(header)?;
// Shapes within columns can be heterogenous as long as
// Shapes within columns can be heterogeneous as long as
// https://github.com/nushell/rfcs/pull/3
// didn't land
// self.infer_shapes_in_rows(_rows)?;


@ -34,3 +34,21 @@ fn compare_to_nothing() {
);
assert_eq!(actual.out, "true");
}
#[test]
fn compare_nothing_and_boolean() {
let actual = nu!(
cwd: ".",
r#"
if $true == $nothing {echo $true} {echo $false}
"#
);
assert_eq!(actual.out, "false");
let actual = nu!(
cwd: ".",
r#"
if $false == $nothing {echo $true} {echo $false}
"#
);
assert_eq!(actual.out, "false");
}


@ -4,20 +4,20 @@ description = "Core error subsystem for Nushell"
edition = "2018"
license = "MIT"
name = "nu-errors"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
nu-source = { path="../nu-source", version = "0.39.0" }
nu-ansi-term = { version = "0.39.0", path="../nu-ansi-term" }
nu-source = { path="../nu-source", version = "0.43.0" }
nu-ansi-term = { version = "0.43.0", path="../nu-ansi-term" }
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
codespan-reporting = { version="0.11.0", features=["serialization"] }
derive-new = "0.5.8"
getset = "0.1.1"
num-bigint = { version="0.3.1", features=["serde"] }
num-bigint = { version="0.4.3", features=["serde"] }
num-traits = "0.2.14"
serde = { version="1.0.123", features=["derive"] }


@ -4,7 +4,7 @@ description = "Fork of serde-hjson"
edition = "2018"
license = "MIT"
name = "nu-json"
version = "0.39.0"
version = "0.43.0"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@ -20,6 +20,6 @@ lazy_static = "1"
linked-hash-map = { version="0.5", optional=true }
[dev-dependencies]
nu-path = { version = "0.39.0", path="../nu-path" }
nu-test-support = { version = "0.39.0", path="../nu-test-support" }
nu-path = { version = "0.43.0", path="../nu-path" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
serde_json = "1.0.39"


@ -925,11 +925,8 @@ fn escape_char<W>(wr: &mut W, value: char) -> Result<()>
where
W: io::Write,
{
// FIXME: this allocation is required in order to be compatible with stable
// rust, which doesn't support encoding a `char` into a stack buffer.
let mut s = String::new();
s.push(value);
escape_bytes(wr, s.as_bytes())
let mut scratch = [0_u8; 4];
escape_bytes(wr, value.encode_utf8(&mut scratch).as_bytes())
}
fn fmt_f32_or_null<W>(wr: &mut W, value: f32) -> Result<()>

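The nu-json change above removes a per-character String allocation: char::encode_utf8 writes directly into a 4-byte stack buffer, which is always large enough for one UTF-8 scalar value. Stand-alone, the pattern looks like this:

use std::io::{self, Write};

// Write one char's UTF-8 bytes without allocating a String first.
fn write_char<W: Write>(wr: &mut W, value: char) -> io::Result<()> {
    let mut scratch = [0_u8; 4];
    wr.write_all(value.encode_utf8(&mut scratch).as_bytes())
}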

@ -4,23 +4,23 @@ description = "Nushell parser"
edition = "2018"
license = "MIT"
name = "nu-parser"
version = "0.39.0"
version = "0.43.0"
[dependencies]
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
derive-new = "0.5.8"
indexmap = { version="1.6.1", features=["serde-1"] }
log = "0.4"
num-bigint = { version="0.3.1", features=["serde"] }
num-bigint = { version="0.4.3", features=["serde"] }
itertools = "0.10.0"
smart-default = "0.6.0"
nu-errors = { version = "0.39.0", path="../nu-errors" }
nu-data = { version = "0.39.0", path="../nu-data" }
nu-path = { version = "0.39.0", path="../nu-path" }
nu-protocol = { version = "0.39.0", path="../nu-protocol" }
nu-source = { version = "0.39.0", path="../nu-source" }
nu-test-support = { version = "0.39.0", path="../nu-test-support" }
nu-errors = { version = "0.43.0", path="../nu-errors" }
nu-data = { version = "0.43.0", path="../nu-data" }
nu-path = { version = "0.43.0", path="../nu-path" }
nu-protocol = { version = "0.43.0", path="../nu-protocol" }
nu-source = { version = "0.43.0", path="../nu-source" }
nu-test-support = { version = "0.43.0", path="../nu-test-support" }
[features]
stable = []


@ -132,7 +132,7 @@ mod tests {
match spec {
NamedType::Optional(_, _) => {}
_ => panic!("optional flag didn't parse succesfully"),
_ => panic!("optional flag didn't parse successfully"),
}
}
}


@ -901,7 +901,7 @@ fn parse_arg(
return parse_dollar_expr(lite_arg, scope);
}
// before anything else, try to see if this is a number in paranthesis
// before anything else, try to see if this is a number in parenthesis
if lite_arg.item.starts_with('(') {
return parse_full_column_path(lite_arg, scope);
}
@ -1559,11 +1559,16 @@ fn parse_internal_command(
if error.is_none() {
error = err;
}
} else if error.is_none() {
error = Some(ParseError::argument_error(
lite_cmd.parts[0].clone(),
ArgumentError::MissingValueForName(full_name.to_owned()),
));
} else {
if error.is_none() {
error = Some(ParseError::argument_error(
lite_cmd.parts[0].clone(),
ArgumentError::MissingValueForName(
full_name.to_owned(),
),
));
}
break;
}
}
}


@ -75,7 +75,7 @@ fn find_source_file(
let path = canonicalize(&file).map_err(|e| {
ParseError::general_error(
format!("Can't load source file. Reason: {}", e.to_string()),
format!("Can't load source file. Reason: {}", e),
"Can't load this file".spanned(file_span),
)
})?;
@ -85,7 +85,7 @@ fn find_source_file(
match contents {
Ok(contents) => parse(&contents, 0, scope),
Err(e) => Err(ParseError::general_error(
format!("Can't load source file. Reason: {}", e.to_string()),
format!("Can't load source file. Reason: {}", e),
"Can't load this file".spanned(file_span),
)),
}


@ -4,7 +4,7 @@ description = "Path handling library for Nushell"
edition = "2018"
license = "MIT"
name = "nu-path"
version = "0.39.0"
version = "0.43.0"
[dependencies]
dirs-next = "2.0.0"


@ -19,7 +19,7 @@ fn handle_dots_push(string: &mut String, count: u8) {
string.pop(); // remove last '/'
}
/// Expands any occurence of more than two dots into a sequence of ../ (or ..\ on windows), e.g.,
/// Expands any occurrence of more than two dots into a sequence of ../ (or ..\ on windows), e.g.,
/// "..." into "../..", "...." into "../../../", etc.
pub fn expand_ndots(path: impl AsRef<Path>) -> PathBuf {
// Check if path is valid UTF-8 and if not, return it as it is to avoid breaking it via string


@ -4,17 +4,17 @@ description = "Nushell Plugin"
edition = "2018"
license = "MIT"
name = "nu-plugin"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
nu-errors = { path="../nu-errors", version = "0.39.0" }
nu-protocol = { path="../nu-protocol", version = "0.39.0" }
nu-source = { path="../nu-source", version = "0.39.0" }
nu-test-support = { path="../nu-test-support", version = "0.39.0" }
nu-value-ext = { path="../nu-value-ext", version = "0.39.0" }
nu-errors = { path="../nu-errors", version = "0.43.0" }
nu-protocol = { path="../nu-protocol", version = "0.43.0" }
nu-source = { path="../nu-source", version = "0.43.0" }
nu-test-support = { path="../nu-test-support", version = "0.43.0" }
nu-value-ext = { path="../nu-value-ext", version = "0.43.0" }
indexmap = { version="1.6.1", features=["serde-1"] }
serde = { version="1.0", features=["derive"] }
serde_json = "1.0"


@ -4,7 +4,7 @@ description = "Pretty hex dump of bytes slice in the common style."
edition = "2018"
license = "MIT"
name = "nu-pretty-hex"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false
@ -16,11 +16,11 @@ name = "nu_pretty_hex"
path = "src/main.rs"
[dependencies]
nu-ansi-term = { path="../nu-ansi-term", version = "0.39.0" }
nu-ansi-term = { path="../nu-ansi-term", version = "0.43.0" }
rand = "0.8.3"
[dev-dependencies]
heapless = "0.6.1"
heapless = { version = "0.7.8", default-features = false }
# [features]
# default = ["alloc"]


@ -166,7 +166,7 @@ fn test_hex_write_with_simple_config() {
core::str::from_utf8(b"00 01 02 03 04 05 06 07 08 09 0a 0b 0c 0d 0e 0f").unwrap();
// let expected =
// "\u{1b}[38;5;242m00\u{1b}[0m \u{1b}[1;35m01\u{1b}[0m \u{1b}[1;35m02\u{1b}[0m \u{1b}[1;";
let mut buffer = heapless::Vec::<u8, heapless::consts::U50>::new();
let mut buffer = heapless::Vec::<u8, 50>::new();
hex_write(&mut buffer, &bytes, config, None).unwrap();

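The heapless dev-dependency bump reflects the 0.7 API, where buffer capacity moved from a typenum marker (heapless::consts::U50) to a const generic parameter, as the test change above shows. A tiny sketch, assuming heapless 0.7:

// Capacity is now a const generic: a fixed 50-byte, stack-allocated vector.
fn heapless_demo() {
    let mut buffer: heapless::Vec<u8, 50> = heapless::Vec::new();
    buffer.extend_from_slice(b"hello").expect("within capacity");
    assert_eq!(&buffer[..], &b"hello"[..]);
}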

@ -4,13 +4,13 @@ description = "Core values and protocols for Nushell"
edition = "2018"
license = "MIT"
name = "nu-protocol"
version = "0.39.0"
version = "0.43.0"
[lib]
doctest = false
[dependencies]
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
bigdecimal = { package = "bigdecimal", version = "0.3.0", features = ["serde"] }
byte-unit = "4.0.9"
chrono = { version="0.4.19", features=["serde"] }
chrono-humanize = "0.2.1"
@ -18,18 +18,19 @@ derive-new = "0.5.8"
getset = "0.1.1"
indexmap = { version="1.6.1", features=["serde-1"] }
log = "0.4.14"
nu-errors = { path="../nu-errors", version = "0.39.0" }
nu-source = { path="../nu-source", version = "0.39.0" }
num-bigint = { version="0.3.1", features=["serde"] }
nu-errors = { path="../nu-errors", version = "0.43.0" }
nu-source = { path="../nu-source", version = "0.43.0" }
num-bigint = { version = "0.4.3", features = ["serde"] }
num-integer = "0.1.44"
num-traits = "0.2.14"
serde = { version="1.0", features=["derive"] }
serde_bytes = "0.11.5"
[dependencies.polars]
version = "0.16.0"
version = "0.17.0"
optional = true
features = ["default", "serde", "rows", "strings", "checked_arithmetic", "object", "dtype-duration-ns"]
default-features = false
features = ["docs", "zip_with", "csv-file", "temporal", "performant", "pretty_fmt", "dtype-slim", "serde", "rows", "strings", "checked_arithmetic", "object", "dtype-date", "dtype-datetime", "dtype-time"]
[features]
dataframe = ["polars"]


@ -603,7 +603,7 @@ where
{
match series.dtype() {
DataType::UInt32 | DataType::Int32 | DataType::UInt64 => {
let to_i64 = series.cast_with_dtype(&DataType::Int64);
let to_i64 = series.cast(&DataType::Int64);
match to_i64 {
Ok(series) => {
@ -661,7 +661,7 @@ where
{
match series.dtype() {
DataType::Float32 => {
let to_f64 = series.cast_with_dtype(&DataType::Float64);
let to_f64 = series.cast(&DataType::Float64);
match to_f64 {
Ok(series) => {
@ -731,7 +731,7 @@ where
{
match series.dtype() {
DataType::UInt32 | DataType::Int32 | DataType::UInt64 => {
let to_i64 = series.cast_with_dtype(&DataType::Int64);
let to_i64 = series.cast(&DataType::Int64);
match to_i64 {
Ok(series) => {
@ -789,7 +789,7 @@ where
{
match series.dtype() {
DataType::Float32 => {
let to_f64 = series.cast_with_dtype(&DataType::Float64);
let to_f64 = series.cast(&DataType::Float64);
match to_f64 {
Ok(series) => {


@ -8,8 +8,8 @@ use nu_errors::ShellError;
use nu_source::{Span, Tag};
use num_bigint::BigInt;
use polars::prelude::{
DataFrame, DataType, Date64Type, Int64Type, IntoSeries, NamedFrom, NewChunkedArray, ObjectType,
PolarsNumericType, Series, TimeUnit,
DataFrame, DataType, DatetimeChunked, Int64Type, IntoSeries, NamedFrom, NewChunkedArray,
ObjectType, PolarsNumericType, Series,
};
use std::ops::{Deref, DerefMut};
@ -310,8 +310,8 @@ pub fn create_column(
}
}
}
DataType::Date32 => {
let casted = series.date32().map_err(|e| {
DataType::Date => {
let casted = series.date().map_err(|e| {
ShellError::labeled_error(
"Casting error",
format!("casting error: {}", e),
@ -347,8 +347,8 @@ pub fn create_column(
Ok(Column::new(casted.name().into(), values))
}
DataType::Date64 => {
let casted = series.date64().map_err(|e| {
DataType::Datetime => {
let casted = series.datetime().map_err(|e| {
ShellError::labeled_error(
"Casting error",
format!("casting error: {}", e),
@ -384,8 +384,8 @@ pub fn create_column(
Ok(Column::new(casted.name().into(), values))
}
DataType::Time64(timeunit) | DataType::Duration(timeunit) => {
let casted = series.time64_nanosecond().map_err(|e| {
DataType::Time => {
let casted = series.time().map_err(|e| {
ShellError::labeled_error(
"Casting error",
format!("casting error: {}", e),
@ -398,14 +398,7 @@ pub fn create_column(
.skip(from_row)
.take(size)
.map(|v| match v {
Some(a) => {
let nanoseconds = match timeunit {
TimeUnit::Second => a / 1_000_000_000,
TimeUnit::Millisecond => a / 1_000_000,
TimeUnit::Microsecond => a / 1_000,
TimeUnit::Nanosecond => a,
};
Some(nanoseconds) => {
let untagged = if let Some(bigint) = BigInt::from_i64(nanoseconds) {
UntaggedValue::Primitive(Primitive::Duration(bigint))
} else {
@ -633,7 +626,8 @@ pub fn from_parsed_columns(
}
});
let res = ChunkedArray::<Date64Type>::new_from_opt_iter(&name, it);
let res: DatetimeChunked =
ChunkedArray::<Int64Type>::new_from_opt_iter(&name, it).into();
df_series.push(res.into_series())
}


@ -87,7 +87,7 @@ impl PartialEq for NuDataFrame {
// Casting needed to compare other numeric types with nushell numeric type.
// In nushell we only have i64 integer numeric types and any array created
// with nushell untagged primitives will be of type i64
DataType::UInt32 => match self_series.cast_with_dtype(&DataType::Int64) {
DataType::UInt32 => match self_series.cast(&DataType::Int64) {
Ok(series) => series,
Err(_) => return false,
},


@ -6,7 +6,7 @@
macro_rules! out {
($($tokens:tt)*) => {
use std::io::Write;
print!($($tokens)*);
write!(std::io::stdout(), $($tokens)*).unwrap_or(());
let _ = std::io::stdout().flush();
}
}
@ -17,7 +17,12 @@ macro_rules! out {
/// and stray printlns left by accident
#[macro_export]
macro_rules! outln {
($($tokens:tt)*) => { println!($($tokens)*) }
($($tokens:tt)*) => {
{
use std::io::Write;
writeln!(std::io::stdout(), $($tokens)*).unwrap_or(())
}
}
}
/// Outputs to standard error
@ -26,7 +31,12 @@ macro_rules! outln {
/// and stray printlns left by accident
#[macro_export]
macro_rules! errln {
($($tokens:tt)*) => { eprintln!($($tokens)*) }
($($tokens:tt)*) => {
{
use std::io::Write;
writeln!(std::io::stderr(), $($tokens)*).unwrap_or(())
}
}
}
#[macro_export]

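The macro rewrite above exists because println! and eprintln! panic when the other end of the pipe has gone away (for example `nu ... | head`); writing through the Write trait and discarding the error keeps the shell alive. The same pattern outside a macro:

use std::io::Write;

// Print a line but never panic on a closed stdout; a BrokenPipe error is simply ignored.
fn print_line_safely(text: &str) {
    let stdout = std::io::stdout();
    let mut handle = stdout.lock();
    let _ = writeln!(handle, "{}", text);
    let _ = handle.flush();
}
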
Some files were not shown because too many files have changed in this diff