Compare commits

...

118 Commits

Author SHA1 Message Date
JT
a660720b68 Bump to 0.44 (#4365) 2022-02-07 20:15:46 -05:00
265ee1281d Drop with iter range (#4242)
* Allow range in 'drop nth'

* Unit tests for drop nth range

* Add range case to the description

* Fix description 2

* format fixes

* Fix example and some refactoring

* clippy fixes
2022-02-07 08:02:35 -05:00
JT
cdc8e67d61 Remove unused repo parts (#4271)
* Remove unused repo parts

* Update README

* cargo fmt
2022-01-26 07:31:04 +11:00
JT
4e8e03867c Update release.yml 2022-01-19 04:51:21 +11:00
JT
49e8af8ea5 Bump to 0.43 (#4264) 2022-01-18 12:06:12 -05:00
JT
d5d61d14b3 Tutor eq (#4263)
* Fix clippy lints

* Fix clippy lints

* Fix clippy lints

* Add e-q tutor page
2022-01-19 03:22:23 +11:00
JT
f562a4526c Fix clippy lints (#4262)
* Fix clippy lints

* Fix clippy lints

* Fix clippy lints
2022-01-18 23:33:28 +11:00
e6c09f2dfc Update sysinfo version (#4261) 2022-01-18 22:37:52 +11:00
73a68954c4 Bump follow-redirects from 1.14.4 to 1.14.7 in /samples/wasm (#4258)
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.14.4 to 1.14.7.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.14.4...v1.14.7)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-01-18 22:37:06 +11:00
476d543dee Update descriptions for crates split out from nu-cli (#4247)
`nu-command` and `nu-data` were split out, but the descriptions still
say 'CLI'.

Signed-off-by: Michel Alexandre Salim <salimma@fedoraproject.org>
2022-01-09 06:05:50 -06:00
398502b0d6 fix docs/sample_config/config.toml: use env.PROMPT_COMMAND (#4241) 2022-01-02 17:35:07 -06:00
JT
62011b6bcc Bump to 0.42 (#4234) 2021-12-28 20:56:59 +11:00
1214cd57e8 bat: use regex-onig instead of regex-fancy (#4226)
Fixes #4224

Signed-off-by: nibon7 <nibon7@163.com>
2021-12-24 08:34:59 -06:00
6cd124ddb2 allow insecure server connections when using SSL (#4219)
Fixes #4211

Signed-off-by: nibon7 <nibon7@163.com>
2021-12-23 06:48:43 +11:00
d32aec5906 Don't panic if the other end of std{out,err} is closed (#4179)
* fix #4161

println! and friends will panic on BrokenPipe. The solution is to use
writeln! instead, and ignore the error (or do we want to do something else?)

* test that nu doesn't panic in case of BrokenPipe error

* fixup! test that nu doesn't panic in case of BrokenPipe error

* make do_not_panic_if_broken_pipe only run on UNIX systems
2021-12-21 10:08:41 +11:00
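A minimal sketch of the idea described in the commit above (not the actual patch): write through `writeln!`, which returns a `Result`, and stop quietly on a broken pipe instead of panicking the way `println!` would.

```rust
use std::io::{self, Write};

fn main() {
    let stdout = io::stdout();
    let mut out = stdout.lock();
    for i in 0..1_000_000 {
        // Unlike println!, writeln! surfaces the error instead of panicking,
        // so `nu ... | head` closing the pipe early can be handled gracefully.
        if let Err(e) = writeln!(out, "{}", i) {
            if e.kind() == io::ErrorKind::BrokenPipe {
                break; // the reader went away; just stop writing
            }
        }
    }
}
```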
e919f9a73b use heck for string casing (#4081)
I removed the Inflector dependency in favor of heck for two reasons:
- to close #3674.
- heck seems simpler and actively maintained

We could probably alter the structure of the `str_` module to expose the
individual casing behaviors better.
I did not feel as confident about changing those signatures.

So I took the lazier approach of a macro in `mod.rs` that creates the public
shimming functions to heck's traits.
2021-12-14 09:43:48 -06:00
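A rough sketch of what such a shim macro could look like, assuming heck 0.3-style trait names (`SnakeCase`, `KebabCase`); the real module layout and the function names in the `str_` module may differ.

```rust
// Hypothetical shim, not the actual nushell macro: expose heck's trait
// methods as plain public functions.
use heck::{KebabCase, SnakeCase};

macro_rules! casing_shim {
    ($name:ident, $method:ident) => {
        pub fn $name(input: &str) -> String {
            input.$method()
        }
    };
}

casing_shim!(to_snake_case, to_snake_case);
casing_shim!(to_kebab_case, to_kebab_case);

fn main() {
    assert_eq!(to_snake_case("NushellRocks"), "nushell_rocks");
    assert_eq!(to_kebab_case("NushellRocks"), "nushell-rocks");
}
```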
a3c349746f ci: update macOS agent (#4207)
10.14 has been deprecated: https://github.com/Azure/azure-sdk-for-cpp/issues/3168

This hopefully fixes recent CI failures!
2021-12-14 08:55:51 -06:00
b5f8f64d79 ci: fix macOS agent (#4203)
I noticed the agent documentation uses uppercase: https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops&tabs=yaml
2021-12-13 13:08:03 -06:00
1576b959f9 update feat template (#4201) 2021-12-12 16:15:31 -06:00
4096f52003 update templates2 (#4200) 2021-12-12 16:11:27 -06:00
7ceb668419 Revert "try out title change (#4198)" (#4199)
This reverts commit 420aee18ca.
2021-12-12 16:06:07 -06:00
420aee18ca try out title change (#4198) 2021-12-12 16:05:24 -06:00
pin
15e9c11849 Fix build on NetBSD (#4192) 2021-12-09 14:23:40 +02:00
9fd680ae2b fix: Implicit coercion of boolean false and empty value #4094 (#4120)
Signed-off-by: closetool <c299999999@qq.com>
2021-12-09 14:19:51 +02:00
ad94ed5e13 Fix Configuration section in bug report template (#4181)
* Fix Configuration section in bug report template

Change the placeholder content to actually match the `to md` output, and add `--pretty`

* Don't omit data in placeholder configuration table

* Remove blank line in bug_report.yml
2021-12-08 13:32:28 -06:00
1bdcdcca70 fix: change into column_path to into column-path (breaking change) (#4185) (#4189) 2021-12-08 11:04:55 +02:00
JT
610e3911f6 Bump to 0.41 (#4187) 2021-12-08 06:21:00 +13:00
ee9eddd851 avoid unnecessary allocation (#4178) 2021-12-06 07:38:58 +13:00
JT
c08e145501 Fix clippy warnings (#4176) 2021-12-03 07:05:38 +13:00
c00853a473 Seems like accessing $it outside each is not possible now (#4000) 2021-12-03 06:49:24 +13:00
79c7b20cfd add login shell flag (#4175) 2021-12-02 20:05:04 +13:00
JT
89cbfd758d Remove 'arboard' (#4174) 2021-12-02 08:48:03 +13:00
e6e6b730f3 Bye bye upx sorry (#4173)
* bye bye upx, let's try stripping alone

* remove all stripping - not sure it's even working
2021-11-30 13:34:16 -06:00
0fe6a7c1b5 bye bye upx, let's try stripping alone (#4172) 2021-11-30 12:11:01 -06:00
1794ad51bd Sanitize arguments to external commands a bit better (#4157)
* fix #4140

We are passing commands into a shell underneath but we were not
escaping arguments correctly. This new version of the code also takes
into consideration the ";" and "&" characters, which have special
meaning in shells.

We would probably benefit from a more robust way to join arguments to
shell programs. Python's stdlib has shlex.join, and perhaps we can
take that implementation as a reference.

* clean up escaping of posix shell args

I believe the right place to do escaping of arguments was in the
spawn_sh_command function. Note that this change prevents things like:

^echo "$(ls)"

from executing the ls command. Instead, this will just print

$(ls)

The regex has been taken from the python stdlib implementation of shlex.quote

* fix non-literal parameters and single quotes

* address clippy's comments

* fixup! address clippy's comments

* test that subshell commands are sanitized properly
2021-11-29 09:46:42 -06:00
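A minimal sketch of the quoting idea borrowed from Python's `shlex.quote`, as mentioned above; the actual patch uses a regex inside the spawn path, so the function name and structure here are illustrative only.

```rust
// Sketch of POSIX-style argument quoting in the spirit of shlex.quote:
// leave "safe" strings alone, otherwise single-quote the whole argument
// and escape embedded single quotes.
fn quote_posix_arg(arg: &str) -> String {
    let is_safe = !arg.is_empty()
        && arg
            .chars()
            .all(|c| c.is_ascii_alphanumeric() || "_@%+=:,./-".contains(c));
    if is_safe {
        arg.to_string()
    } else {
        // Wrap in single quotes; an embedded single quote becomes '"'"'
        format!("'{}'", arg.replace('\'', r#"'"'"'"#))
    }
}

fn main() {
    assert_eq!(quote_posix_arg("hello"), "hello");
    assert_eq!(quote_posix_arg("$(ls)"), "'$(ls)'"); // no longer runs ls
    assert_eq!(quote_posix_arg("it's"), r#"'it'"'"'s'"#);
}
```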
fb197f562a save --append: create file if it doesn't exist (#4156)
* have save --append create file if not exists

Currently, doing:

echo a | save --raw --append file.txt

will fail if file.txt does not exist. This PR changes that

* test that `save --append` will create new file
2021-11-26 12:27:41 -06:00
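The gist of the change, sketched with the standard library (not the actual `save` command code): open with both `append` and `create`, so a missing file is created rather than causing an error.

```rust
use std::fs::OpenOptions;
use std::io::Write;

// Illustrative helper; the path and content are examples, not command code.
fn append_to_file(path: &str, content: &[u8]) -> std::io::Result<()> {
    let mut file = OpenOptions::new()
        .append(true)
        .create(true) // create the file if it is missing instead of erroring
        .open(path)?;
    file.write_all(content)
}

fn main() -> std::io::Result<()> {
    append_to_file("file.txt", b"a\n")
}
```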
91c270c14a fix markup (#4155) 2021-11-26 07:37:50 -06:00
3e93ae8af4 Correct spelling (#4152) 2021-11-25 11:11:20 -06:00
e06df124ca upgrading dependencies (#4135)
* upgrade dependencies
num-bigint 0.3.1 -> 0.4.3
bigdecimal-rs 0.2.1 -> bigdecimal 0.3.0
s3handler 0.7 -> 0.7.5
bat 0.18 -> 0.18, default-features = false

* upgrade arboard 1.1.0 -> 2.0.1

* in polars use comfy-table instead of prettytable-rs
the last release of prettytable-rs was `0.8.0 Sep 27, 2018`
and it uses `term 0.5` as a dependency

* upgrade dependencies

* upgrade trash -> 2.0.1

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-20 07:11:11 -06:00
JT
2590fcbe5c Bump to 0.40 (#4129) 2021-11-16 21:53:03 +13:00
JT
09691ff866 Delete docker-publish.yml 2021-11-16 14:19:35 +13:00
16db368232 upgrade polars to 0.17 (#4122) 2021-11-16 12:01:02 +13:00
JT
df87d90b8c Add 'detect columns' command (#4127)
* Add 'detect columns' command

* Fix warnings
2021-11-16 11:29:54 +13:00
f2f01b8a4d missed from_mp4, added back (#4128) 2021-11-15 16:19:44 -06:00
6c0190cd38 added upx and strip to mac and windows (#4126) 2021-11-15 15:32:48 -06:00
b26246bf12 trying upx and strip (#4125) 2021-11-15 15:01:25 -06:00
36a4effbb2 tweaked strip ci (#4124) 2021-11-15 14:30:32 -06:00
9fca417f8c update release to allow running manually (#4123) 2021-11-15 14:04:00 -06:00
d09e1148b2 add the ability to strip the debug symbols for smaller binaries on mac and linux 2021-11-15 13:47:46 -06:00
493bc2b1c9 Update README (#4118)
`winget install nu` fails because there are other options for "nu" now.
Using the full `nushell` word solved it for me.

[Imgur](https://imgur.com/aqz2qNp)
2021-11-14 19:34:57 +13:00
74b812228c upgrade dependencies (#4116)
* remove unused dependencies

* upgrade dependency bytes 0.5.6 -> 1.1.0

* upgrade dependency heapless 0.6.1 -> 0.7.8

* upgrade dependency image 0.22.4 -> 0.23.14

* upgrade dependency mp4 0.8.2 -> 0.9.0

* upgrade dependency bson 0.14.1 -> 2.0.1

Bson::Undefined, Bson::MaxKey, Bson::MinKey and Bson::DbPointer
weren't present in the previous version.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-14 19:32:21 +13:00
649b3804c1 fix: panic! during parsing (#4107)
Typing `selector -qa` into nu would cause a `panic!`. This happened because the inner loop
incremented `idx`, which was only checked in the outer loop, and then used it to index into
`lite_cmd.parts[idx]`. With the fix we now break the loop.

Co-authored-by: ahkrr <alexhk@protonmail.com>
2021-11-05 21:46:46 +13:00
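A simplified illustration of the bug shape and the fix; the names below are made up and only mirror the `idx`/`parts` pattern described above, not the real parser code.

```rust
fn scan_parts(parts: &[&str]) {
    let mut idx = 0;
    while idx < parts.len() {
        // ... outer-loop handling of parts[idx] ...
        while parts[idx].starts_with('-') {
            idx += 1;
            if idx >= parts.len() {
                // Without this inner check, evaluating parts[idx] above
                // would index past the end and panic.
                break;
            }
        }
        idx += 1;
    }
}

fn main() {
    scan_parts(&["selector", "-qa"]); // the shape of input that used to panic
}
```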
JT
df6a53f52e Update stale.yml (#4106) 2021-11-04 21:25:44 +13:00
JT
c4af5df828 Update stale.yml (#4102) 2021-10-31 16:48:58 +13:00
f94a3e15f5 Get rid of header bold option (#4076)
* refactor(options): get rid of 'header_bold' option

* docs(config): remove 'header_bold' from docs

* fix(options): replicate logic to apply true/false in bold

* style(options): apply lint fixes
2021-10-31 06:59:19 +13:00
75782f0f50 Fix #4070: Inconsistent file matching rule for ls and rm (#4099) 2021-10-28 15:05:07 +03:00
JT
2b06ce27d3 Bump to 0.39 (#4097) 2021-10-27 08:36:41 +13:00
72c241348b Remove dependencies (#4087)
* fix regression

* Removed the nipper dependency

* fix linting

* fix clippy
2021-10-22 06:58:40 +13:00
JT
ab2d2db987 Fix clippy warnings (#4088)
* Fix clippy warnings

* Fix clippy warnings
2021-10-22 06:57:51 +13:00
07e05ef183 fix regression (#4086) 2021-10-19 13:39:23 -05:00
a986de8ad0 Update stale.yml (#4073)
add labels that can exempt from stale bot
2021-10-09 14:50:27 -05:00
22cfe4391e remove history file after clearing it (#4069) 2021-10-07 10:09:31 -05:00
JT
97d17311f4 Update LICENSE (#4067) 2021-10-07 08:42:07 +13:00
0f6fd30619 stale.yml: mention time to close in stale message (#4066) 2021-10-06 09:05:29 -05:00
JT
e1ebd461d2 Bump to 0.38 (#4064) 2021-10-06 06:35:25 +13:00
JT
f000d5d0a1 Remove the broken scrolling support (#4063)
* Remove the broken scrolling support

* Remove the broken scrolling support
2021-10-06 05:57:14 +13:00
574c5961c8 Add -c flag to select command (#4062)
See cc3653cfd9 for more on the `-c` flag.

Co-authored-by: Andrés N. Robalino <andres@androbtech.com>

2021-10-05 13:23:37 +13:00
JT
69708f7244 update wasm deps (#4061) 2021-10-03 07:19:54 +13:00
62c5df5fc6 expand tilde when reading plugin_dirs (#4052) 2021-10-02 21:38:21 +13:00
92c855a412 Fixed two typos in the tutor. (#4051) 2021-10-02 21:37:59 +13:00
d395816929 remove ansi colors if this is not a tty (#4058) 2021-10-01 09:00:08 -05:00
5e34ef6dff new command: into column_path (#4048) 2021-09-29 07:23:34 -05:00
d567c58cc1 Add -c flag to update cells subcommand (#4039)
* Add `-c` flag to `update cells` subcommand

* Fix lints
2021-09-27 21:18:50 -05:00
4e0d7bc77c Less deps (#4038)
* compiles on nightly now. (breaking change)

* less deps

* Switch over to new resolver

(it's been stable for a while.)

* let's leave num-format for another PR
2021-09-28 07:17:00 +13:00
32581497ef Fix 90 degrees tables problem (#4043)
* fix 90 degrees tables problem

* linting

* clippy

* linting
2021-09-25 14:05:45 -05:00
d6df367c6b Corrected typo (#4040)
It is not BSON but SQLite
2021-09-25 04:25:00 -05:00
4e6327de1d Added BigInt handling to the delimited file format for the 'to' command (#4034)
Co-authored-by: patrick <patrick@spol42069.hitronhub.home>
2021-09-25 09:47:16 +12:00
b3d8666db0 compiles on nightly now. (breaking change) (#4037) 2021-09-25 09:46:48 +12:00
1de7c3d033 Scraping multiple tables (#4036)
* Output error when ls into a file without permission

* math sqrt

* added test to check fails when ls into prohibited dir

* fix lint

* math sqrt with tests and doc

* trigger wasm build

* Update filesystem_shell.rs

* Fix Running echo .. starts printing integers forever

* Allow for multiple table scraping

* linting

* Fix clippy

* linting

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2021-09-24 08:08:13 -05:00
962b258cc6 merge span (#4031) 2021-09-23 07:48:05 +12:00
59697cab63 force rebuild of dev container (#4033) 2021-09-23 07:47:28 +12:00
349af05da8 Do not throw error for files not found in lib_dirs (#4029) 2021-09-20 13:44:47 -05:00
JT
b3b3cf0689 Remove the docker instructions
Docker has been out of date for a long time, go ahead and remove.
2021-09-20 19:33:49 +12:00
5d59234f8d Flexibility updating table's cells. (#4027)
Very often we need to work with tables (say extracted from unstructured data or some
kind of final report, timeseries, and the like).

Inevitably we will have columns whose names, or even how many of them there are, we can't
know beforehand.

Also, we may end up with certain cells having values we may want to remove as we explore.

Here, `update cells` fundamentally goes over every cell in the table coming in and updates
the cell's contents with the output of the block passed. Basic example here:

```
> [

    [   ty1,       t2,       ty];

    [     1,        a, $nothing]
    [(wrap), (0..<10),      1Mb]
    [    1s,     ({}),  1000000]
    [ $true,   $false,   ([[]])]

] | update cells { describe }

───┬───────────────────────┬───────────────────────────┬──────────
 # │          ty1          │            t2             │    ty
───┼───────────────────────┼───────────────────────────┼──────────
 0 │ integer               │ string                    │ nothing
 1 │ row Column(table of ) │ range[[integer, integer)] │ filesize
 2 │ string                │ nothing                   │ integer
 3 │ boolean               │ boolean                   │ table of
───┴───────────────────────┴───────────────────────────┴──────────
```

and another one (in the examples) for cases where, say, we have a generated timeseries table and
we want to replace the zeros with empty strings and save it out to something like CSV.

```
> [
    [2021-04-16, 2021-06-10, 2021-09-18, 2021-10-15, 2021-11-16, 2021-11-17, 2021-11-18];
    [        37,          0,          0,          0,         37,          0,          0]
] | update cells {|value|
  if ($value | into int) == 0 {
    ""
  } {
    $value
  }
}

───┬────────────┬────────────┬────────────┬────────────┬────────────┬────────────┬────────────
 # │ 2021-04-16 │ 2021-06-10 │ 2021-09-18 │ 2021-10-15 │ 2021-11-16 │ 2021-11-17 │ 2021-11-18
───┼────────────┼────────────┼────────────┼────────────┼────────────┼────────────┼────────────
 0 │         37 │            │            │            │         37 │            │
───┴────────────┴────────────┴────────────┴────────────┴────────────┴────────────┴────────────
```
2021-09-19 15:37:54 -05:00
Tw
4f7b423f36 Support completion when cursor inside an argument (#4023)
* Support completion when cursor inside an argument

Bash supports completion even when the cursor is inside an argument, which is very useful for fixing up an argument after the initial completion.
Let's add this feature as well.

Signed-off-by: Tw <wei.tan@intel.com>

* Add test for when cursor inside an argument

To test this case, let's also take the cursor position into account.

Signed-off-by: Tw <wei.tan@intel.com>
2021-09-19 17:23:05 +12:00
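A toy sketch of what taking the cursor position into account looks like: complete on the token that contains the cursor, using only the prefix up to the cursor. This is illustrative and not nu-completion's actual API.

```rust
// Find the whitespace-delimited token containing `pos` and return the part
// of it that lies before the cursor; a completer can then match on that.
fn completion_prefix(line: &str, pos: usize) -> &str {
    let start = line[..pos].rfind(char::is_whitespace).map_or(0, |i| i + 1);
    &line[start..pos]
}

fn main() {
    let line = "ls Cargo.toml";
    // Cursor placed right after "Cargo", i.e. inside the second argument.
    assert_eq!(completion_prefix(line, 8), "Cargo");
}
```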
f7043bf690 Fix #3090: let binding in command leaks when error occurs (#4022) 2021-09-19 14:57:20 +12:00
Tw
1297499d7a add command g to switch shell quickly (#4014)
Signed-off-by: Tw <tw19881113@gmail.com>
2021-09-17 10:39:14 +01:00
bd0baa961c add table selector for downloading web tables (#4004)
* add table selector for downloading web tables

* type-o

* updated debug mode to inspect mode
2021-09-16 09:02:30 -05:00
4ee536f044 fix: enable SIMD (#4021) 2021-09-16 20:01:42 +12:00
JT
8581bec891 bump 0.37.1 (#4019) 2021-09-16 13:32:22 +12:00
8bcbc8eeb3 Move nu-path tests to integration tests (#4015)
* Move nu-path tests to integration tests

To prevent circular dependency between nu-path and nu-test-support crates.

* Fmt
2021-09-16 07:11:28 +12:00
c164ef5489 Update to polars 0.16 (#4013)
* update to polars 0.16

* enabled features for polars
2021-09-16 07:10:12 +12:00
cc3653cfd9 Path commands: Put column path args behind flag; Allow path join appending without flag (#4008)
* Change path join signature

* Appending now works without flag
* Column path operation is behind a -c flag

* Move column path arg retrieval to a function

Also improves errors

* Fix path join tests

* Propagate column path changes to all path commands

* Update path command examples with columns paths

* Modernize path command examples by removing "echo"

* Improve structured path error message

* Fix typo
2021-09-15 21:03:51 +03:00
JT
7fc65067cf Temporarily remove the circular dep (#4009) 2021-09-15 09:17:31 +12:00
JT
f9ae882012 Update main.wxs (#4007)
Remove references to the old binaries
2021-09-15 07:45:30 +12:00
JT
1d80a68f4c bump to 0.37 (#4006) 2021-09-15 06:44:24 +12:00
fda69354db change name to command_prompt (#4003) 2021-09-14 08:02:10 +12:00
cc5c4d38bb Small fixes and refactors to paths & source command (#3998)
* Expand path when converting value -> PathBuf

Also includes Tagged<PathBuf>.

Fixes #3605

* Expand path for PATH env. variable

Fixes #1834

* Remove leftover Cows after nu-path refactor

There were some unnecessary Cow conversions leftover from the old
nu-path implementation.

* Use canonicalize in source command; Improve errors

Previously, `source` used `expand_path()` which does not follow
symlinks.

As a follow up, I improved the source error messages so they now tell
why the source file could not be canonicalized or read into string.
2021-09-12 02:36:14 +03:00
JT
0fa0c25fb3 Fix clippy warnings (#3997) 2021-09-10 13:13:11 +12:00
55eafadf02 Improve error message when bash-style alias syntax is mistakenly used (#3995) 2021-09-10 10:44:55 +12:00
51c74eebd0 Add general refactorings (#3996) 2021-09-10 10:44:22 +12:00
Tw
ae9f4135c0 support appending when saving file (#3992)
This patch implements `>>` operation in bash.

Signed-off-by: Tw <tw19881113@gmail.com>
2021-09-05 06:12:08 +12:00
4e2d3ceaaf Allow knowing the command name tag given no input. (#3988)
```
tags
```
2021-09-03 01:46:15 -05:00
c9c6bd4836 Create errors from tables. (#3986)
```
> [
  [          msg,                 labels,                      span];
  ["The message", "Helpful message here", ([[start, end]; [0, 141]])]
] | error make

error: The message
  ┌─ shell:1:1
  │
1 │ ╭ [
2 │ │   [          msg,                 labels,                      span];
3 │ │   ["The message", "Helpful message here", ([[start, end]; [0, 141]])]
  │ ╰─────────────────────────────────────────────────────────────────────^ Helpful message here
```

Adding a more flexible approach for creating error values. One use case, for instance, is the
idea of a test framework: instead of printing to the screen, a failed assertion could create
tables with more details of the failure and pass them to this command to make a full-fledged
error that Nu can show. This can (and should) be extended to capturing error values in the
pipeline as well. One could also use it for inspection.

For example: `.... | error inspect { # inspection here }`

or "error handling" as well, like so: `.... | error capture { fix here }`

However, we start here only with `error make` that creates an error value for you with limited support for the time being.
2021-09-02 21:07:26 -05:00
d90420ac4c Add subcommand into filesize (#3987)
* Add subcommand `into filesize`

It's currently not possible to convert a number or a string containing a number
into a filesize. The only way to create an instance of filesize type today is
with a literal in nushell syntax. This commit adds the `into filesize`
subcommand so that file sizes can be created from the outputs of programs
producing numbers or strings, like standard unix tools.

There is a limitation with this: it doesn't currently parse values like `10 MB`
or `10 MiB`; it can only look at the number itself. If the desire is there, more
flexible parsing can be added.

* fixup! Add subcommand `into filesize`

* fixup! Add subcommand `into filesize`
2021-09-02 18:19:54 -05:00
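A hedged sketch of the conversion described above: integers pass through and strings holding a bare number are parsed, while unit suffixes such as `10 MB` are rejected. The types and names are illustrative, not nushell's internals.

```rust
enum Input {
    Int(i64),
    Str(String),
}

// Convert a plain number (or a string containing one) into a byte count.
fn into_filesize(value: Input) -> Result<u64, String> {
    match value {
        Input::Int(n) if n >= 0 => Ok(n as u64),
        Input::Int(n) => Err(format!("negative value {} is not a filesize", n)),
        Input::Str(s) => s
            .trim()
            .parse::<u64>()
            .map_err(|e| format!("cannot convert '{}' to filesize: {}", s, e)),
    }
}

fn main() {
    assert_eq!(into_filesize(Input::Str("2048".into())), Ok(2048));
    assert!(into_filesize(Input::Str("10 MB".into())).is_err()); // no unit parsing
}
```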
260ff99710 feat: spawn the executables directly if possible (#3974)
* feat: spawn the executables directly if possible

This pull request changes nu-command so that it spawns the process directly if:
- The target is a `.exe` on Windows
- The target is not a `.sh` or `.bash` script when not on Windows.

Benefits:
- As I explained in [this comment](https://github.com/nushell/nushell/issues/3898#issuecomment-894000812), this is another step towards making Nushell a standalone shell that doesn't need to shell out unless it is running a script for a particular shell (cmd, sh, ps1, etc.).
- Fixes the bug with multiline strings
- Better performance due to direct spawning.

For example, this script shows ~20 ms less latency.
After:
```nu
C:\> benchmark { node -e 'console.log("sssss")' }
───┬──────────────────
 # │    real time
───┼──────────────────
 0 │ 63ms 921us 600ns
───┴──────────────────
```
Before:
```nu
C:\> benchmark { node -e 'console.log("sssss")' }
───┬──────────────────
 # │    real time
───┼──────────────────
 0 │ 79ms 136us 800ns
───┴──────────────────
```

Fixes #3898

* fix: make which dependency optional

Also fixes clippy warnings

* refactor: refactor spawn_exe, spawn_cmd, spawn_sh, and spawn_any

* fix: use which feature instead of which-support

* fix: use which_in to use the cwd of nu

* fix: use case insensitive comparison of the extensions

Sometimes the case of the extension is uppercased by the "which_in" function

Also use unix instead of "not windows", since some OSes might not have sh support
2021-09-01 09:38:52 -05:00
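A simplified sketch of the dispatch rule described above: spawn directly for a `.exe` on Windows, or for anything that is not a `.sh`/`.bash` script on unix, otherwise fall back to a shell. The extension comparison is case-insensitive, as in the fix mentioned in the commit. This is not the actual nu-command code.

```rust
use std::path::Path;

fn should_spawn_directly(resolved: &Path) -> bool {
    // Lowercase the extension so "FOO.EXE" and "foo.exe" are treated alike.
    let ext = resolved
        .extension()
        .and_then(|e| e.to_str())
        .map(|e| e.to_ascii_lowercase());
    if cfg!(windows) {
        matches!(ext.as_deref(), Some("exe"))
    } else {
        !matches!(ext.as_deref(), Some("sh") | Some("bash"))
    }
}

fn main() {
    // On unix this would spawn directly; on Windows it would not (no .exe).
    println!("{}", should_spawn_directly(Path::new("/usr/bin/node")));
}
```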
JT
08014c6a98 Move sys, ps, fetch, post to internal commands (#3983)
* Move sys, ps, fetch, post to internal commands

* Remove old plugins

* clippy

Co-authored-by: JT <jonatha.d.turner@gmail.com>
2021-09-01 14:29:09 +12:00
66cedf0b3a Update char_.rs (#3975)
added a few more chars and abbreviations
2021-08-29 08:40:28 -05:00
707a4ebc15 added more escapes to support ansi art (#3973)
* added more escapes to support ansi art

* fixed some bugs
2021-08-28 14:58:59 -05:00
d95375d494 nu-path crate refactor (#3730)
* Resolve rebase artifacts

* Remove leftover dependencies on removed feature

* Remove unnecessary 'pub'

* Start taking notes and fooling around

* Split canonicalize to two versions; Add TODOs

One that takes `relative_to` and one that doesn't.
More TODO notes.

* Merge absolutize to and rename resolve_dots

* Add custom absolutize fn and use it in path expand

* Convert a couple of dunce::canonicalize to ours

* Update nu-path description

* Replace all canonicalize with nu-path version

* Remove leftover dunce dependencies

* Fix broken autocd with trailing slash

Trailing slash is preserved *only* in paths that do not contain "." or
"..". This should be fixed in the future to cover all paths but for now
it at least covers basic cases.

* Use dunce::canonicalize for canonicalizing

* Alow cd recovery from non-existent cwd

* Disable removed canonicalize functionality tests

Remove unused import

* Break down nu-path into separate modules

* Remove unused public imports

* Remove abundant cow mapping

* Fix clippy warning

* Reformulate old canonicalize tests to expand_path

They wouldn't work with the new canonicalize.

* Canonicalize also ~ and ndots; Unify path joining

Also, add doc comments in nu_path::expansions.

* Add comment

* Avoid expanding ndots if path is not valid UTF-8

With this change, no lossy path->string conversion should happen in the
nu-path crate.

* Fmt

* Slight expand_tilde refactor; Add doc comments

* Start nu-path integration tests

* Add tests TODO

* Fix docstring typo

* Fix some doc strings

* Add README for nu-path crate

* Add a couple of canonicalize tests

* Add nu-path integration tests

* Add trim trailing slashes tests

* Update nu-path dependency

* Remove unused import

* Regenerate lockfile
2021-08-28 15:59:09 +03:00
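One of the behaviors folded into nu-path above, tilde expansion, sketched minimally here against the `HOME` environment variable; nu-path's real implementation (and its ndots handling) is more involved.

```rust
use std::path::{Path, PathBuf};

// Illustrative only: replace a leading "~" with the user's home directory.
fn expand_tilde(path: &Path) -> PathBuf {
    if let Ok(stripped) = path.strip_prefix("~") {
        if let Some(home) = std::env::var_os("HOME") {
            return PathBuf::from(home).join(stripped);
        }
    }
    path.to_path_buf()
}

fn main() {
    println!("{}", expand_tilde(Path::new("~/projects/nushell")).display());
}
```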
1c1c58e802 Remove duplicate dependencies (#3961)
* chore: Replace surf with reqwest

Removes a lot of older, duplicate versions of some dependencies
(roughly 90 dependencies removed in total)

* chore: Remove syn 0.11

* chore: Remove unnecessary features from ptree

Removes some more duplicate dependencies

* cargo update

* Ensure we run the fetch and post plugins on the tokio runtime

* Fix clippy warning

* fix: Github requires a user agent on requests

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2021-08-28 15:34:11 +12:00
JT
7fe05b8296 bump to 0.36.1 (#3972) 2021-08-27 20:48:58 +12:00
17ef531905 introducing the find command (#3971)
* introducing the `find` command

* added tests

* merged main to accommodate "rest" changes

* test fix
2021-08-27 20:48:41 +12:00
b8e2bdd6b1 Allow different names for ...rest (#3954)
* Allow different names for ...rest

* Resolves #3945

* This change requires an explicit name for the rest argument in `WholeStreamCommand`,
  which is why there are so many changed files.

* Remove redundant clone

* Add tests
2021-08-27 05:58:53 +12:00
88817a8f10 Allow environment variables to be hidden (#3950)
* Allow environment variables to be hidden

This change allows environment variables in Nushell to have a value of
`Nothing`, which can be set by the user by passing `$nothing` to
`let-env` and friends.

Environment variables with a value of Nothing behave as if they are not
set at all. This allows a user to shadow the value of an environment
variable in a parent scope, effectively removing it from their current
scope. This was not possible before, because a scope can not affect its
parent scopes.

This is a workaround for issues like #3920.

Additionally, this allows a user to simultaneously set, change and
remove multiple environment variables via `load-env`. Any environment
variables set to $nothing will be hidden and thus act as if they are
removed. This simplifies working with virtual environments, which rely
on setting multiple environment variables, including PATH, to specific
values, and removing/changing them on deactivation.

One surprising behavior is that an environment variable set to $nothing
will act as if it is not set when querying it (via $nu.env.X), but it is
still possible to remove it entirely via `unlet-env`. If the same
environment variable is present in the parent scope, the value in the
parent scope will be visible to the user. This might be surprising
behavior to users who are not familiar with the implementation details.

An additional corner case is that the shorthand form of `with-env` does
not work with this feature. Using `X=$nothing` will set $nu.env.X to the
string "$nothing". The long-form works as expected: `with-env [X
$nothing] {...}`.

* Remove unused import

* Allow all primitives to be convert to strings
2021-08-26 08:15:58 -05:00
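A toy model of the scoping behavior described above: each scope stores `Option` values, and a `None` entry (set via `$nothing`) shadows whatever a parent scope holds, so the variable looks unset. This is purely illustrative and not nushell's actual Scope type.

```rust
use std::collections::HashMap;

struct Scopes {
    frames: Vec<HashMap<String, Option<String>>>,
}

impl Scopes {
    fn get(&self, name: &str) -> Option<&str> {
        // Search from the innermost scope outwards.
        for frame in self.frames.iter().rev() {
            if let Some(entry) = frame.get(name) {
                // A None entry means "hidden here", even if a parent sets it.
                return entry.as_deref();
            }
        }
        None
    }
}

fn main() {
    let mut parent = HashMap::new();
    parent.insert("PATH".to_string(), Some("/usr/bin".to_string()));
    let mut child = HashMap::new();
    child.insert("PATH".to_string(), None); // like `let-env PATH = $nothing`
    let scopes = Scopes { frames: vec![parent, child] };
    assert_eq!(scopes.get("PATH"), None); // behaves as if unset
}
```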
3e8ce43dcb rename command and rename for melt (#3968) 2021-08-26 08:13:54 -05:00
9d8845d7ad Allow custom lib dir path for sourcing nu script libraries. (#3940)
Given that we can write nu scripts, splitting the code into many smaller nu scripts becomes necessary as the codebase grows.

In general, when we work with paths and files we seem to face quite a few difficulties. Here we just tackle one of them: sourcing
files that also source other nu files, and so forth. The current working directory becomes important here, and being in a different
directory when sourcing scripts will not work, mostly because we expand the path against the current working directory and parse the
files when a source command call is made.

For the moment, we introduce a `lib_dirs` configuration value and, unfortunately, introduce a new dependency in `nu-parser` (`nu-data`) to get
a handle of the configuration file to retrieve it. This should give clues and ideas as the new parser engine continues (introduce a way to also know paths)

With this PR we can do the following:

Let's assume we want to write a nu library called `my_library`. We will have the code in a directory called `project`. The file structure will look like this:

```
project/my_library.nu
project/my_library/hello.nu
project/my_library/name.nu
```

This "pattern" works well, that is, when creating a library have a directory named `my_library` and next to it a `my_library.nu` file. Filling them like this:

```

source my_library/hello.nu
source my_library/name.nu
```

```

def hello [] {
  "hello world"
}
```

```

def name [] {
  "Nu"
}
```

Assuming this `project` directory is stored at `/path/to/lib/project`, we can do:

```
config set lib_dirs ['/path/to/lib/project']
```

Given we have this `lib_dirs` configuration value, we can be anywhere while using Nu and do the following:

```
source my_library.nu

echo (hello) (name)

```
2021-08-26 02:04:04 -05:00
52578ba483 tweak the version | pivot instructions (#3964) 2021-08-24 19:03:07 -05:00
449 changed files with 8689 additions and 7032 deletions


@ -16,7 +16,7 @@ strategy:
image: ubuntu-18.04
style: 'wasm'
macos-stable:
image: macos-10.14
image: macOS-10.15
style: 'unflagged'
windows-stable:
image: windows-2019


@ -1,165 +0,0 @@
# CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/configuration-reference/ for more details
# See https://circleci.com/docs/2.0/config-intro/#section=configuration for spec
#
version: 2.1
# Commands
commands:
pull_cache:
description: Pulls Quay.io docker images (latest) for our cache
parameters:
tag:
type: string
default: "devel"
steps:
- run: echo "Tag is << parameters.tag >>"
- run: docker pull quay.io/nushell/nu:<< parameters.tag >>
- run: docker pull quay.io/nushell/nu-base:<< parameters.tag >>
orbs:
# https://circleci.com/orbs/registry/orb/circleci/docker
docker: circleci/docker@0.5.13
workflows:
version: 2.0
# This builds on all pull requests to test, and ignores main
build_without_deploy:
jobs:
- docker/publish:
deploy: false
image: nushell/nu-base
tag: latest
dockerfile: docker/Dockerfile.nu-base
extra_build_args: --cache-from=quay.io/nushell/nu-base:devel
filters:
branches:
ignore:
- main
before_build:
- pull_cache
after_build:
- run:
name: Build Multistage (smaller) container
command: |
docker build -f docker/Dockerfile -t quay.io/nushell/nu .
- run:
name: Preview Docker Tag for Nushell Build
command: |
DOCKER_TAG=$(docker run quay.io/nushell/nu --version | cut -d' ' -f2)
echo "Version that would be used for Docker tag is v${DOCKER_TAG}"
- run:
name: Test Executable
command: |
docker run --rm quay.io/nushell/nu-base --help
docker run --rm quay.io/nushell/nu --help
# workflow publishes to Docker Hub, with each job having different triggers
build_with_deploy:
jobs:
# Deploy versioned and latest images on tags (releases) only - builds --release.
- docker/publish:
image: nushell/nu-base
registry: quay.io
tag: latest
dockerfile: docker/Dockerfile.nu-base
extra_build_args: --cache-from=quay.io/nushell/nu-base:latest,quay.io/nushell/nu:latest --build-arg RELEASE=true
filters:
branches:
ignore: /.*/
tags:
only: /^\d+\.\d+\.\d+$/
before_build:
- run: docker pull quay.io/nushell/nu:latest
- run: docker pull quay.io/nushell/nu-base:latest
after_build:
- run:
name: Build Multistage (smaller) container
command: |
docker build -f docker/Dockerfile -t quay.io/nushell/nu .
- run:
name: Test Executable
command: |
docker run --rm quay.io/nushell/nu --help
docker run --rm quay.io/nushell/nu-base --help
- run:
name: Publish Docker Tag with Nushell Version
command: |
DOCKER_TAG=$(docker run quay.io/nushell/nu --version | cut -d' ' -f2)
echo "Version for Docker tag is ${DOCKER_TAG}"
docker tag quay.io/nushell/nu-base:latest quay.io/nushell/nu-base:${DOCKER_TAG}
docker tag quay.io/nushell/nu:latest quay.io/nushell/nu:${DOCKER_TAG}
docker push quay.io/nushell/nu-base
docker push quay.io/nushell/nu
# publish devel to Docker Hub on merge to main (doesn't build --release)
build_with_deploy_devel:
jobs:
# Deploy devel tag on merge to main
- docker/publish:
image: nushell/nu-base
registry: quay.io
tag: devel
dockerfile: docker/Dockerfile.nu-base
extra_build_args: --cache-from=quay.io/nushell/nu-base:devel
before_build:
- pull_cache
filters:
branches:
only: main
after_build:
- run:
name: Build Multistage (smaller) container
command: |
docker build --build-arg FROMTAG=devel -f docker/Dockerfile -t quay.io/nushell/nu:devel .
- run:
name: Test Executable
command: |
docker run --rm quay.io/nushell/nu:devel --help
docker run --rm quay.io/nushell/nu-base:devel --help
- run:
name: Publish Development Docker Tags
command: |
docker push quay.io/nushell/nu-base:devel
docker push quay.io/nushell/nu:devel
nightly:
triggers:
- schedule:
cron: "0 0 * * *"
filters:
branches:
only:
- main
jobs:
- docker/publish:
image: nushell/nu-base
registry: quay.io
tag: nightly
dockerfile: docker/Dockerfile.nu-base
extra_build_args: --cache-from=quay.io/nushell/nu-base:nightly --build-arg RELEASE=true
before_build:
- run: docker pull quay.io/nushell/nu:nightly
- run: docker pull quay.io/nushell/nu-base:nightly
after_build:
- run:
name: Build Multistage (smaller) container
command: |
docker build -f docker/Dockerfile -t quay.io/nushell/nu:nightly .
- run:
name: Test Executable
command: |
docker run --rm quay.io/nushell/nu:nightly --help
docker run --rm quay.io/nushell/nu-base:nightly --help
- run:
name: Publish Nightly Nushell Containers
command: |
docker push quay.io/nushell/nu-base:nightly
docker push quay.io/nushell/nu:nightly


@ -1 +0,0 @@
target


@ -1,11 +1,11 @@
name: Bug Report
description: Create a report to help us improve
body:
body:
- type: textarea
id: description
attributes:
label: Describe the bug
description: A clear and concise description of what the bug is.
description: Thank you for your bug report. We are working diligently with our community to integrate our latest code base that we call [engine-q](https://github.com/nushell/engine-q). We would like your help with this by checking to see if this bug report is still needed in engine-q. Thank you for your patience while we ready the next version of nushell.
validations:
required: true
- type: textarea
@ -38,22 +38,20 @@ body:
id: config
attributes:
label: Configuration
description: "Please run `> version | pivot` and paste the output to show OS, features, etc"
description: "Please run `version | pivot key value | to md --pretty` and paste the output to show OS, features, etc."
placeholder: |
> version | pivot key value | to md
╭───┬────────────────────┬───────────────────────────────────────────────────────────────────────╮
│ # │ key │ value │
├───┼────────────────────┼───────────────────────────────────────────────────────────────────────┤
│ 0 │ version │ 0.24.1
│ 1 │ build_os │ macos-x86_64 │
│ 2 │ rust_version │ rustc 1.48.0
│ 3 │ cargo_version │ cargo 1.48.0
│ 4 │ pkg_version │ 0.24.1
│ 5 │ build_time │ 2020-12-18 09:54:09 │
│ 6 │ build_rust_channel │ release
│ 7 │ features │ ctrlc, default, directories, dirs, git, ichwh, rich-benchmark,
│ │ │ rustyline, term, uuid, which, zip │
╰───┴────────────────────┴───────────────────────────────────────────────────────────────────────╯
> version | pivot key value | to md --pretty
| key | value |
| ------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| version | 0.40.0 |
| build_os | linux-x86_64 |
| rust_version | rustc 1.56.1 |
| cargo_version | cargo 1.56.0 |
| pkg_version | 0.40.0 |
| build_time | 1980-01-01 00:00:00 +00:00 |
| build_rust_channel | release |
| features | clipboard-cli, ctrlc, dataframe, default, rustyline, term, trash, uuid, which, zip |
| installed_plugins | binaryview, chart bar, chart line, fetch, from bson, from sqlite, inc, match, post, ps, query json, s3, selector, start, sys, textview, to bson, to sqlite, tree, xpath |
validations:
required: false
- type: textarea


@ -5,7 +5,7 @@ body:
id: problem
attributes:
label: Related problem
description: Is your feature request related to a problem? Please describe.
description: Thank you for your feature request. We are working diligently with our community to integrate our latest code base that we call [engine-q](https://github.com/nushell/engine-q). We would like your help with this by checking to see if this feature request is still needed in engine-q. Thank you for your patience while we ready the next version of nushell.
placeholder: |
A clear and concise description of what the problem is.
Example: I am trying to do [...] but [...]


@ -1,118 +0,0 @@
name: Publish consumable Docker images
on:
push:
tags: ['v?[0-9]+.[0-9]+.[0-9]+*']
jobs:
compile:
runs-on: ubuntu-latest
strategy:
matrix:
arch:
- x86_64-unknown-linux-musl
- x86_64-unknown-linux-gnu
steps:
- uses: actions/checkout@v2
- name: Install rust-embedded/cross
env: { VERSION: v0.1.16 }
run: >-
wget -nv https://github.com/rust-embedded/cross/releases/download/${VERSION}/cross-${VERSION}-x86_64-unknown-linux-gnu.tar.gz
-O- | sudo tar xz -C /usr/local/bin/
- name: compile for specific target
env: { arch: '${{ matrix.arch }}' }
run: |
cross build --target ${{ matrix.arch }} --release
# leave only the executable file
rm -frd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
find . -empty -delete
- uses: actions/upload-artifact@master
with:
name: ${{ matrix.arch }}
path: target/${{ matrix.arch }}/release
docker:
name: Build and publish docker images
needs: compile
runs-on: ubuntu-latest
env:
DOCKER_REGISTRY: quay.io/nushell
DOCKER_PASSWORD: ${{ secrets.DOCKER_REGISTRY }}
DOCKER_USER: ${{ secrets.DOCKER_USER }}
strategy:
matrix:
tag:
- alpine
- slim
- debian
- glibc-busybox
- musl-busybox
- musl-distroless
- glibc-distroless
- glibc
- musl
include:
- { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true, use-patch: false}
- { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: false}
- { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
steps:
- uses: actions/checkout@v2
- uses: actions/download-artifact@master
with: { name: '${{ matrix.arch }}', path: target/release }
- name: Build and publish exact version
run: |-
export DOCKER_TAG=${GITHUB_REF##*/}-${{ matrix.tag }}
export NU_BINS=target/release/$( [ ${{ matrix.plugin }} = true ] && echo nu* || echo nu )
export PATCH=$([ ${{ matrix.use-patch }} = true ] && echo .${{ matrix.tag }} || echo '')
chmod +x $NU_BINS
echo ${DOCKER_PASSWORD} | docker login ${DOCKER_REGISTRY} -u ${DOCKER_USER} --password-stdin
docker-compose --file docker/docker-compose.package.yml build
docker-compose --file docker/docker-compose.package.yml push # exact version
env:
BASE_IMAGE: ${{ matrix.base-image }}
#region semantics tagging
- name: Retag and push with suffixed version
run: |-
VERSION=${GITHUB_REF##*/}
latest_version=${VERSION%%%.*}-${{ matrix.tag }}
latest_feature=${VERSION%%.*}-${{ matrix.tag }}
latest_patch=${VERSION%.*}-${{ matrix.tag }}
exact_version=${VERSION}-${{ matrix.tag }}
tags=( ${latest_version} ${latest_feature} ${latest_patch} ${exact_version} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
docker push ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
- name: Retag and push debian as latest
if: matrix.tag == 'debian'
run: |-
VERSION=${GITHUB_REF##*/}
# ${latest features} ${latest patch} ${exact version}
tags=( ${VERSION%%.*} ${VERSION%.*} ${VERSION} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:latest
docker push ${DOCKER_REGISTRY}/nu:latest
#endregion semantics tagging


@ -1,8 +1,9 @@
name: Create Release Draft
on:
workflow_dispatch:
push:
tags: ['[0-9]+.[0-9]+.[0-9]+*']
tags: ["[0-9]+.[0-9]+.[0-9]+*"]
jobs:
linux:
@ -28,6 +29,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu)
# run: strip target/release/nu
# - name: Strip binaries (nu_plugin_inc)
# run: strip target/release/nu_plugin_inc
# - name: Strip binaries (nu_plugin_match)
# run: strip target/release/nu_plugin_match
# - name: Strip binaries (nu_plugin_textview)
# run: strip target/release/nu_plugin_textview
# - name: Strip binaries (nu_plugin_binaryview)
# run: strip target/release/nu_plugin_binaryview
# - name: Strip binaries (nu_plugin_chart_bar)
# run: strip target/release/nu_plugin_chart_bar
# - name: Strip binaries (nu_plugin_chart_line)
# run: strip target/release/nu_plugin_chart_line
# - name: Strip binaries (nu_plugin_from_bson)
# run: strip target/release/nu_plugin_from_bson
# - name: Strip binaries (nu_plugin_from_sqlite)
# run: strip target/release/nu_plugin_from_sqlite
# - name: Strip binaries (nu_plugin_from_mp4)
# run: strip target/release/nu_plugin_from_mp4
# - name: Strip binaries (nu_plugin_query_json)
# run: strip target/release/nu_plugin_query_json
# - name: Strip binaries (nu_plugin_s3)
# run: strip target/release/nu_plugin_s3
# - name: Strip binaries (nu_plugin_selector)
# run: strip target/release/nu_plugin_selector
# - name: Strip binaries (nu_plugin_start)
# run: strip target/release/nu_plugin_start
# - name: Strip binaries (nu_plugin_to_bson)
# run: strip target/release/nu_plugin_to_bson
# - name: Strip binaries (nu_plugin_to_sqlite)
# run: strip target/release/nu_plugin_to_sqlite
# - name: Strip binaries (nu_plugin_tree)
# run: strip target/release/nu_plugin_tree
# - name: Strip binaries (nu_plugin_xpath)
# run: strip target/release/nu_plugin_xpath
- name: Create output directory
run: mkdir output
@ -70,6 +125,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu)
# run: strip target/release/nu
# - name: Strip binaries (nu_plugin_inc)
# run: strip target/release/nu_plugin_inc
# - name: Strip binaries (nu_plugin_match)
# run: strip target/release/nu_plugin_match
# - name: Strip binaries (nu_plugin_textview)
# run: strip target/release/nu_plugin_textview
# - name: Strip binaries (nu_plugin_binaryview)
# run: strip target/release/nu_plugin_binaryview
# - name: Strip binaries (nu_plugin_chart_bar)
# run: strip target/release/nu_plugin_chart_bar
# - name: Strip binaries (nu_plugin_chart_line)
# run: strip target/release/nu_plugin_chart_line
# - name: Strip binaries (nu_plugin_from_bson)
# run: strip target/release/nu_plugin_from_bson
# - name: Strip binaries (nu_plugin_from_sqlite)
# run: strip target/release/nu_plugin_from_sqlite
# - name: Strip binaries (nu_plugin_from_mp4)
# run: strip target/release/nu_plugin_from_mp4
# - name: Strip binaries (nu_plugin_query_json)
# run: strip target/release/nu_plugin_query_json
# - name: Strip binaries (nu_plugin_s3)
# run: strip target/release/nu_plugin_s3
# - name: Strip binaries (nu_plugin_selector)
# run: strip target/release/nu_plugin_selector
# - name: Strip binaries (nu_plugin_start)
# run: strip target/release/nu_plugin_start
# - name: Strip binaries (nu_plugin_to_bson)
# run: strip target/release/nu_plugin_to_bson
# - name: Strip binaries (nu_plugin_to_sqlite)
# run: strip target/release/nu_plugin_to_sqlite
# - name: Strip binaries (nu_plugin_tree)
# run: strip target/release/nu_plugin_tree
# - name: Strip binaries (nu_plugin_xpath)
# run: strip target/release/nu_plugin_xpath
- name: Create output directory
run: mkdir output
@ -106,7 +215,7 @@ jobs:
uses: actions-rs/cargo@v1
with:
command: install
args: cargo-wix
args: cargo-wix --version 0.3.1
- name: Build
uses: actions-rs/cargo@v1
@ -114,6 +223,60 @@ jobs:
command: build
args: --release --all --features=extra
# - name: Strip binaries (nu.exe)
# run: strip target/release/nu.exe
# - name: Strip binaries (nu_plugin_inc.exe)
# run: strip target/release/nu_plugin_inc.exe
# - name: Strip binaries (nu_plugin_match.exe)
# run: strip target/release/nu_plugin_match.exe
# - name: Strip binaries (nu_plugin_textview.exe)
# run: strip target/release/nu_plugin_textview.exe
# - name: Strip binaries (nu_plugin_binaryview.exe)
# run: strip target/release/nu_plugin_binaryview.exe
# - name: Strip binaries (nu_plugin_chart_bar.exe)
# run: strip target/release/nu_plugin_chart_bar.exe
# - name: Strip binaries (nu_plugin_chart_line.exe)
# run: strip target/release/nu_plugin_chart_line.exe
# - name: Strip binaries (nu_plugin_from_bson.exe)
# run: strip target/release/nu_plugin_from_bson.exe
# - name: Strip binaries (nu_plugin_from_sqlite.exe)
# run: strip target/release/nu_plugin_from_sqlite.exe
# - name: Strip binaries (nu_plugin_from_mp4.exe)
# run: strip target/release/nu_plugin_from_mp4.exe
# - name: Strip binaries (nu_plugin_query_json.exe)
# run: strip target/release/nu_plugin_query_json.exe
# - name: Strip binaries (nu_plugin_s3.exe)
# run: strip target/release/nu_plugin_s3.exe
# - name: Strip binaries (nu_plugin_selector.exe)
# run: strip target/release/nu_plugin_selector.exe
# - name: Strip binaries (nu_plugin_start.exe)
# run: strip target/release/nu_plugin_start.exe
# - name: Strip binaries (nu_plugin_to_bson.exe)
# run: strip target/release/nu_plugin_to_bson.exe
# - name: Strip binaries (nu_plugin_to_sqlite.exe)
# run: strip target/release/nu_plugin_to_sqlite.exe
# - name: Strip binaries (nu_plugin_tree.exe)
# run: strip target/release/nu_plugin_tree.exe
# - name: Strip binaries (nu_plugin_xpath.exe)
# run: strip target/release/nu_plugin_xpath.exe
- name: Create output directory
run: mkdir output
@ -274,7 +437,7 @@ jobs:
with:
name: windows-installer
path: ./
- name: Upload Windows installer
uses: actions/upload-release-asset@v1
env:


@ -19,11 +19,10 @@ jobs:
operations-per-run: 520
enable-statistics: true
repo-token: ${{ secrets.GITHUB_TOKEN }}
stale-issue-message: 'This issue is being marked stale because it has been open for 90 days without activity. If you feel that this is in error, please comment below and we will keep it marked as active.'
stale-pr-message: 'This PR is being marked stale because it has been open for 45 days without activity. If this PR is still active, please comment below and we will keep it marked as active.'
close-issue-message: 'This issue has been marked stale for more than 10 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
close-pr-message: 'This PR has been marked stale for more than 10 days without activity. Closing this PR, but if you are still working on it, please reopen.'
close-issue-message: 'This issue has been marked stale for more than 100000 days without activity. Closing this issue, but if you find that the issue is still valid, please reopen.'
close-pr-message: 'This PR has been marked stale for more than 100 days without activity. Closing this PR, but if you are still working on it, please reopen.'
days-before-issue-stale: 90
days-before-pr-stale: 45
days-before-issue-close: 10
days-before-pr-close: 10
days-before-issue-close: 100000
days-before-pr-close: 100
exempt-issue-labels: 'exempt,keep'

.gitpod.Dockerfile

@ -1,18 +0,0 @@
FROM gitpod/workspace-full
# Gitpod will not rebuild Nushell's dev image unless *some* change is made to this Dockerfile.
# To force a rebuild, simply increase this counter:
ENV TRIGGER_REBUILD 1
USER gitpod
RUN sudo apt-get update && \
sudo apt-get install -y \
libssl-dev \
libxcb-composite0-dev \
pkg-config \
libpython3.6 \
rust-lldb \
&& sudo rm -rf /var/lib/apt/lists/*
ENV RUST_LLDB=/usr/bin/lldb-11


@ -1,25 +0,0 @@
image:
file: .gitpod.Dockerfile
tasks:
- name: Clippy
init: cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
- name: Testing
init: cargo test --all --features=stable
- name: Build
init: cargo build --features=stable
- name: Nu
init: cargo install --path . --features=stable
command: nu
github:
prebuilds:
branches: true
pullRequestsFromForks: true
addLabel: prebuilt-in-gitpod
vscode:
extensions:
- hbenl.vscode-test-explorer@2.15.0:koqDUMWDPJzELp/hdS/lWw==
- Swellaby.vscode-rust-test-adapter@0.11.0:Xg+YeZZQiVpVUsIkH+uiiw==
- serayuzgur.crates@0.4.7:HMkoguLcXp9M3ud7ac3eIw==
- belfz.search-crates-io@1.2.1:kSLnyrOhXtYPjQpKnMr4eQ==
- bungcip.better-toml@0.3.2:3QfgGxxYtGHfJKQU7H0nEw==
- webfreak.debug@0.24.0:1zVcRsAhewYEX3/A9xjMNw==


@ -1,14 +0,0 @@
{
"version": "0.2.0",
"configurations": [
{
"type": "gdb",
"request": "launch",
"name": "Debug Rust Code",
"preLaunchTask": "cargo",
"target": "${workspaceFolder}/target/debug/nu",
"cwd": "${workspaceFolder}",
"valuesFormatting": "parseText"
}
]
}


@ -1,12 +0,0 @@
{
"tasks": [
{
"command": "cargo",
"args": [
"build"
],
"type": "process",
"label": "cargo",
}
],
}

Cargo.lock (generated)

File diff suppressed because it is too large


@ -10,7 +10,7 @@ license = "MIT"
name = "nu"
readme = "README.md"
repository = "https://github.com/nushell/nushell"
version = "0.36.0"
version = "0.44.0"
[workspace]
members = ["crates/*/"]
@ -18,38 +18,34 @@ members = ["crates/*/"]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
nu-cli = { version = "0.36.0", path="./crates/nu-cli", default-features=false }
nu-command = { version = "0.36.0", path="./crates/nu-command" }
nu-completion = { version = "0.36.0", path="./crates/nu-completion" }
nu-data = { version = "0.36.0", path="./crates/nu-data" }
nu-engine = { version = "0.36.0", path="./crates/nu-engine" }
nu-errors = { version = "0.36.0", path="./crates/nu-errors" }
nu-parser = { version = "0.36.0", path="./crates/nu-parser" }
nu-path = { version = "0.36.0", path="./crates/nu-path" }
nu-plugin = { version = "0.36.0", path="./crates/nu-plugin" }
nu-protocol = { version = "0.36.0", path="./crates/nu-protocol" }
nu-source = { version = "0.36.0", path="./crates/nu-source" }
nu-value-ext = { version = "0.36.0", path="./crates/nu-value-ext" }
nu-cli = { version = "0.44.0", path="./crates/nu-cli", default-features=false }
nu-command = { version = "0.44.0", path="./crates/nu-command" }
nu-completion = { version = "0.44.0", path="./crates/nu-completion" }
nu-data = { version = "0.44.0", path="./crates/nu-data" }
nu-engine = { version = "0.44.0", path="./crates/nu-engine" }
nu-errors = { version = "0.44.0", path="./crates/nu-errors" }
nu-parser = { version = "0.44.0", path="./crates/nu-parser" }
nu-path = { version = "0.44.0", path="./crates/nu-path" }
nu-plugin = { version = "0.44.0", path="./crates/nu-plugin" }
nu-protocol = { version = "0.44.0", path="./crates/nu-protocol" }
nu-source = { version = "0.44.0", path="./crates/nu-source" }
nu-value-ext = { version = "0.44.0", path="./crates/nu-value-ext" }
nu_plugin_binaryview = { version = "0.36.0", path="./crates/nu_plugin_binaryview", optional=true }
nu_plugin_chart = { version = "0.36.0", path="./crates/nu_plugin_chart", optional=true }
nu_plugin_fetch = { version = "0.36.0", path="./crates/nu_plugin_fetch", optional=true }
nu_plugin_from_bson = { version = "0.36.0", path="./crates/nu_plugin_from_bson", optional=true }
nu_plugin_from_sqlite = { version = "0.36.0", path="./crates/nu_plugin_from_sqlite", optional=true }
nu_plugin_inc = { version = "0.36.0", path="./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.36.0", path="./crates/nu_plugin_match", optional=true }
nu_plugin_post = { version = "0.36.0", path="./crates/nu_plugin_post", optional=true }
nu_plugin_ps = { version = "0.36.0", path="./crates/nu_plugin_ps", optional=true }
nu_plugin_query_json = { version = "0.36.0", path="./crates/nu_plugin_query_json", optional=true }
nu_plugin_s3 = { version = "0.36.0", path="./crates/nu_plugin_s3", optional=true }
nu_plugin_selector = { version = "0.36.0", path="./crates/nu_plugin_selector", optional=true }
nu_plugin_start = { version = "0.36.0", path="./crates/nu_plugin_start", optional=true }
nu_plugin_sys = { version = "0.36.0", path="./crates/nu_plugin_sys", optional=true }
nu_plugin_textview = { version = "0.36.0", path="./crates/nu_plugin_textview", optional=true }
nu_plugin_to_bson = { version = "0.36.0", path="./crates/nu_plugin_to_bson", optional=true }
nu_plugin_to_sqlite = { version = "0.36.0", path="./crates/nu_plugin_to_sqlite", optional=true }
nu_plugin_tree = { version = "0.36.0", path="./crates/nu_plugin_tree", optional=true }
nu_plugin_xpath = { version = "0.36.0", path="./crates/nu_plugin_xpath", optional=true }
nu_plugin_binaryview = { version = "0.44.0", path="./crates/nu_plugin_binaryview", optional=true }
nu_plugin_chart = { version = "0.44.0", path="./crates/nu_plugin_chart", optional=true }
nu_plugin_from_bson = { version = "0.44.0", path="./crates/nu_plugin_from_bson", optional=true }
nu_plugin_from_sqlite = { version = "0.44.0", path="./crates/nu_plugin_from_sqlite", optional=true }
nu_plugin_inc = { version = "0.44.0", path="./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.44.0", path="./crates/nu_plugin_match", optional=true }
nu_plugin_query_json = { version = "0.44.0", path="./crates/nu_plugin_query_json", optional=true }
nu_plugin_s3 = { version = "0.44.0", path="./crates/nu_plugin_s3", optional=true }
nu_plugin_selector = { version = "0.44.0", path="./crates/nu_plugin_selector", optional=true }
nu_plugin_start = { version = "0.44.0", path="./crates/nu_plugin_start", optional=true }
nu_plugin_textview = { version = "0.44.0", path="./crates/nu_plugin_textview", optional=true }
nu_plugin_to_bson = { version = "0.44.0", path="./crates/nu_plugin_to_bson", optional=true }
nu_plugin_to_sqlite = { version = "0.44.0", path="./crates/nu_plugin_to_sqlite", optional=true }
nu_plugin_tree = { version = "0.44.0", path="./crates/nu_plugin_tree", optional=true }
nu_plugin_xpath = { version = "0.44.0", path="./crates/nu_plugin_xpath", optional=true }
# Required to bootstrap the main binary
ctrlc = { version="3.1.7", optional=true }
@ -57,8 +53,7 @@ futures = { version="0.3.12", features=["compat", "io-compat"] }
itertools = "0.10.0"
[dev-dependencies]
nu-test-support = { version = "0.36.0", path="./crates/nu-test-support" }
dunce = "1.0.1"
nu-test-support = { version = "0.44.0", path="./crates/nu-test-support" }
serial_test = "0.5.1"
hamcrest2 = "0.3.0"
rstest = "0.10.0"
@ -66,6 +61,8 @@ rstest = "0.10.0"
[build-dependencies]
[features]
fetch-support = ["nu-command/fetch", "nu-command/post"]
sys-support = ["nu-command/sys", "nu-command/ps"]
ctrlc-support = ["nu-cli/ctrlc", "nu-command/ctrlc"]
rustyline-support = ["nu-cli/rustyline-support", "nu-command/rustyline-support"]
term-support = ["nu-command/term"]
@ -74,15 +71,13 @@ which-support = ["nu-command/which", "nu-engine/which"]
default = [
"nu-cli/shadow-rs",
"sys",
"ps",
"sys-support",
"ctrlc-support",
"which-support",
"term-support",
"rustyline-support",
"match",
"post",
"fetch",
"fetch-support",
"zip-support",
"dataframe",
]
@ -94,7 +89,6 @@ extra = [
"inc",
"tree",
"textview",
"clipboard-cli",
"trash-support",
"uuid-support",
"start",
@ -110,19 +104,14 @@ extra = [
wasi = ["inc", "match", "match", "tree", "rustyline-support"]
# Stable (Default)
fetch = ["nu_plugin_fetch"]
inc = ["nu_plugin_inc"]
match = ["nu_plugin_match"]
post = ["nu_plugin_post"]
ps = ["nu_plugin_ps"]
sys = ["nu_plugin_sys"]
textview = ["nu_plugin_textview"]
# Extra
binaryview = ["nu_plugin_binaryview"]
bson = ["nu_plugin_from_bson", "nu_plugin_to_bson"]
chart = ["nu_plugin_chart"]
clipboard-cli = ["nu-command/clipboard-cli"]
query-json = ["nu_plugin_query_json"]
s3 = ["nu_plugin_s3"]
selector = ["nu_plugin_selector"]
@ -136,9 +125,6 @@ tree = ["nu_plugin_tree"]
xpath = ["nu_plugin_xpath"]
zip-support = ["nu-command/zip"]
#This is disabled in extra for now
table-pager = ["nu-command/table-pager"]
#dataframe feature for nushell
dataframe = [
"nu-engine/dataframe",
@ -146,12 +132,11 @@ dataframe = [
"nu-command/dataframe",
"nu-value-ext/dataframe",
"nu-data/dataframe",
"nu_plugin_post/dataframe",
"nu_plugin_to_bson/dataframe",
]
[profile.release]
opt-level = "z" # Optimize for size.
opt-level = "s" # Optimize for size.
# Core plugins that ship with `cargo install nu` by default
# Currently, Cargo limits us to installing only one binary
@ -166,31 +151,11 @@ name = "nu_plugin_core_inc"
path = "src/plugins/nu_plugin_core_inc.rs"
required-features = ["inc"]
[[bin]]
name = "nu_plugin_core_ps"
path = "src/plugins/nu_plugin_core_ps.rs"
required-features = ["ps"]
[[bin]]
name = "nu_plugin_core_sys"
path = "src/plugins/nu_plugin_core_sys.rs"
required-features = ["sys"]
[[bin]]
name = "nu_plugin_core_fetch"
path = "src/plugins/nu_plugin_core_fetch.rs"
required-features = ["fetch"]
[[bin]]
name = "nu_plugin_core_match"
path = "src/plugins/nu_plugin_core_match.rs"
required-features = ["match"]
[[bin]]
name = "nu_plugin_core_post"
path = "src/plugins/nu_plugin_core_post.rs"
required-features = ["post"]
# Extra plugins
[[bin]]

View File

@ -1,6 +1,6 @@
MIT License
Copyright (c) 2019 - 2021 Yehuda Katz, Jonathan Turner
Copyright (c) 2019 - 2021 Nushell Project
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@ -37,10 +37,6 @@ We also have an active [Discord](https://discord.gg/NtAbbGn) and [Twitter](https
You can also find information on more specific topics in our [cookbook](https://www.nushell.sh/cookbook/).
Try it in Gitpod.
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/nushell/nushell)
## Installation
### Local
@ -68,7 +64,7 @@ cargo install nu
To install Nu via the [Windows Package Manager](https://aka.ms/winget-cli):
```shell
winget install nu
winget install nushell
```
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/installation.html#dependencies) for your platform), once you have checked out this repo with git:
@ -76,53 +72,6 @@ You can also build Nu yourself with all the bells and whistles (be sure to have
```shell
cargo build --workspace --features=extra
```
### Docker
#### Quickstart
Want to try Nu right away? Execute the following to get started.
```shell
docker run -it quay.io/nushell/nu:latest
```
#### Guide
If you want to pull a pre-built container, you can browse tags for the [nushell organization](https://quay.io/organization/nushell)
on Quay.io. Pulling a container would come down to:
```shell
docker pull quay.io/nushell/nu
docker pull quay.io/nushell/nu-base
```
Both "nu-base" and "nu" provide the nu binary, however, nu-base also includes the source code at `/code`
in the container and all dependencies.
Optionally, you can also build the containers locally using the [dockerfiles provided](docker):
To build the base image:
```shell
docker build -f docker/Dockerfile.nu-base -t nushell/nu-base .
```
And then to build the smaller container (using a Multistage build):
```shell
docker build -f docker/Dockerfile -t nushell/nu .
```
Either way, you can run either container as follows:
```shell
docker run -it nushell/nu-base
docker run -it nushell/nu
/> exit
```
The second container is a bit smaller if the size is important to you.
### Packaging status
[![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions)
@ -160,12 +109,11 @@ Commands are separated by the pipe symbol (`|`) to denote a pipeline flowing lef
0 │ assets │ Dir │ 128 B │ 5 months ago
1 │ crates │ Dir │ 704 B │ 50 mins ago
2 │ debian │ Dir │ 352 B │ 5 months ago
3 │ docker │ Dir │ 288 B │ 3 months ago
4 │ docs │ Dir │ 192 B │ 50 mins ago
5 │ images │ Dir │ 160 B │ 5 months ago
6 │ src │ Dir │ 128 B │ 1 day ago
7 │ target │ Dir │ 160 B │ 5 days ago
8 │ tests │ Dir │ 192 B │ 3 months ago
3 │ docs │ Dir │ 192 B │ 50 mins ago
4 │ images │ Dir │ 160 B │ 5 months ago
5 │ src │ Dir │ 128 B │ 1 day ago
6 │ target │ Dir │ 160 B │ 5 days ago
7 │ tests │ Dir │ 192 B │ 3 months ago
───┴────────┴──────┴───────┴──────────────
```

View File

@ -10,4 +10,4 @@ Foundational libraries are split into two kinds of crates:
Plugins are likewise also split into two types:
* Core plugins - plugins that provide part of the default experience of Nu, including access to the system properties, processes, and web-connectivity features.
* Extra plugins - these plugins run a wide range of differnt capabilities like working with different file types, charting, viewing binary data, and more.
* Extra plugins - these plugins run a wide range of different capabilities like working with different file types, charting, viewing binary data, and more.

View File

@ -9,7 +9,7 @@ description = "Library for ANSI terminal colors and styles (bold, underline)"
edition = "2018"
license = "MIT"
name = "nu-ansi-term"
version = "0.36.0"
version = "0.44.0"
[lib]
doctest = false
@ -21,7 +21,6 @@ derive_serde_style = ["serde"]
[dependencies]
overload = "0.1.1"
serde = { version="1.0.90", features=["derive"], optional=true }
itertools = "0.10.0"
# [dependencies.serde]
# version = "1.0.90"

View File

@ -93,7 +93,7 @@ pub static RESET: &str = "\x1B[0m";
impl Color {
fn write_foreground_code<W: AnyWrite + ?Sized>(&self, f: &mut W) -> Result<(), W::Error> {
match *self {
match self {
Color::Black => write!(f, "30"),
Color::Red => write!(f, "31"),
Color::Green => write!(f, "32"),
@ -103,8 +103,8 @@ impl Color {
Color::Magenta => write!(f, "35"),
Color::Cyan => write!(f, "36"),
Color::White => write!(f, "37"),
Color::Fixed(num) => write!(f, "38;5;{}", &num),
Color::Rgb(r, g, b) => write!(f, "38;2;{};{};{}", &r, &g, &b),
Color::Fixed(num) => write!(f, "38;5;{}", num),
Color::Rgb(r, g, b) => write!(f, "38;2;{};{};{}", r, g, b),
Color::DarkGray => write!(f, "90"),
Color::LightRed => write!(f, "91"),
Color::LightGreen => write!(f, "92"),
@ -118,7 +118,7 @@ impl Color {
}
fn write_background_code<W: AnyWrite + ?Sized>(&self, f: &mut W) -> Result<(), W::Error> {
match *self {
match self {
Color::Black => write!(f, "40"),
Color::Red => write!(f, "41"),
Color::Green => write!(f, "42"),
@ -128,8 +128,8 @@ impl Color {
Color::Magenta => write!(f, "45"),
Color::Cyan => write!(f, "46"),
Color::White => write!(f, "47"),
Color::Fixed(num) => write!(f, "48;5;{}", &num),
Color::Rgb(r, g, b) => write!(f, "48;2;{};{};{}", &r, &g, &b),
Color::Fixed(num) => write!(f, "48;5;{}", num),
Color::Rgb(r, g, b) => write!(f, "48;2;{};{};{}", r, g, b),
Color::DarkGray => write!(f, "100"),
Color::LightRed => write!(f, "101"),
Color::LightGreen => write!(f, "102"),

View File

@ -297,7 +297,7 @@ mod tests {
fn no_control_codes_for_plain() {
let one = Style::default().paint("one");
let two = Style::default().paint("two");
let output = format!("{}", AnsiStrings(&[one, two]));
assert_eq!(&*output, "onetwo");
let output = AnsiStrings(&[one, two]).to_string();
assert_eq!(output, "onetwo");
}
}

View File

@ -231,7 +231,6 @@
#![crate_name = "nu_ansi_term"]
#![crate_type = "rlib"]
#![crate_type = "dylib"]
#![warn(missing_copy_implementations)]
// #![warn(missing_docs)]
#![warn(trivial_casts, trivial_numeric_casts)]

View File

@ -602,18 +602,18 @@ mod serde_json_tests {
#[test]
fn color_deserialization() {
let colors = &[
let colors = [
Color::Red,
Color::Blue,
Color::Rgb(123, 123, 123),
Color::Fixed(255),
];
for color in colors.iter() {
for color in colors {
let serialized = serde_json::to_string(&color).unwrap();
let deserialized: Color = serde_json::from_str(&serialized).unwrap();
assert_eq!(color, &deserialized);
assert_eq!(color, deserialized);
}
}

View File

@ -75,6 +75,6 @@ mod test {
assert_eq!(unstyled_len(&a), 18);
let l2 = [Black.paint("st"), Red.paint("-second"), White.paint("-t")];
assert_eq!(sub_string(3, 11, &a).as_slice(), &l2);
assert_eq!(sub_string(3, 11, &a), l2);
}
}

View File

@ -4,23 +4,24 @@ description = "CLI for nushell"
edition = "2018"
license = "MIT"
name = "nu-cli"
version = "0.36.0"
version = "0.44.0"
build = "build.rs"
[lib]
doctest = false
[dependencies]
nu-completion = { version = "0.36.0", path="../nu-completion" }
nu-command = { version = "0.36.0", path="../nu-command" }
nu-data = { version = "0.36.0", path="../nu-data" }
nu-engine = { version = "0.36.0", path="../nu-engine" }
nu-errors = { version = "0.36.0", path="../nu-errors" }
nu-parser = { version = "0.36.0", path="../nu-parser" }
nu-protocol = { version = "0.36.0", path="../nu-protocol" }
nu-source = { version = "0.36.0", path="../nu-source" }
nu-stream = { version = "0.36.0", path="../nu-stream" }
nu-ansi-term = { version = "0.36.0", path="../nu-ansi-term" }
nu-completion = { version = "0.44.0", path="../nu-completion" }
nu-command = { version = "0.44.0", path="../nu-command" }
nu-data = { version = "0.44.0", path="../nu-data" }
nu-engine = { version = "0.44.0", path="../nu-engine" }
nu-errors = { version = "0.44.0", path="../nu-errors" }
nu-parser = { version = "0.44.0", path="../nu-parser" }
nu-protocol = { version = "0.44.0", path="../nu-protocol" }
nu-source = { version = "0.44.0", path="../nu-source" }
nu-stream = { version = "0.44.0", path="../nu-stream" }
nu-ansi-term = { version = "0.44.0", path="../nu-ansi-term" }
nu-path = { version = "0.44.0", path="../nu-path" }
indexmap ="1.6.1"
log = "0.4.14"
@ -28,13 +29,13 @@ pretty_env_logger = "0.4.0"
strip-ansi-escapes = "0.1.0"
rustyline = { version="9.0.0", optional=true }
ctrlc = { version="3.1.7", optional=true }
shadow-rs = { version="0.6", default-features=false, optional=true }
shadow-rs = { version = "0.8.1", default-features = false, optional = true }
serde = { version="1.0.123", features=["derive"] }
serde_yaml = "0.8.16"
lazy_static = "1.4.0"
[build-dependencies]
shadow-rs = "0.6"
shadow-rs = "0.8.1"
[features]
default = ["shadow-rs"]

View File

@ -508,14 +508,9 @@ mod tests {
#[test]
fn can_use_loglevels() -> Result<(), ShellError> {
for level in &["error", "warn", "info", "debug", "trace"] {
for level in ["error", "warn", "info", "debug", "trace"] {
let ui = cli_app();
let args = format!("nu --loglevel={}", *level);
ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
let ui = cli_app();
let args = format!("nu -l {}", *level);
let args = format!("nu --loglevel={}", level);
ui.parse(&args)?;
assert_eq!(ui.loglevel().unwrap(), Ok(level.to_string()));
}
@ -530,6 +525,17 @@ mod tests {
Ok(())
}
#[test]
fn can_be_login() -> Result<(), ShellError> {
let ui = cli_app();
ui.parse("nu -l")?;
let ui = cli_app();
ui.parse("nu --login")?;
Ok(())
}
#[test]
fn can_be_passed_nu_scripts() -> Result<(), ShellError> {
let ui = cli_app();
@ -541,11 +547,11 @@ mod tests {
#[test]
fn can_use_test_binaries() -> Result<(), ShellError> {
for binarie_name in &[
for binarie_name in [
"echo_env", "cococo", "iecho", "fail", "nonu", "chop", "repeater", "meow",
] {
let ui = cli_app();
let args = format!("nu --testbin={}", *binarie_name);
let args = format!("nu --testbin={}", binarie_name);
ui.parse(&args)?;
assert_eq!(ui.testbin().unwrap(), Ok(binarie_name.to_string()));
}

View File

@ -44,7 +44,7 @@ impl Options {
}
pub fn get(&self, key: &str) -> Option<Value> {
self.inner.borrow().get(key).map(Clone::clone)
self.inner.borrow().get(key).cloned()
}
pub fn put(&self, key: &str, value: Value) {

View File

@ -67,48 +67,37 @@ impl OptionsParser for NuParser {
}
};
let value =
value
.map(|v| match k.as_ref() {
"testbin" => {
if let Ok(name) = v.as_string() {
if testbins().iter().any(|n| name == *n) {
Some(v)
} else {
Some(
UntaggedValue::Error(
ShellError::untagged_runtime_error(
format!("{} is not supported.", name),
),
)
.into_value(v.tag),
)
}
} else {
Some(v)
}
let value = value.map(|v| match k.as_ref() {
"testbin" => {
if let Ok(name) = v.as_string() {
if testbins().iter().any(|n| name == *n) {
v
} else {
UntaggedValue::Error(ShellError::untagged_runtime_error(
format!("{} is not supported.", name),
))
.into_value(v.tag)
}
"loglevel" => {
if let Ok(name) = v.as_string() {
if loglevels().iter().any(|n| name == *n) {
Some(v)
} else {
Some(
UntaggedValue::Error(
ShellError::untagged_runtime_error(
format!("{} is not supported.", name),
),
)
.into_value(v.tag),
)
}
} else {
Some(v)
}
} else {
v
}
}
"loglevel" => {
if let Ok(name) = v.as_string() {
if loglevels().iter().any(|n| name == *n) {
v
} else {
UntaggedValue::Error(ShellError::untagged_runtime_error(
format!("{} is not supported.", name),
))
.into_value(v.tag)
}
_ => Some(v),
})
.flatten();
} else {
v
}
}
_ => v,
});
if let Some(value) = value {
options.put(k, value);

View File

@ -24,6 +24,7 @@ use rustyline::{self, error::ReadlineError};
use nu_errors::ShellError;
use nu_parser::ParserScope;
use nu_path::expand_tilde;
use nu_protocol::{hir::ExternalRedirection, ConfigPath, UntaggedValue, Value};
use log::trace;
@ -33,7 +34,7 @@ use std::path::PathBuf;
// Name of environment variable where the prompt could be stored
#[cfg(feature = "rustyline-support")]
const PROMPT_STRING: &str = "PROMPT_STRING";
const PROMPT_COMMAND: &str = "PROMPT_COMMAND";
pub fn search_paths() -> Vec<std::path::PathBuf> {
use std::env;
@ -54,7 +55,7 @@ pub fn search_paths() -> Vec<std::path::PathBuf> {
{
for pipeline in pipelines {
if let Ok(plugin_dir) = pipeline.as_string() {
search_paths.push(PathBuf::from(plugin_dir));
search_paths.push(expand_tilde(plugin_dir));
}
}
}
@ -90,10 +91,10 @@ pub fn run_script_file(
fn default_prompt_string(cwd: &str) -> String {
format!(
"{}{}{}{}{}{}> ",
Color::Green.bold().prefix().to_string(),
Color::Green.bold().prefix(),
cwd,
nu_ansi_term::ansi::RESET,
Color::Cyan.bold().prefix().to_string(),
Color::Cyan.bold().prefix(),
current_branch(),
nu_ansi_term::ansi::RESET
)
@ -165,7 +166,10 @@ pub fn cli(
// Store cmd duration in an env var
context.scope.add_env_var(
"CMD_DURATION_MS",
format!("{}", startup_commands_start_time.elapsed().as_millis()),
startup_commands_start_time
.elapsed()
.as_millis()
.to_string(),
);
if options.perf {
@ -298,9 +302,9 @@ pub fn cli(
let cwd = context.shell_manager().path();
// Check if the PROMPT_STRING env variable is set. This env variable
// Check if the PROMPT_COMMAND env variable is set. This env variable
// contains nu code that is used to overwrite the prompt
let colored_prompt = match context.scope.get_env(PROMPT_STRING) {
let colored_prompt = match context.scope.get_env(PROMPT_COMMAND) {
Some(env_prompt) => evaluate_prompt_string(&env_prompt, &context, &cwd),
None => {
if let Some(prompt) = &prompt {
@ -353,7 +357,7 @@ pub fn cli(
// Store cmd duration in an env var
context.scope.add_env_var(
"CMD_DURATION_MS",
format!("{}", cmd_start_time.elapsed().as_millis()),
cmd_start_time.elapsed().as_millis().to_string(),
);
match line {
@ -368,7 +372,7 @@ pub fn cli(
LineResult::ClearHistory => {
if options.save_history {
rl.clear_history();
let _ = rl.append_history(&history_path);
std::fs::remove_file(&history_path)?;
}
}
@ -397,8 +401,7 @@ pub fn cli(
.lock()
.global_config
.as_ref()
.map(|cfg| cfg.var("ctrlc_exit"))
.flatten()
.and_then(|cfg| cfg.var("ctrlc_exit"))
.map(|ctrl_c| ctrl_c.is_true())
.unwrap_or(false); // default behavior is to allow CTRL-C spamming similar to other shells

View File

@ -461,7 +461,7 @@ pub(crate) fn load_keybindings(
if let Ok(contents) = contents {
let keybindings: Keybindings = serde_yaml::from_str(&contents)?;
// eprintln!("{:#?}", keybindings);
for keybinding in keybindings.into_iter() {
for keybinding in keybindings {
let (k, b) = convert_keybinding(keybinding);
// eprintln!("{:?} {:?}", k, b);

View File

@ -1,57 +1,51 @@
[package]
authors = ["The Nu Project Contributors"]
build = "build.rs"
description = "CLI for nushell"
description = "Commands for Nushell"
edition = "2018"
license = "MIT"
name = "nu-command"
version = "0.36.0"
version = "0.44.0"
[lib]
doctest = false
[dependencies]
nu-data = { version = "0.36.0", path="../nu-data" }
nu-engine = { version = "0.36.0", path="../nu-engine" }
nu-errors = { version = "0.36.0", path="../nu-errors" }
nu-json = { version = "0.36.0", path="../nu-json" }
nu-path = { version = "0.36.0", path="../nu-path" }
nu-parser = { version = "0.36.0", path="../nu-parser" }
nu-plugin = { version = "0.36.0", path="../nu-plugin" }
nu-protocol = { version = "0.36.0", path="../nu-protocol" }
nu-serde = { version = "0.36.0", path="../nu-serde" }
nu-source = { version = "0.36.0", path="../nu-source" }
nu-stream = { version = "0.36.0", path="../nu-stream" }
nu-table = { version = "0.36.0", path="../nu-table" }
nu-test-support = { version = "0.36.0", path="../nu-test-support" }
nu-value-ext = { version = "0.36.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.36.0", path="../nu-ansi-term" }
nu-pretty-hex = { version = "0.36.0", path="../nu-pretty-hex" }
nu-data = { version = "0.44.0", path="../nu-data" }
nu-engine = { version = "0.44.0", path="../nu-engine" }
nu-errors = { version = "0.44.0", path="../nu-errors" }
nu-json = { version = "0.44.0", path="../nu-json" }
nu-path = { version = "0.44.0", path="../nu-path" }
nu-parser = { version = "0.44.0", path="../nu-parser" }
nu-plugin = { version = "0.44.0", path="../nu-plugin" }
nu-protocol = { version = "0.44.0", path="../nu-protocol" }
nu-serde = { version = "0.44.0", path="../nu-serde" }
nu-source = { version = "0.44.0", path="../nu-source" }
nu-stream = { version = "0.44.0", path="../nu-stream" }
nu-table = { version = "0.44.0", path="../nu-table" }
nu-test-support = { version = "0.44.0", path="../nu-test-support" }
nu-value-ext = { version = "0.44.0", path="../nu-value-ext" }
nu-ansi-term = { version = "0.44.0", path="../nu-ansi-term" }
nu-pretty-hex = { version = "0.44.0", path="../nu-pretty-hex" }
Inflector = "0.11"
arboard = { version="1.1.0", optional=true }
url = "2.2.1"
mime = "0.3.16"
heck = "0.4.0"
base64 = "0.13.0"
bigdecimal = { package = "bigdecimal-rs", version = "0.2.1", features = ["serde"] }
byte-unit = "4.0.9"
bytes = "1.0.1"
bigdecimal = { version = "0.3.0", features = ["serde"] }
calamine = "0.18.0"
chrono = { version="0.4.19", features=["serde"] }
chrono-tz = "0.5.3"
codespan-reporting = "0.11.0"
crossterm = { version="0.19.0", optional=true }
csv = "1.1.3"
ctrlc = { version="3.1.7", optional=true }
derive-new = "0.5.8"
directories-next = "2.0.0"
dirs-next = "2.0.0"
dtparse = "1.2.0"
dunce = "1.0.1"
eml-parser = "0.1.0"
encoding_rs = "0.8.28"
filesize = "0.2.0"
fs_extra = "1.2.0"
futures = { version="0.3.12", features=["compat", "io-compat"] }
getset = "0.1.1"
glob = "0.3.0"
htmlescape = "0.3.1"
ical = "0.7.0"
@ -61,49 +55,43 @@ lazy_static = "1.*"
log = "0.4.14"
md-5 = "0.9.1"
meval = "0.2.0"
minus = { version="3.4.0", optional=true, features=["async_std_lib", "search"] }
num-bigint = { version="0.3.1", features=["serde"] }
num-bigint = { version="0.4.3", features=["serde"] }
num-format = { version="0.4.0", features=["with-num-bigint"] }
num-traits = "0.2.14"
parking_lot = "0.11.1"
pin-utils = "0.1.0"
query_interface = "0.3.5"
quick-xml = "0.22"
rand = "0.8"
rayon = "1.5.0"
regex = "1.4.3"
reqwest = {version = "0.11", optional = true }
roxmltree = "0.14.0"
rust-embed = "5.9.0"
rustyline = { version="9.0.0", optional=true }
serde = { version="1.0.123", features=["derive"] }
serde_bytes = "0.11.5"
serde_ini = "0.2.0"
serde_json = "1.0.61"
serde_urlencoded = "0.7.0"
serde_yaml = "0.8.16"
sha2 = "0.9.3"
strip-ansi-escapes = "0.1.0"
sxd-document = "0.3.2"
sxd-xpath = "0.4.2"
sysinfo = { version = "0.23.0", optional = true }
thiserror = "1.0.26"
tempfile = "3.2.0"
term = { version="0.7.0", optional=true }
term_size = "0.3.2"
termcolor = "1.1.2"
titlecase = "1.1.0"
tokio = { version = "1", features = ["rt-multi-thread"], optional = true }
toml = "0.5.8"
trash = { version="1.3.0", optional=true }
trash = { version = "2.0.2", optional = true }
unicode-segmentation = "1.8"
url = "2.2.0"
uuid_crate = { package="uuid", version="0.8.2", features=["v4"], optional=true }
which = { version="4.1.0", optional=true }
zip = { version="0.5.9", optional=true }
digest = "0.9.0"
[dependencies.polars]
version = "0.15.1"
version = "0.17.0"
optional = true
features = ["parquet", "json", "random", "pivot", "strings", "is_in", "temporal"]
default-features = false
features = ["docs", "zip_with", "csv-file", "temporal", "performant", "pretty_fmt", "dtype-slim", "parquet", "json", "random", "pivot", "strings", "is_in", "cum_agg", "rolling_window"]
[target.'cfg(unix)'.dependencies]
umask = "1.0.0"
@ -112,16 +100,11 @@ users = "0.11.0"
# TODO this will be possible with new dependency resolver
# (currently on nightly behind -Zfeatures=itarget):
# https://github.com/rust-lang/cargo/issues/7914
#[target.'cfg(not(windows))'.dependencies]
#num-format = {version = "0.4", features = ["with-system-locale"]}
[dependencies.rusqlite]
features = ["bundled", "blob"]
optional = true
version = "0.25.3"
# [target.'cfg(not(windows))'.dependencies]
# num-format = { version = "0.4", features = ["with-system-locale"] }
[build-dependencies]
shadow-rs = "0.6"
shadow-rs = "0.8.1"
[dev-dependencies]
quickcheck = "1.0.3"
@ -129,9 +112,11 @@ quickcheck_macros = "1.0.0"
hamcrest2 = "0.3.0"
[features]
clipboard-cli = ["arboard"]
rustyline-support = ["rustyline"]
stable = []
trash-support = ["trash"]
table-pager = ["minus", "crossterm"]
dataframe = ["nu-protocol/dataframe", "polars"]
fetch = ["reqwest", "tokio"]
post = ["reqwest", "tokio"]
sys = ["sysinfo"]
ps = ["sysinfo"]

View File

@ -1,7 +0,0 @@
use derive_new::new;
use nu_protocol::hir;
#[derive(new, Debug)]
pub(crate) struct Command {
pub(crate) args: hir::Call,
}

View File

@ -1,10 +1,15 @@
use crate::prelude::*;
use lazy_static::lazy_static;
use nu_engine::{evaluate_baseline_expr, BufCodecReader};
use nu_engine::{MaybeTextCodec, StringOrBinary};
use nu_test_support::NATIVE_PATH_ENV_VAR;
use parking_lot::Mutex;
use regex::Regex;
#[allow(unused)]
use std::env;
use std::io::Write;
use std::path::PathBuf;
use std::process::{Command, Stdio};
use std::sync::mpsc;
use std::{borrow::Cow, io::BufReader};
@ -17,6 +22,9 @@ use nu_protocol::hir::{ExternalCommand, ExternalRedirection};
use nu_protocol::{Primitive, ShellTypeName, UntaggedValue, Value};
use nu_source::Tag;
#[cfg(feature = "which")]
use which::which_in;
pub(crate) fn run_external_command(
command: ExternalCommand,
context: &mut EvaluationContext,
@ -38,20 +46,16 @@ pub(crate) fn run_external_command(
}
#[allow(unused)]
fn trim_double_quotes(input: &str) -> String {
fn trim_enclosing_quotes(input: &str) -> String {
let mut chars = input.chars();
match (chars.next(), chars.next_back()) {
(Some('"'), Some('"')) => chars.collect(),
(Some('\''), Some('\'')) => chars.collect(),
_ => input.to_string(),
}
}
#[allow(unused)]
fn escape_where_needed(input: &str) -> String {
input.split(' ').join("\\ ").split('\'').join("\\'")
}
fn run_with_stdin(
command: ExternalCommand,
context: &mut EvaluationContext,
@ -104,20 +108,14 @@ fn run_with_stdin(
let process_args = command_args
.iter()
.map(|(arg, _is_literal)| {
let arg = nu_path::expand_tilde_string(Cow::Borrowed(arg));
let arg = nu_path::expand_tilde(arg).to_string_lossy().to_string();
#[cfg(not(windows))]
{
if !_is_literal {
let escaped = escape_double_quotes(&arg);
add_double_quotes(&escaped)
arg
} else {
let trimmed = trim_double_quotes(&arg);
if trimmed != arg {
escape_where_needed(&trimmed)
} else {
trimmed
}
trim_enclosing_quotes(&arg)
}
}
#[cfg(windows)]
@ -125,7 +123,7 @@ fn run_with_stdin(
if let Some(unquoted) = remove_quotes(&arg) {
unquoted.to_string()
} else {
arg.to_string()
arg
}
}
})
@ -141,6 +139,99 @@ fn run_with_stdin(
)
}
/// Spawn a direct exe
#[allow(unused)]
fn spawn_exe(full_path: PathBuf, args: &[String]) -> Command {
let mut process = Command::new(full_path);
for arg in args {
process.arg(&arg);
}
process
}
/// Spawn a cmd command with `cmd /c args...`
fn spawn_cmd_command(command: &ExternalCommand, args: &[String]) -> Command {
let mut process = Command::new("cmd");
process.arg("/c");
process.arg(&command.name);
for arg in args {
// Clean the args before we use them:
// https://stackoverflow.com/questions/1200235/how-to-pass-a-quoted-pipe-character-to-cmd-exe
// cmd.exe needs to have a caret to escape a pipe
let arg = arg.replace("|", "^|");
process.arg(&arg);
}
process
}
fn has_unsafe_shell_characters(arg: &str) -> bool {
lazy_static! {
static ref RE: Regex = Regex::new(r"[^\w@%+=:,./-]").expect("regex to be valid");
}
RE.is_match(arg)
}
fn shell_arg_escape(arg: &str) -> String {
match arg {
"" => String::from("''"),
s if !has_unsafe_shell_characters(s) => String::from(s),
_ => {
let single_quotes_escaped = arg.split('\'').join("'\"'\"'");
format!("'{}'", single_quotes_escaped)
}
}
}
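
The two helpers above implement the quoting rule used before the assembled command line is handed to `sh -c`: an argument is passed through untouched only if every character is on the conservative allow-list, and otherwise it is wrapped in single quotes with any embedded single quote spliced in as `'"'"'`. A minimal, self-contained sketch of that rule (the `looks_unsafe`/`escape` names are illustrative, not part of the nushell code):

```rust
use regex::Regex;

fn looks_unsafe(arg: &str) -> bool {
    // Anything outside [A-Za-z0-9_@%+=:,./-] forces quoting.
    Regex::new(r"[^\w@%+=:,./-]").expect("valid regex").is_match(arg)
}

fn escape(arg: &str) -> String {
    match arg {
        "" => String::from("''"),
        s if !looks_unsafe(s) => String::from(s),
        // Close the single-quoted string, emit a double-quoted ', then reopen it.
        _ => format!("'{}'", arg.replace('\'', "'\"'\"'")),
    }
}

fn main() {
    assert_eq!(escape("plain-arg"), "plain-arg");    // safe chars pass through
    assert_eq!(escape("has space"), "'has space'");  // spaces force quoting
    assert_eq!(escape("it's"), "'it'\"'\"'s'");      // embedded single quote is spliced
    assert_eq!(escape(""), "''");                    // empty arg stays visible
}
```

The assertions simply document the expected mappings; an empty string is kept as `''` so it is not silently dropped from the command line.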
/// Spawn a sh command with `sh -c args...`
fn spawn_sh_command(command: &ExternalCommand, args: &[String]) -> Command {
let joined_and_escaped_arguments = args.iter().map(|arg| shell_arg_escape(arg)).join(" ");
let cmd_with_args = vec![command.name.clone(), joined_and_escaped_arguments].join(" ");
let mut process = Command::new("sh");
process.arg("-c").arg(cmd_with_args);
process
}
/// a function to spawn any external command
#[allow(unused)] // for minimal builds cwd is unused
fn spawn_any(command: &ExternalCommand, args: &[String], cwd: &str) -> Command {
// resolve the executable name if it is spawnable directly
#[cfg(feature = "which")]
// TODO add more available paths to `env::var_os("PATH")`?
if let Result::Ok(full_path) = which_in(&command.name, env::var_os("PATH"), cwd) {
if let Some(extension) = full_path.extension() {
#[cfg(windows)]
if extension.eq_ignore_ascii_case("exe") {
// if exe spawn it directly
return spawn_exe(full_path, args);
} else {
// TODO implement special care for various executable types such as .bat, .ps1, .cmd, etc
// https://github.com/mklement0/Native/blob/e0e0b8785cad39a73053e35084d1f60d87fbac58/Native.psm1#L749
// otherwise shell out to cmd
return spawn_cmd_command(command, args);
}
#[cfg(not(windows))]
if !["sh", "bash"]
.iter()
.any(|ext| extension.eq_ignore_ascii_case(ext))
{
// if exe spawn it directly
return spawn_exe(full_path, args);
} else {
// otherwise shell out to sh
return spawn_sh_command(command, args);
}
}
}
// in all the other cases shell out
if cfg!(windows) {
spawn_cmd_command(command, args)
} else {
// TODO what happens if that os doesn't support spawning sh?
spawn_sh_command(command, args)
}
}
fn spawn(
command: &ExternalCommand,
path: &str,
@ -151,31 +242,7 @@ fn spawn(
) -> Result<InputStream, ShellError> {
let command = command.clone();
let mut process = {
#[cfg(windows)]
{
let mut process = Command::new("cmd");
process.arg("/c");
process.arg(&command.name);
for arg in args {
// Clean the args before we use them:
// https://stackoverflow.com/questions/1200235/how-to-pass-a-quoted-pipe-character-to-cmd-exe
// cmd.exe needs to have a caret to escape a pipe
let arg = arg.replace("|", "^|");
process.arg(&arg);
}
process
}
#[cfg(not(windows))]
{
let cmd_with_args = vec![command.name.clone(), args.join(" ")].join(" ");
let mut process = Command::new("sh");
process.arg("-c").arg(cmd_with_args);
process
}
};
let mut process = spawn_any(&command, args, path);
process.current_dir(path);
trace!(target: "nu::run::external", "cwd = {:?}", &path);
@ -454,7 +521,7 @@ fn spawn(
Ok(stream.into_input_stream())
}
Err(e) => Err(ShellError::labeled_error(
format!("{}", e),
e.to_string(),
"failed to spawn",
&command.name_tag,
)),

View File

@ -1,5 +1 @@
mod dynamic;
pub(crate) mod external;
#[allow(unused_imports)]
pub(crate) use dynamic::Command as DynamicCommand;

View File

@ -22,6 +22,7 @@ impl WholeStreamCommand for Histogram {
None,
)
.rest(
"rest",
SyntaxShape::ColumnPath,
"column name to give the histogram's frequency column",
)

View File

@ -38,7 +38,7 @@ impl WholeStreamCommand for SubCommand {
},
Example {
description: "Set coloring options",
example: "config set color_config [[header_align header_bold]; [left $true]]",
example: "config set color_config [[header_align header_color]; [left white_bold]]",
result: None,
},
Example {

View File

@ -13,6 +13,7 @@ impl WholeStreamCommand for SubCommand {
fn signature(&self) -> Signature {
Signature::build("into binary").rest(
"rest",
SyntaxShape::ColumnPath,
"column paths to convert to binary (for table input)",
)

View File

@ -0,0 +1,118 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, Primitive, Signature, SyntaxShape, UntaggedValue, Value};
pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"into column-path"
}
fn signature(&self) -> Signature {
Signature::build("into column-path").rest(
"rest",
SyntaxShape::ColumnPath,
"values to convert to column path",
)
}
fn usage(&self) -> &str {
"Convert value to column path"
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
into_filepath(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Convert string to column path in table",
example: "echo [[name]; ['/dev/null'] ['C:\\Program Files'] ['../../Cargo.toml']] | into column-path name",
result: Some(vec![
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("/dev/null", Span::unknown()).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("C:\\Program Files", Span::unknown()).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"name".to_string() => UntaggedValue::column_path("../../Cargo.toml", Span::unknown()).into(),
})
.into(),
]),
},
Example {
description: "Convert string to column path",
example: "echo 'Cargo.toml' | into column-path",
result: Some(vec![UntaggedValue::column_path("Cargo.toml", Span::unknown()).into()]),
},
]
}
}
fn into_filepath(args: CommandArgs) -> Result<OutputStream, ShellError> {
let column_paths: Vec<ColumnPath> = args.rest(0)?;
Ok(args
.input
.map(move |v| {
if column_paths.is_empty() {
action(&v, v.tag())
} else {
let mut ret = v;
for path in &column_paths {
ret = ret.swap_data_by_column_path(
path,
Box::new(move |old| action(old, old.tag())),
)?;
}
Ok(ret)
}
})
.into_input_stream())
}
pub fn action(input: &Value, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let tag = tag.into();
match &input.value {
UntaggedValue::Primitive(prim) => Ok(UntaggedValue::column_path(
match prim {
Primitive::String(a_string) => a_string,
_ => {
return Err(ShellError::unimplemented(
"'into column-path' for non-string primitives",
))
}
},
Span::unknown(),
)
.into_value(&tag)),
UntaggedValue::Row(_) => Err(ShellError::labeled_error(
"specify column name to use, with 'into column-path COLUMN'",
"found table",
tag,
)),
_ => Err(ShellError::unimplemented(
"'into column-path' for unsupported type",
)),
}
}
#[cfg(test)]
mod tests {
use super::ShellError;
use super::SubCommand;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}

View File

@ -14,6 +14,7 @@ impl WholeStreamCommand for SubCommand {
fn signature(&self) -> Signature {
Signature::build("into path").rest(
"rest",
SyntaxShape::ColumnPath,
"column paths to convert to filepath (for table input)",
)

View File

@ -0,0 +1,182 @@
use std::convert::TryInto;
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, Primitive, Signature, SyntaxShape, UntaggedValue, Value};
use num_bigint::ToBigInt;
pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"into filesize"
}
fn signature(&self) -> Signature {
Signature::build("into filesize").rest(
"rest",
SyntaxShape::ColumnPath,
"column paths to convert to filesize (for table input)",
)
}
fn usage(&self) -> &str {
"Convert value to filesize"
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
into_filesize(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Convert string to filesize in table",
example: "echo [[bytes]; ['5'] [3.2] [4] [2kb]] | into filesize bytes",
result: Some(vec![
UntaggedValue::row(indexmap! {
"bytes".to_string() => UntaggedValue::filesize(5).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"bytes".to_string() => UntaggedValue::filesize(3).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"bytes".to_string() => UntaggedValue::filesize(4).into(),
})
.into(),
UntaggedValue::row(indexmap! {
"bytes".to_string() => UntaggedValue::filesize(2000).into(),
})
.into(),
]),
},
Example {
description: "Convert string to filesize",
example: "echo '2' | into filesize",
result: Some(vec![UntaggedValue::filesize(2).into()]),
},
Example {
description: "Convert decimal to filesize",
example: "echo 8.3 | into filesize",
result: Some(vec![UntaggedValue::filesize(8).into()]),
},
Example {
description: "Convert int to filesize",
example: "echo 5 | into filesize",
result: Some(vec![UntaggedValue::filesize(5).into()]),
},
Example {
description: "Convert file size to filesize",
example: "echo 4KB | into filesize",
result: Some(vec![UntaggedValue::filesize(4000).into()]),
},
]
}
}
fn into_filesize(args: CommandArgs) -> Result<OutputStream, ShellError> {
let column_paths: Vec<ColumnPath> = args.rest(0)?;
Ok(args
.input
.map(move |v| {
if column_paths.is_empty() {
action(&v, v.tag())
} else {
let mut ret = v;
for path in &column_paths {
ret = ret.swap_data_by_column_path(
path,
Box::new(move |old| action(old, old.tag())),
)?;
}
Ok(ret)
}
})
.into_input_stream())
}
pub fn action(input: &Value, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let tag = tag.into();
match &input.value {
UntaggedValue::Primitive(prim) => Ok(UntaggedValue::filesize(match prim {
Primitive::String(a_string) => match int_from_string(a_string.trim(), &tag) {
Ok(n) => n,
Err(e) => {
return Err(e);
}
},
Primitive::Decimal(dec) => match dec.to_bigint() {
Some(n) => match n.to_u64() {
Some(i) => i,
None => {
return Err(ShellError::unimplemented(
"failed to convert decimal to filesize",
));
}
},
None => {
return Err(ShellError::unimplemented(
"failed to convert decimal to filesize",
));
}
},
Primitive::Int(n_ref) => (*n_ref).try_into().map_err(|_| {
ShellError::unimplemented("cannot convert negative integer to filesize")
})?,
Primitive::Filesize(a_filesize) => *a_filesize,
_ => {
return Err(ShellError::unimplemented(
"'into filesize' for non-numeric primitives",
))
}
})
.into_value(&tag)),
UntaggedValue::Row(_) => Err(ShellError::labeled_error(
"specify column name to use, with 'into filesize COLUMN'",
"found table",
tag,
)),
_ => Err(ShellError::unimplemented(
"'into filesize' for unsupported type",
)),
}
}
fn int_from_string(a_string: &str, tag: &Tag) -> Result<u64, ShellError> {
match a_string.parse::<u64>() {
Ok(n) => Ok(n),
Err(_) => match a_string.parse::<f64>() {
Ok(f) => match f.to_u64() {
Some(i) => Ok(i),
None => Err(ShellError::labeled_error(
"Could not convert string value to filesize",
"original value",
tag.clone(),
)),
},
Err(_) => Err(ShellError::labeled_error(
"Could not convert string value to filesize",
"original value",
tag.clone(),
)),
},
}
}
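
`int_from_string` above falls back from an integer parse to a float parse so that inputs such as `"3.2"` still convert, truncating toward zero. A stripped-down sketch of that two-step parse (standalone; `parse_filesize` is an illustrative name and it uses a plain `as` cast rather than `to_u64` for brevity):

```rust
// Illustrative two-step parse mirroring `int_from_string` above:
// try u64 first, then fall back to f64 and truncate non-negative values.
fn parse_filesize(s: &str) -> Option<u64> {
    s.parse::<u64>().ok().or_else(|| {
        s.parse::<f64>()
            .ok()
            .filter(|f| *f >= 0.0 && f.is_finite())
            .map(|f| f as u64)
    })
}

fn main() {
    assert_eq!(parse_filesize("5"), Some(5));    // plain integer
    assert_eq!(parse_filesize("3.2"), Some(3));  // decimals truncate toward zero
    assert_eq!(parse_filesize("abc"), None);     // not a number
}
```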
#[cfg(test)]
mod tests {
use super::ShellError;
use super::SubCommand;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}

View File

@ -13,6 +13,7 @@ impl WholeStreamCommand for SubCommand {
fn signature(&self) -> Signature {
Signature::build("into int").rest(
"rest",
SyntaxShape::ColumnPath,
"column paths to convert to int (for table input)",
)

View File

@ -1,10 +1,14 @@
mod binary;
mod column_path;
mod command;
mod filepath;
mod filesize;
mod int;
pub mod string;
pub use self::filesize::SubCommand as IntoFilesize;
pub use binary::SubCommand as IntoBinary;
pub use column_path::SubCommand as IntoColumnPath;
pub use command::Command as Into;
pub use filepath::SubCommand as IntoFilepath;
pub use int::SubCommand as IntoInt;

View File

@ -20,6 +20,7 @@ impl WholeStreamCommand for SubCommand {
fn signature(&self) -> Signature {
Signature::build("into string")
.rest(
"rest",
SyntaxShape::ColumnPath,
"column paths to convert to string (for table input)",
)
@ -158,7 +159,7 @@ pub fn action(
}
fn format_int(int: i64) -> String {
format!("{}", int)
int.to_string()
// TODO once platform-specific dependencies are stable (see Cargo.toml)
// #[cfg(windows)]
@ -175,7 +176,7 @@ fn format_int(int: i64) -> String {
}
fn format_bigint(int: &BigInt) -> String {
format!("{}", int)
int.to_string()
// TODO once platform-specific dependencies are stable (see Cargo.toml)
// #[cfg(windows)]
@ -229,7 +230,7 @@ fn format_decimal(mut decimal: BigDecimal, digits: Option<u64>, group_digits: bo
let format_default_loc = |int_part: BigInt| {
let loc = Locale::en;
//TODO: when num_format is available for recent bigint, replace this with the locale-based format
let (int_str, sep) = (format!("{}", int_part), String::from(loc.decimal()));
let (int_str, sep) = (int_part.to_string(), String::from(loc.decimal()));
format!("{}{}{}", int_str, sep, dec_str)
};

View File

@ -15,7 +15,7 @@ impl WholeStreamCommand for Alias {
Signature::build("alias")
.required("name", SyntaxShape::String, "the name of the alias")
.required("equals", SyntaxShape::String, "the equals sign")
.rest(SyntaxShape::Any, "the expansion for the alias")
.rest("rest", SyntaxShape::Any, "the expansion for the alias")
}
fn usage(&self) -> &str {

View File

@ -27,7 +27,7 @@ impl WholeStreamCommand for Do {
"ignore errors as the block runs",
Some('i'),
)
.rest(SyntaxShape::Any, "the parameter(s) for the block")
.rest("rest", SyntaxShape::Any, "the parameter(s) for the block")
}
fn usage(&self) -> &str {

View File

@ -12,7 +12,7 @@ impl WholeStreamCommand for Echo {
}
fn signature(&self) -> Signature {
Signature::build("echo").rest(SyntaxShape::Any, "the values to echo")
Signature::build("echo").rest("rest", SyntaxShape::Any, "the values to echo")
}
fn usage(&self) -> &str {

View File

@ -0,0 +1,109 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Primitive, Signature, UntaggedValue, Value};
pub struct SubCommand;
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"error make"
}
fn signature(&self) -> Signature {
Signature::build("error make")
}
fn usage(&self) -> &str {
"Create an error."
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
let input = args.input;
Ok(input
.map(|value| {
make_error(&value)
.map(|err| UntaggedValue::Error(err).into_value(value.tag()))
.unwrap_or_else(|| {
UntaggedValue::Error(ShellError::untagged_runtime_error(
"Creating error value not supported.",
))
.into_value(value.tag())
})
})
.into_output_stream())
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Creates a labeled error",
example: r#"[
[ msg, labels, span];
["The message", "Helpful message here", ([[start, end]; [0, 141]])]
] | error make"#,
result: None,
}]
}
}
fn make_error(value: &Value) -> Option<ShellError> {
if let Value {
value: UntaggedValue::Row(dict),
..
} = value
{
let msg = dict.get_data_by_key("msg".spanned_unknown());
let labels =
dict.get_data_by_key("labels".spanned_unknown())
.and_then(|table| match &table.value {
UntaggedValue::Table(_) => table
.table_entries()
.map(|value| value.as_string().ok())
.collect(),
UntaggedValue::Primitive(Primitive::String(label)) => {
Some(vec![label.to_string()])
}
_ => None,
});
let _anchor = dict.get_data_by_key("tag".spanned_unknown());
let span = dict.get_data_by_key("span".spanned_unknown());
if msg.is_none() || labels.is_none() || span.is_none() {
return None;
}
let msg = msg.and_then(|msg| msg.as_string().ok());
if let Some(labels) = labels {
if labels.is_empty() {
return None;
}
return Some(ShellError::labeled_error(
msg.expect("Message will always be present."),
&labels[0],
span.map(|data| match data {
Value {
value: UntaggedValue::Row(vals),
..
} => match (vals.entries.get("start"), vals.entries.get("end")) {
(Some(start), Some(end)) => {
let start = start.as_usize().ok().unwrap_or(0);
let end = end.as_usize().ok().unwrap_or(0);
Span::new(start, end)
}
(_, _) => Span::unknown(),
},
_ => Span::unknown(),
})
.unwrap_or_else(Span::unknown),
));
}
}
None
}

View File

@ -0,0 +1,3 @@
mod make;
pub use make::SubCommand as ErrorMake;

View File

@ -0,0 +1,110 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Dictionary, Signature, SyntaxShape, UntaggedValue, Value};
pub struct Find;
impl WholeStreamCommand for Find {
fn name(&self) -> &str {
"find"
}
fn signature(&self) -> Signature {
Signature::build("find").rest("rest", SyntaxShape::String, "search term")
}
fn usage(&self) -> &str {
"Find text in the output of a previous command"
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
find(args)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Search pipeline output for multiple terms",
example: r#"ls | find toml md sh"#,
result: None,
},
Example {
description: "Search strings for term(s)",
example: r#"echo Cargo.toml | find toml"#,
result: Some(vec![Value::from("Cargo.toml")]),
},
Example {
description: "Search a number list for term(s)",
example: r#"[1 2 3 4 5] | find 5"#,
result: Some(vec![UntaggedValue::int(5).into()]),
},
Example {
description: "Search string list for term(s)",
example: r#"[moe larry curly] | find l"#,
result: Some(vec![Value::from("larry"), Value::from("curly")]),
},
]
}
}
fn row_contains(row: &Dictionary, search_terms: Vec<String>) -> bool {
for term in search_terms {
for (k, v) in &row.entries {
let key = k.to_string().trim().to_lowercase();
let value = v.convert_to_string().trim().to_lowercase();
if key.contains(&term) || value.contains(&term) {
return true;
}
}
}
false
}
fn find(args: CommandArgs) -> Result<OutputStream, ShellError> {
let rest: Vec<Value> = args.rest(0)?;
Ok(args
.input
.filter(move |row| match &row.value {
UntaggedValue::Row(row) => {
let sterms: Vec<String> = rest
.iter()
.map(|t| t.convert_to_string().trim().to_lowercase())
.collect();
row_contains(row, sterms)
}
UntaggedValue::Primitive(_p) => {
// eprint!("prim {}", p.type_name());
let sterms: Vec<String> = rest
.iter()
.map(|t| t.convert_to_string().trim().to_lowercase())
.collect();
let prim_string = &row.convert_to_string().trim().to_lowercase();
for term in sterms {
if prim_string.contains(&term) {
return true;
}
}
false
}
_ => false,
})
.into_output_stream())
}
#[cfg(test)]
mod tests {
use super::Find;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test as test_examples;
test_examples(Find {})
}
}

View File

@ -18,7 +18,11 @@ impl WholeStreamCommand for Help {
fn signature(&self) -> Signature {
Signature::build("help")
.rest(SyntaxShape::String, "the name of command to get help on")
.rest(
"rest",
SyntaxShape::String,
"the name of command to get help on",
)
.named(
"find",
SyntaxShape::String,
@ -298,7 +302,7 @@ pub fn signature_dict(signature: Signature, tag: impl Into<Tag>) -> Value {
let tag = tag.into();
let mut sig = TaggedListBuilder::new(&tag);
for arg in signature.positional.iter() {
for arg in &signature.positional {
let is_required = matches!(arg.0, PositionalType::Mandatory(_, _));
sig.push_value(for_spec(arg.0.name(), "argument", is_required, &tag));
@ -309,7 +313,7 @@ pub fn signature_dict(signature: Signature, tag: impl Into<Tag>) -> Value {
sig.push_value(for_spec("rest", "argument", is_required, &tag));
}
for (name, ty) in signature.named.iter() {
for (name, ty) in &signature.named {
match ty.0 {
NamedType::Mandatory(_, _) => sig.push_value(for_spec(name, "flag", true, &tag)),
NamedType::Optional(_, _) => sig.push_value(for_spec(name, "flag", false, &tag)),

View File

@ -100,18 +100,15 @@ fn if_command(args: CommandArgs) -> Result<OutputStream, ShellError> {
context.scope.add_vars(&condition.captured.entries);
//FIXME: should we use the scope that's brought in as well?
let condition = evaluate_baseline_expr(cond, &*context);
match condition {
let condition = evaluate_baseline_expr(cond, &context);
let result = match condition {
Ok(condition) => match condition.as_bool() {
Ok(b) => {
let result = if b {
run_block(&then_case.block, &*context, input, external_redirection)
if b {
run_block(&then_case.block, &context, input, external_redirection)
} else {
run_block(&else_case.block, &*context, input, external_redirection)
};
context.scope.exit_scope();
result
run_block(&else_case.block, &context, input, external_redirection)
}
}
Err(e) => Ok(OutputStream::from_stream(
vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(),
@ -120,13 +117,16 @@ fn if_command(args: CommandArgs) -> Result<OutputStream, ShellError> {
Err(e) => Ok(OutputStream::from_stream(
vec![UntaggedValue::Error(e).into_untagged_value()].into_iter(),
)),
}
};
context.scope.exit_scope();
result
}
#[cfg(test)]
mod tests {
use super::If;
use super::ShellError;
use nu_test_support::nu;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
@ -134,4 +134,21 @@ mod tests {
test_examples(If {})
}
#[test]
fn if_doesnt_leak_on_error() {
let actual = nu!(
".",
r#"
def test-leak [] {
let var = "hello"
if 0 == "" {echo ok} {echo not}
}
test-leak
echo $var
"#
);
assert!(actual.err.contains("unknown variable"));
}
}

View File

@ -4,6 +4,8 @@ mod def;
mod describe;
mod do_;
pub(crate) mod echo;
mod error;
mod find;
mod help;
mod history;
mod if_;
@ -27,6 +29,8 @@ pub use def::Def;
pub use describe::Describe;
pub use do_::Do;
pub use echo::Echo;
pub use error::*;
pub use find::Find;
pub use help::Help;
pub use history::History;
pub use if_::If;

View File

@ -4,7 +4,7 @@ use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_path::canonicalize;
use nu_path::canonicalize_with;
use nu_protocol::{CommandAction, ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
@ -56,7 +56,7 @@ impl WholeStreamCommand for SubCommand {
tag,
}) = load_path
{
let path = canonicalize(shell_manager.path(), load_path).map_err(|_| {
let path = canonicalize_with(load_path, shell_manager.path()).map_err(|_| {
ShellError::labeled_error(
"Cannot load plugins from directory",
"directory not found",

View File

@ -15,6 +15,7 @@ impl WholeStreamCommand for Command {
.switch("skip-plugins", "do not load plugins", None)
.switch("no-history", "don't save history", None)
.switch("perf", "show startup performance metrics", None)
.switch("login", "start Nu as if it was a login shell", Some('l'))
.named(
"commands",
SyntaxShape::String,
@ -33,7 +34,7 @@ impl WholeStreamCommand for Command {
"loglevel",
SyntaxShape::String,
"LEVEL: error, warn, info, debug, trace",
Some('l'),
None,
)
.named(
"config-file",
@ -41,7 +42,7 @@ impl WholeStreamCommand for Command {
"custom configuration source file",
None,
)
.rest(SyntaxShape::String, "source file(s) to run")
.rest("rest", SyntaxShape::String, "source file(s) to run")
}
fn usage(&self) -> &str {

View File

@ -2,11 +2,11 @@ use crate::prelude::*;
use nu_engine::{script, WholeStreamCommand};
use nu_errors::ShellError;
use nu_path::expand_path;
use nu_path::{canonicalize, canonicalize_with};
use nu_protocol::{Signature, SyntaxShape};
use nu_source::Tagged;
use std::{borrow::Cow, path::Path};
use std::path::Path;
pub struct Source;
@ -32,7 +32,7 @@ impl WholeStreamCommand for Source {
"Runs a script file in the current context."
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
source(args)
}
@ -41,14 +41,66 @@ impl WholeStreamCommand for Source {
}
}
pub fn source(args: CommandArgs) -> Result<ActionStream, ShellError> {
pub fn source(args: CommandArgs) -> Result<OutputStream, ShellError> {
let ctx = &args.context;
let filename: Tagged<String> = args.req(0)?;
let source_file = Path::new(&filename.item);
// Note: this is a special case for setting the context from a command
// In this case, if we don't set it now, we'll lose the scope that this
// variable should be set into.
let contents = std::fs::read_to_string(&expand_path(Cow::Borrowed(Path::new(&filename.item))));
let lib_dirs = &ctx
.configs()
.lock()
.global_config
.as_ref()
.map(|configuration| match configuration.var("lib_dirs") {
Some(paths) => paths
.table_entries()
.cloned()
.map(|path| path.as_string())
.collect(),
None => vec![],
});
if let Some(dir) = lib_dirs {
for lib_path in dir {
match lib_path {
Ok(name) => {
let path = if let Ok(p) = canonicalize_with(&source_file, name) {
p
} else {
continue;
};
if let Ok(contents) = std::fs::read_to_string(path) {
let result = script::run_script_standalone(contents, true, ctx, false);
if let Err(err) = result {
ctx.error(err);
}
return Ok(OutputStream::empty());
}
}
Err(reason) => {
ctx.error(reason.clone());
}
}
}
}
let path = canonicalize(source_file).map_err(|e| {
ShellError::labeled_error(
format!("Can't load source file. Reason: {}", e),
"Can't load this file",
filename.span(),
)
})?;
let contents = std::fs::read_to_string(path);
match contents {
Ok(contents) => {
let result = script::run_script_standalone(contents, true, ctx, false);
@ -56,16 +108,16 @@ pub fn source(args: CommandArgs) -> Result<ActionStream, ShellError> {
if let Err(err) = result {
ctx.error(err);
}
Ok(ActionStream::empty())
Ok(OutputStream::empty())
}
Err(_) => {
Err(e) => {
ctx.error(ShellError::labeled_error(
"Can't load file to source",
"can't load file",
format!("Can't load source file. Reason: {}", e),
"Can't load this file",
filename.span(),
));
Ok(ActionStream::empty())
Ok(OutputStream::empty())
}
}
}

View File

@ -1,7 +1,7 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{Signature, TaggedDictBuilder, UntaggedValue};
use nu_protocol::{Primitive, Signature, TaggedDictBuilder, UntaggedValue, Value};
pub struct Tags;
@ -18,37 +18,64 @@ impl WholeStreamCommand for Tags {
"Read the tags (metadata) for values."
}
fn run_with_actions(&self, args: CommandArgs) -> Result<ActionStream, ShellError> {
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
Ok(tags(args))
}
}
fn tags(args: CommandArgs) -> ActionStream {
args.input
.map(move |v| {
let mut tags = TaggedDictBuilder::new(v.tag());
{
let anchor = v.anchor();
let span = v.tag.span;
let mut dict = TaggedDictBuilder::new(v.tag());
dict.insert_untagged("start", UntaggedValue::int(span.start() as i64));
dict.insert_untagged("end", UntaggedValue::int(span.end() as i64));
tags.insert_value("span", dict.into_value());
fn build_tag_table(tag: impl Into<Tag>) -> Value {
let tag = tag.into();
let span = tag.span;
match anchor {
Some(AnchorLocation::File(source)) => {
tags.insert_untagged("anchor", UntaggedValue::string(source));
}
Some(AnchorLocation::Url(source)) => {
tags.insert_untagged("anchor", UntaggedValue::string(source));
}
_ => {}
}
TaggedDictBuilder::build(tag.clone(), |tags| {
if let Some(anchor) = anchor_as_value(&tag) {
tags.insert_value("anchor", anchor);
}
tags.insert_value(
"span",
TaggedDictBuilder::build(tag.clone(), |span_dict| {
span_dict.insert_untagged("start", UntaggedValue::int(span.start() as i64));
span_dict.insert_untagged("end", UntaggedValue::int(span.end() as i64));
}),
);
})
}
fn tags(args: CommandArgs) -> OutputStream {
if args.input.is_empty() {
OutputStream::one(build_tag_table(&args.name_tag()))
} else {
args.input
.map(move |v| build_tag_table(v.tag()))
.into_output_stream()
}
}
fn anchor_as_value(tag: &Tag) -> Option<Value> {
let anchor = tag.anchor.as_ref();
anchor.as_ref()?;
Some(TaggedDictBuilder::build(tag, |table| {
let value = match anchor {
Some(AnchorLocation::File(path)) => {
Some(("file", UntaggedValue::from(path.to_string())))
}
Some(AnchorLocation::Url(destination)) => {
Some(("url", UntaggedValue::from(destination.to_string())))
}
Some(AnchorLocation::Source(text)) => Some((
"source",
UntaggedValue::Primitive(Primitive::String(text.to_string())),
)),
None => None,
};
tags.into_value()
})
.into_action_stream()
if let Some((key, value)) = value {
table.insert_untagged(key, value);
}
}))
}
#[cfg(test)]

View File

@ -56,7 +56,7 @@ fn tutor(args: CommandArgs) -> Result<OutputStream, ShellError> {
let search: Option<String> = args.opt(0).unwrap_or(None);
let find: Option<String> = args.get_flag("find")?;
let search_space = vec![
let search_space = [
(vec!["begin"], begin_tutor()),
(
vec!["table", "tables", "row", "rows", "column", "columns"],
@ -81,6 +81,7 @@ fn tutor(args: CommandArgs) -> Result<OutputStream, ShellError> {
vec!["var", "vars", "variable", "variables"],
variable_tutor(),
),
(vec!["engine-q", "e-q"], engineq_tutor()),
(vec!["block", "blocks"], block_tutor()),
(vec!["shorthand", "shorthands"], shorthand_tutor()),
];
@ -88,7 +89,7 @@ fn tutor(args: CommandArgs) -> Result<OutputStream, ShellError> {
if let Some(find) = find {
let mut results = vec![];
for search_group in search_space {
if search_group.1.contains(&find.as_str()) {
if search_group.1.contains(&find) {
results.push(search_group.0[0].to_string())
}
}
@ -166,7 +167,7 @@ This will get the 3rd (note that `nth` is zero-based) row in the table created
by the `ls` command. You can use `nth` on any table created by other commands
as well.
You can also access the column of data in one of two ways. If you want to want
You can also access the column of data in one of two ways. If you want
to keep the column as part of a new table, you can use `select`.
```
ls | select name
This can be helpful if you want to later process these values.
The `echo` command can pair well with the `each` command which can run
code on each row, or item, of input.
You can continue to learn more about the `echo` command by running:
You can continue to learn more about the `each` command by running:
```
tutor each
```
@ -370,6 +371,29 @@ same value using:
"#
}
fn engineq_tutor() -> &'static str {
r#"
Engine-q is the upcoming engine for Nushell. Built for speed and correctness,
it also comes with a set of changes from Nushell versions prior to 0.60. To
get ready for engine-q, look for some of these changes that might impact your
current scripts:
* Engine-q now uses a few new data structures, including a record syntax
that allows you to model key-value pairs similar to JSON objects.
* Environment variables can now contain more than just strings. Structured
values are converted to strings for external commands using converters.
* `if` will now use an `else` keyword before the else block.
* We're moving from "config.toml" to "config.nu". This means startup will
now be a script file.
* `config` and its subcommands are being replaced by a record that you can
update in the shell which contains all the settings under the variable
`$config`.
* bigint/bigdecimal values are now machine i64 and f64 values
* And more, you can read more about upcoming changes in the up-to-date list
at: https://github.com/nushell/engine-q/issues/522
"#
}
fn display(tag: Tag, scope: &Scope, help: &str) -> OutputStream {
let help = help.split('`');
@ -383,7 +407,7 @@ fn display(tag: Tag, scope: &Scope, help: &str) -> OutputStream {
//TODO: support no-color mode
let colored_example = nu_engine::Painter::paint_string(item, scope, &palette);
build.push_str(&format!("{}", colored_example));
build.push_str(&colored_example);
} else {
code_mode = true;
build.push_str(item);

View File

@ -221,11 +221,6 @@ fn features_enabled() -> Vec<String> {
names.push("zip".to_string());
}
#[cfg(feature = "clipboard-cli")]
{
names.push("clipboard-cli".to_string());
}
#[cfg(feature = "trash-support")]
{
names.push("trash".to_string());

View File

@ -245,7 +245,7 @@ fn perform_groupby_aggregation(
None => &col[..],
};
res.rename(col.as_str(), new_col)
res.rename(&col, new_col)
.expect("Column is always there. Looping with known names");
}
}

View File

@ -107,7 +107,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let other: Value = args.req_named("other")?;
let axis: Tagged<String> = args.req_named("axis")?;
let axis = Axis::try_from_str(axis.item.as_str(), &axis.tag.span)?;
let axis = Axis::try_from_str(&axis.item, &axis.tag.span)?;
let df_other = match other.value {
UntaggedValue::DataFrame(df) => Ok(df),

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let res = df
.as_ref()
.column(column.item.as_ref())
.column(&column.item)
.map_err(|e| parse_polars_error::<&str>(&e, &column.tag.span, None))?;
let df = NuDataFrame::try_from_series(vec![res.clone()], &tag.span)?;

View File

@ -121,7 +121,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tail = df.as_ref().get_columns().iter().map(|col| {
let count = col.len() as f64;
let sum = match col.sum_as_series().cast_with_dtype(&DataType::Float64) {
let sum = match col.sum_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -144,7 +144,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
_ => None,
};
let min = match col.min_as_series().cast_with_dtype(&DataType::Float64) {
let min = match col.min_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -153,7 +153,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_25 = match col.quantile_as_series(0.25) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -164,7 +164,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_50 = match col.quantile_as_series(0.50) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -175,7 +175,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
let q_75 = match col.quantile_as_series(0.75) {
Ok(ca) => match ca.cast_with_dtype(&DataType::Float64) {
Ok(ca) => match ca.cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -185,7 +185,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
Err(_) => None,
};
let max = match col.max_as_series().cast_with_dtype(&DataType::Float64) {
let max = match col.max_as_series().cast(&DataType::Float64) {
Ok(ca) => match ca.get(0) {
AnyValue::Float64(v) => Some(v),
_ => None,
@ -195,7 +195,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let name = format!("{} ({})", col.name(), col.dtype());
ChunkedArray::<Float64Type>::new_from_opt_slice(
name.as_str(),
&name,
&[
Some(count),
sum,

View File
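The hunks above track a polars API rename: `cast_with_dtype(&DataType)` becomes `cast(&DataType)`. As a minimal sketch of the new call shape, written as a free function so it stands alone; `col` stands in for one of the columns the describe command iterates, and the polars version is assumed to be the one this branch pins:

```
use polars::prelude::*;

// One aggregation from the describe table: sum the column, cast the result to
// Float64 with the new `cast` call, and pull the scalar back out.
fn sum_as_f64(col: &Series) -> Option<f64> {
    match col.sum_as_series().cast(&DataType::Float64) {
        Ok(ca) => match ca.get(0) {
            AnyValue::Float64(v) => Some(v),
            _ => None,
        },
        Err(_) => None,
    }
}
```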

@ -20,7 +20,11 @@ impl WholeStreamCommand for DataFrame {
}
fn signature(&self) -> Signature {
Signature::build("dataframe drop").rest(SyntaxShape::Any, "column names to be dropped")
Signature::build("dataframe drop").rest(
"rest",
SyntaxShape::Any,
"column names to be dropped",
)
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {

View File
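This hunk, and the later `get`, `group-by`, `select`, and `sort` ones, all make the same change: the `rest` builder on `Signature` now takes a name as its first argument. A small sketch of the new form, using the same strings as the hunk (nothing here goes beyond what the diff shows):

```
use nu_protocol::{Signature, SyntaxShape};

// The rest-arguments slot is now named ("rest") before the shape and description.
fn signature() -> Signature {
    Signature::build("dataframe drop").rest(
        "rest",
        SyntaxShape::Any,
        "column names to be dropped",
    )
}
```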

@ -72,7 +72,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
.expect("using name from list of names from dataframe")
.dtype();
let dtype_str = format!("{}", dtype);
let dtype_str = dtype.to_string();
dtypes.push(Value {
value: dtype_str.into(),
tag: Tag::default(),

View File

@ -19,7 +19,11 @@ impl WholeStreamCommand for DataFrame {
}
fn signature(&self) -> Signature {
Signature::build("dataframe get").rest(SyntaxShape::Any, "column names to sort dataframe")
Signature::build("dataframe get").rest(
"rest",
SyntaxShape::Any,
"column names to sort dataframe",
)
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {

View File

@ -20,7 +20,7 @@ impl WholeStreamCommand for DataFrame {
}
fn signature(&self) -> Signature {
Signature::build("dataframe group-by").rest(SyntaxShape::Any, "groupby columns")
Signature::build("dataframe group-by").rest("rest", SyntaxShape::Any, "groupby columns")
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {

View File

@ -44,6 +44,12 @@ impl WholeStreamCommand for DataFrame {
"type of join. Inner by default",
Some('t'),
)
.named(
"suffix",
SyntaxShape::String,
"suffix for the columns of the right dataframe",
Some('s'),
)
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@ -104,6 +110,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let r_df: Value = args.req(0)?;
let l_col: Vec<Value> = args.req_named("left")?;
let r_col: Vec<Value> = args.req_named("right")?;
let r_suffix: Option<Tagged<String>> = args.get_flag("suffix")?;
let join_type_op: Option<Tagged<String>> = args.get_flag("type")?;
let join_type = match join_type_op {
@ -124,6 +131,8 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
},
};
let suffix = r_suffix.map(|s| s.item);
let (l_col_string, l_col_span) = convert_columns(&l_col, &tag)?;
let (r_col_string, r_col_span) = convert_columns(&r_col, &tag)?;
@ -142,7 +151,13 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
)?;
df.as_ref()
.join(r_df.as_ref(), &l_col_string, &r_col_string, join_type)
.join(
r_df.as_ref(),
&l_col_string,
&r_col_string,
join_type,
suffix,
)
.map_err(|e| parse_polars_error::<&str>(&e, &l_col_span, None))
}
_ => Err(ShellError::labeled_error(
@ -177,7 +192,7 @@ fn check_column_datatypes<T: AsRef<str>>(
));
}
for (l, r) in l_cols.iter().zip(r_cols.iter()) {
for (l, r) in l_cols.iter().zip(r_cols) {
let l_series = df_l
.column(l.as_ref())
.map_err(|e| parse_polars_error::<&str>(&e, l_col_span, None))?;

View File
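The new `--suffix`/`-s` flag is threaded straight into polars' `join`, which takes an `Option<String>` suffix for clashing right-hand column names. A standalone sketch of that call under the same assumption about the pinned polars version; the column names and the `_right` suffix are illustrative:

```
use polars::prelude::*;

fn main() -> Result<(), PolarsError> {
    let left = DataFrame::new(vec![
        Series::new("key", &[1i64, 2, 3]),
        Series::new("value", &[10i64, 20, 30]),
    ])?;
    let right = DataFrame::new(vec![
        Series::new("key", &[1i64, 2, 3]),
        Series::new("value", &[100i64, 200, 300]),
    ])?;

    // The trailing Option<String> is what the --suffix flag feeds in; the
    // clashing right-hand "value" column comes back as "value_right".
    let on: Vec<String> = vec!["key".into()];
    let joined = left.join(&right, &on, &on, JoinType::Inner, Some("_right".into()))?;
    println!("{}", joined);
    Ok(())
}
```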

@ -5,6 +5,7 @@ use nu_protocol::{
dataframe::{Column, NuDataFrame},
Signature, SyntaxShape, UntaggedValue, Value,
};
use nu_source::Tagged;
use super::utils::convert_columns;
@ -33,6 +34,18 @@ impl WholeStreamCommand for DataFrame {
"column names used as value columns",
Some('v'),
)
.named(
"variable_name",
SyntaxShape::String,
"optional name for variable column",
Some('r'),
)
.named(
"value_name",
SyntaxShape::String,
"optional name for value column",
Some('l'),
)
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@ -105,6 +118,9 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let id_col: Vec<Value> = args.req_named("columns")?;
let val_col: Vec<Value> = args.req_named("values")?;
let value_name: Option<Tagged<String>> = args.get_flag("value_name")?;
let variable_name: Option<Tagged<String>> = args.get_flag("variable_name")?;
let (id_col_string, id_col_span) = convert_columns(&id_col, &tag)?;
let (val_col_string, val_col_span) = convert_columns(&val_col, &tag)?;
@ -113,11 +129,21 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
check_column_datatypes(df.as_ref(), &id_col_string, &id_col_span)?;
check_column_datatypes(df.as_ref(), &val_col_string, &val_col_span)?;
let res = df
let mut res = df
.as_ref()
.melt(&id_col_string, &val_col_string)
.map_err(|e| parse_polars_error::<&str>(&e, &tag.span, None))?;
if let Some(name) = &variable_name {
res.rename("variable", &name.item)
.map_err(|e| parse_polars_error::<&str>(&e, &name.tag.span, None))?;
}
if let Some(name) = &value_name {
res.rename("value", &name.item)
.map_err(|e| parse_polars_error::<&str>(&e, &name.tag.span, None))?;
}
Ok(OutputStream::one(NuDataFrame::dataframe_to_value(res, tag)))
}

View File
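The melt command keeps polars' default `variable`/`value` output columns and only renames them afterwards when the new `--variable_name`/`--value_name` flags are given. A small standalone sketch of that sequence; the frame contents and the renamed column names are illustrative:

```
use polars::prelude::*;

fn main() -> Result<(), PolarsError> {
    let df = DataFrame::new(vec![
        Series::new("a", &["x", "y"]),
        Series::new("b", &[1i64, 2]),
        Series::new("c", &[3i64, 4]),
    ])?;

    // melt stacks "b" and "c" into the default "variable"/"value" columns...
    let id_vars: Vec<String> = vec!["a".into()];
    let value_vars: Vec<String> = vec!["b".into(), "c".into()];
    let mut res = df.melt(&id_vars, &value_vars)?;

    // ...which the optional flags then rename after the fact.
    res.rename("variable", "col")?;
    res.rename("value", "val")?;
    println!("{}", res);
    Ok(())
}
```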

@ -18,6 +18,7 @@ pub mod list;
pub mod melt;
pub mod open;
pub mod pivot;
pub mod rename;
pub mod sample;
pub mod select;
pub mod shape;
@ -52,6 +53,7 @@ pub use list::DataFrame as DataFrameList;
pub use melt::DataFrame as DataFrameMelt;
pub use open::DataFrame as DataFrameOpen;
pub use pivot::DataFrame as DataFramePivot;
pub use rename::DataFrame as DataFrameRename;
pub use sample::DataFrame as DataFrameSample;
pub use select::DataFrame as DataFrameSelect;
pub use shape::DataFrame as DataFrameShape;

View File

@ -8,7 +8,7 @@ use nu_protocol::{
};
use nu_source::Tagged;
use polars::prelude::{CsvEncoding, CsvReader, JsonReader, ParquetReader, PolarsError, SerReader};
use polars::prelude::{CsvEncoding, CsvReader, JsonReader, ParquetReader, SerReader};
use std::fs::File;
pub struct DataFrame;
@ -206,15 +206,6 @@ fn from_csv(args: CommandArgs) -> Result<polars::prelude::DataFrame, ShellError>
match csv_reader.finish() {
Ok(df) => Ok(df),
Err(e) => match e {
PolarsError::Other(_) => Err(ShellError::labeled_error_with_secondary(
"Schema error",
"Error with the inferred schema",
&file.tag.span,
"You can use the argument 'infer_schema' with a number of rows large enough to better infer the schema",
&file.tag.span,
)),
_ => Err(parse_polars_error::<&str>(&e, &file.tag.span, None)),
},
Err(e) => Err(parse_polars_error::<&str>(&e, &file.tag.span, None)),
}
}

View File

@ -100,7 +100,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let mut groupby = nu_groupby.to_groupby()?;
let pivot = groupby.pivot(pivot_col.item.as_ref(), value_col.item.as_ref());
let pivot = groupby.pivot(&pivot_col.item, &value_col.item);
let res = match op {
Operation::Mean => pivot.mean(),
@ -120,7 +120,7 @@ fn check_pivot_column(
col: &Tagged<String>,
) -> Result<(), ShellError> {
let series = df
.column(col.item.as_ref())
.column(&col.item)
.map_err(|e| parse_polars_error::<&str>(&e, &col.tag.span, None))?;
match series.dtype() {
@ -146,7 +146,7 @@ fn check_value_column(
col: &Tagged<String>,
) -> Result<(), ShellError> {
let series = df
.column(col.item.as_ref())
.column(&col.item)
.map_err(|e| parse_polars_error::<&str>(&e, &col.tag.span, None))?;
match series.dtype() {

View File

@ -0,0 +1,81 @@
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_errors::ShellError;
use nu_protocol::{
dataframe::{Column, NuDataFrame},
Signature, SyntaxShape, UntaggedValue,
};
use nu_source::Tagged;
use super::utils::parse_polars_error;
pub struct DataFrame;
impl WholeStreamCommand for DataFrame {
fn name(&self) -> &str {
"dataframe rename-col"
}
fn usage(&self) -> &str {
"[DataFrame] rename a dataframe column"
}
fn signature(&self) -> Signature {
Signature::build("dataframe rename-col")
.required("from", SyntaxShape::String, "column name to be renamed")
.required("to", SyntaxShape::String, "new column name")
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
command(args)
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Renames a dataframe column",
example: "[[a b]; [1 2] [3 4]] | dataframe to-df | dataframe rename-col a ab",
result: Some(vec![NuDataFrame::try_from_columns(
vec![
Column::new(
"ab".to_string(),
vec![UntaggedValue::int(1).into(), UntaggedValue::int(3).into()],
),
Column::new(
"b".to_string(),
vec![UntaggedValue::int(2).into(), UntaggedValue::int(4).into()],
),
],
&Span::default(),
)
.expect("simple df for test should not fail")
.into_value(Tag::default())]),
}]
}
}
fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let from: Tagged<String> = args.req(0)?;
let to: Tagged<String> = args.req(1)?;
let (mut df, df_tag) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
df.as_mut()
.rename(&from.item, &to.item)
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
Ok(OutputStream::one(df.into_value(tag)))
}
#[cfg(test)]
mod tests {
use super::DataFrame;
use super::ShellError;
#[test]
fn examples_work_as_expected() -> Result<(), ShellError> {
use crate::examples::test_dataframe as test_examples;
test_examples(DataFrame {})
}
}

View File
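Under the hood the new `dataframe rename-col` command is a thin wrapper over polars' `DataFrame::rename`. A minimal sketch of that call with an illustrative two-column frame:

```
use polars::prelude::*;

fn main() -> Result<(), PolarsError> {
    let mut df = DataFrame::new(vec![
        Series::new("a", &[1i64, 3]),
        Series::new("b", &[2i64, 4]),
    ])?;
    // Same call the command issues on the wrapped dataframe: rename "a" to "ab".
    df.rename("a", "ab")?;
    println!("{}", df);
    Ok(())
}
```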

@ -20,7 +20,7 @@ impl WholeStreamCommand for DataFrame {
}
fn signature(&self) -> Signature {
Signature::build("dataframe select").rest(SyntaxShape::Any, "selected column names")
Signature::build("dataframe select").rest("rest", SyntaxShape::Any, "selected column names")
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {

View File

@ -68,7 +68,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
})?;
let res = chunked
.contains(pattern.as_str())
.contains(&pattern.item)
.map_err(|e| parse_polars_error::<&str>(&e, &tag.span, None))?;
let df = NuDataFrame::try_from_series(vec![res.into_series()], &tag.span)?;

View File

@ -99,11 +99,11 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
));
}
let cum_type = CumType::from_str(cum_type.item.as_str(), &cum_type.tag.span)?;
let cum_type = CumType::from_str(&cum_type.item, &cum_type.tag.span)?;
let mut res = match cum_type {
CumType::Max => series.cum_max(reverse),
CumType::Min => series.cum_min(reverse),
CumType::Sum => series.cum_sum(reverse),
CumType::Max => series.cummax(reverse),
CumType::Min => series.cummin(reverse),
CumType::Sum => series.cumsum(reverse),
};
let name = format!("{}_{}", series.name(), cum_type.to_str());

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.day().into_series();

View File
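This hunk, and the identical ones in the hour/minute/month/nanosecond/ordinal/second/week/weekday/year commands below, follow polars' move from the `Date64` type to `Datetime`: the chunked-array accessor is now `datetime()`. A sketch of the new accessor as a free function, assuming the temporal features of the pinned polars build; `series` stands in for the column taken off the dataframe:

```
use polars::prelude::*;

// Pull the datetime view of the column with the new accessor and extract the
// day of month, exactly as the command does before wrapping the result back up.
fn day_of_month(series: &Series) -> Result<Series, PolarsError> {
    let casted = series.datetime()?;
    Ok(casted.day().into_series())
}
```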

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.hour().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.minute().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.month().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.nanosecond().into_series();

View File

@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.ordinal().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.second().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.week().into_series();

View File

@ -53,7 +53,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.weekday().into_series();

View File

@ -56,7 +56,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.year().into_series();

View File

@ -60,7 +60,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let mut series = df.as_series(&df_tag.span)?;
series.rename(name.item.as_ref());
series.rename(&name.item);
let df = NuDataFrame::try_from_series(vec![series], &tag.span)?;
Ok(OutputStream::one(df.into_value(df_tag)))

View File

@ -77,7 +77,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
})?;
let mut res = chunked
.replace(pattern.as_str(), replace.as_str())
.replace(&pattern.item, &replace.item)
.map_err(|e| parse_polars_error::<&str>(&e, &tag.span, None))?;
res.rename(series.name());

View File

@ -77,7 +77,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
})?;
let mut res = chunked
.replace_all(pattern.as_str(), replace.as_str())
.replace_all(&pattern.item, &replace.item)
.map_err(|e| parse_polars_error::<&str>(&e, &tag.span, None))?;
res.rename(series.name());

View File

@ -6,7 +6,7 @@ use nu_protocol::{
Signature, SyntaxShape, UntaggedValue,
};
use nu_source::Tagged;
use polars::prelude::DataType;
use polars::prelude::{DataType, RollingOptions};
enum RollType {
Min,
@ -57,7 +57,6 @@ impl WholeStreamCommand for DataFrame {
Signature::build("dataframe rolling")
.required("type", SyntaxShape::String, "rolling operation")
.required("window", SyntaxShape::Int, "Window size for rolling")
.switch("ignore_nulls", "Ignore nulls in column", Some('i'))
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {
@ -112,7 +111,6 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let roll_type: Tagged<String> = args.req(0)?;
let window_size: Tagged<i64> = args.req(1)?;
let ignore_nulls = args.has_flag("ignore_nulls");
let (df, df_tag) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
let series = df.as_series(&df_tag.span)?;
@ -125,32 +123,18 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
));
}
let roll_type = RollType::from_str(roll_type.item.as_str(), &roll_type.tag.span)?;
let roll_type = RollType::from_str(&roll_type.item, &roll_type.tag.span)?;
let rolling_opts = RollingOptions {
window_size: window_size.item as usize,
min_periods: window_size.item as usize,
weights: None,
center: false,
};
let res = match roll_type {
RollType::Max => series.rolling_max(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Min => series.rolling_min(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Sum => series.rolling_sum(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Mean => series.rolling_mean(
window_size.item as u32,
None,
ignore_nulls,
window_size.item as u32,
),
RollType::Max => series.rolling_max(rolling_opts),
RollType::Min => series.rolling_min(rolling_opts),
RollType::Sum => series.rolling_sum(rolling_opts),
RollType::Mean => series.rolling_mean(rolling_opts),
};
let mut res = res.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;

View File
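The rolling aggregations now take a single `RollingOptions` value instead of separate window/weights/ignore-nulls arguments, which is also why the `--ignore_nulls` switch disappears here. A sketch of the new call, with `series` and `window_size` standing in for the hunk's bindings:

```
use polars::prelude::*;

// Bundle the window settings once and hand them to the rolling aggregation.
fn rolling_max(series: &Series, window_size: usize) -> Result<Series, PolarsError> {
    let opts = RollingOptions {
        window_size,
        min_periods: window_size,
        weights: None,
        center: false,
    };
    series.rolling_max(opts)
}
```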

@ -78,7 +78,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let casted = match indices.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => indices
.as_ref()
.cast_with_dtype(&DataType::UInt32)
.cast(&DataType::UInt32)
.map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
_ => Err(ShellError::labeled_error_with_secondary(
"Incorrect type",

View File

@ -58,10 +58,10 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let series = df.as_series(&df_tag.span)?;
let casted = series
.date64()
.datetime()
.map_err(|e| parse_polars_error::<&str>(&e, &df_tag.span, None))?;
let res = casted.strftime(fmt.item.as_str()).into_series();
let res = casted.strftime(&fmt.item).into_series();
let df = NuDataFrame::try_from_series(vec![res], &tag.span)?;
Ok(OutputStream::one(df.into_value(df_tag)))
}

View File

@ -21,7 +21,7 @@ impl WholeStreamCommand for DataFrame {
fn signature(&self) -> Signature {
Signature::build("dataframe sort")
.switch("reverse", "invert sort", Some('r'))
.rest(SyntaxShape::Any, "column names to sort dataframe")
.rest("rest", SyntaxShape::Any, "column names to sort dataframe")
}
fn run(&self, args: CommandArgs) -> Result<OutputStream, ShellError> {

View File

@ -92,7 +92,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let casted = match series.dtype() {
DataType::UInt32 | DataType::UInt64 | DataType::Int32 | DataType::Int64 => series
.as_ref()
.cast_with_dtype(&DataType::UInt32)
.cast(&DataType::UInt32)
.map_err(|e| parse_polars_error::<&str>(&e, &value.tag.span, None)),
_ => Err(ShellError::labeled_error_with_secondary(
"Incorrect type",

View File

@ -64,22 +64,18 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let delimiter: Option<Tagged<String>> = args.get_flag("delimiter")?;
let no_header: bool = args.has_flag("no_header");
let (mut df, _) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
let (df, _) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
let mut file = File::create(&file_name.item).map_err(|e| {
ShellError::labeled_error(
"Error with file name",
format!("{}", e),
&file_name.tag.span,
)
ShellError::labeled_error("Error with file name", e.to_string(), &file_name.tag.span)
})?;
let writer = CsvWriter::new(&mut file);
let writer = if no_header {
writer.has_headers(false)
writer.has_header(false)
} else {
writer.has_headers(true)
writer.has_header(true)
};
let writer = match delimiter {
@ -103,7 +99,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
};
writer
.finish(df.as_mut())
.finish(df.as_ref())
.map_err(|e| parse_polars_error::<&str>(&e, &file_name.tag.span, None))?;
let tagged_value = Value {

View File
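Two small writer changes land here: `has_headers` becomes `has_header`, and `finish` takes a shared `&DataFrame` (hence `df.as_ref()` and the dropped `mut` binding). A standalone sketch of the new calls, assuming the csv feature of the pinned polars build; the path argument is illustrative:

```
use polars::prelude::*;
use std::fs::File;

fn write_csv(df: &DataFrame, path: &str) -> Result<(), Box<dyn std::error::Error>> {
    let mut file = File::create(path)?;
    // `has_header` (singular) replaces `has_headers`; `finish` now borrows the
    // frame immutably, so the caller no longer needs a mutable dataframe.
    CsvWriter::new(&mut file).has_header(true).finish(df)?;
    Ok(())
}
```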

@ -48,18 +48,14 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let file_name: Tagged<PathBuf> = args.req(0)?;
let (mut df, _) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
let (df, _) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;
let file = File::create(&file_name.item).map_err(|e| {
ShellError::labeled_error(
"Error with file name",
format!("{}", e),
&file_name.tag.span,
)
ShellError::labeled_error("Error with file name", e.to_string(), &file_name.tag.span)
})?;
ParquetWriter::new(file)
.finish(df.as_mut())
.finish(df.as_ref())
.map_err(|e| parse_polars_error::<&str>(&e, &file_name.tag.span, None))?;
let tagged_value = Value {

View File

@ -46,30 +46,31 @@ pub(crate) fn parse_polars_error<T: AsRef<str>>(
span: &Span,
secondary: Option<T>,
) -> ShellError {
let (msg, label) = match e {
PolarsError::PolarsArrowError(_) => ("PolarsArrow Error", format!("{}", e)),
PolarsError::ArrowError(_) => ("Arrow Error", format!("{}", e)),
PolarsError::InvalidOperation(_) => ("Invalid Operation", format!("{}", e)),
PolarsError::DataTypeMisMatch(_) => ("Data Type Mismatch", format!("{}", e)),
PolarsError::NotFound(_) => ("Not Found", format!("{}", e)),
PolarsError::ShapeMisMatch(_) => ("Shape Mismatch", format!("{}", e)),
PolarsError::Other(_) => ("Other", format!("{}", e)),
PolarsError::OutOfBounds(_) => ("Out Of Bounds", format!("{}", e)),
PolarsError::NoSlice => ("No Slice", format!("{}", e)),
PolarsError::NoData(_) => ("No Data", format!("{}", e)),
PolarsError::ValueError(_) => ("Value Error", format!("{}", e)),
PolarsError::MemoryNotAligned => ("Memory Not Aligned", format!("{}", e)),
PolarsError::ParquetError(_) => ("Parquet Error", format!("{}", e)),
PolarsError::RandError(_) => ("Rand Error", format!("{}", e)),
PolarsError::HasNullValues(_) => ("Has Null Values", format!("{}", e)),
PolarsError::UnknownSchema(_) => ("Unknown Schema", format!("{}", e)),
PolarsError::Various(_) => ("Various", format!("{}", e)),
PolarsError::Io(_) => ("Io Error", format!("{}", e)),
PolarsError::Regex(_) => ("Regex Error", format!("{}", e)),
PolarsError::Duplicate(_) => ("Duplicate Error", format!("{}", e)),
PolarsError::ImplementationError => ("Implementation Error", format!("{}", e)),
let msg = match e {
PolarsError::PolarsArrowError(_) => "PolarsArrow Error",
PolarsError::ArrowError(_) => "Arrow Error",
PolarsError::InvalidOperation(_) => "Invalid Operation",
PolarsError::DataTypeMisMatch(_) => "Data Type Mismatch",
PolarsError::NotFound(_) => "Not Found",
PolarsError::ShapeMisMatch(_) => "Shape Mismatch",
PolarsError::ComputeError(_) => "Compute Error",
PolarsError::OutOfBounds(_) => "Out Of Bounds",
PolarsError::NoSlice => "No Slice",
PolarsError::NoData(_) => "No Data",
PolarsError::ValueError(_) => "Value Error",
PolarsError::MemoryNotAligned => "Memory Not Aligned",
PolarsError::RandError(_) => "Rand Error",
PolarsError::HasNullValues(_) => "Has Null Values",
PolarsError::UnknownSchema(_) => "Unknown Schema",
PolarsError::Various(_) => "Various",
PolarsError::Io(_) => "Io Error",
PolarsError::Regex(_) => "Regex Error",
PolarsError::Duplicate(_) => "Duplicate Error",
PolarsError::ImplementationError => "Implementation Error",
};
let label = e.to_string();
match secondary {
None => ShellError::labeled_error(msg, label, span),
Some(s) => ShellError::labeled_error_with_secondary(msg, label, span, s.as_ref(), span),

View File
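The rewrite above stops building a formatted label per error variant; the match now only picks a short heading, and the detailed label is taken once from the error's `Display` output. A condensed sketch of the same pattern (only a couple of variants are shown; the fallback arm is illustrative):

```
use polars::prelude::PolarsError;

// Pick a short heading per variant and derive the detailed label from Display.
fn describe(e: &PolarsError) -> (&'static str, String) {
    let msg = match e {
        PolarsError::NotFound(_) => "Not Found",
        PolarsError::ShapeMisMatch(_) => "Shape Mismatch",
        _ => "Polars Error",
    };
    (msg, e.to_string())
}
```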

@ -82,7 +82,7 @@ fn command(mut args: CommandArgs) -> Result<OutputStream, ShellError> {
let mut series = df.as_series(&value.tag.span)?;
let series = series.rename(name.item.as_ref()).clone();
let series = series.rename(&name.item).clone();
let (mut df, _) = NuDataFrame::try_from_stream(&mut args.input, &tag.span)?;

View File

@ -31,12 +31,12 @@ impl WholeStreamCommand for AutoenvTrust {
value: UntaggedValue::Primitive(Primitive::String(ref path)),
tag: _,
}) => {
let mut dir = fs::canonicalize(path)?;
let mut dir = nu_path::canonicalize(path)?;
dir.push(".nu-env");
dir
}
_ => {
let mut dir = fs::canonicalize(std::env::current_dir()?)?;
let mut dir = nu_path::canonicalize(std::env::current_dir()?)?;
dir.push(".nu-env");
dir
}

View File
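Both autoenv commands switch from `std::fs::canonicalize` to `nu_path::canonicalize` when resolving the trusted directory. A minimal sketch of the resolved `.nu-env` path; it assumes `nu_path::canonicalize` returns `std::io::Result<PathBuf>` like its `std::fs` counterpart, which is what the `?` in the hunk suggests:

```
use std::path::PathBuf;

// Resolve the directory through nu_path and point at its .nu-env file.
fn nu_env_file(path: &str) -> std::io::Result<PathBuf> {
    let mut dir = nu_path::canonicalize(path)?;
    dir.push(".nu-env");
    Ok(dir)
}
```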

@ -30,7 +30,7 @@ impl WholeStreamCommand for AutoenvUntrust {
value: UntaggedValue::Primitive(Primitive::String(ref path)),
tag: _,
}) => {
let mut dir = fs::canonicalize(path)?;
let mut dir = nu_path::canonicalize(path)?;
dir.push(".nu-env");
dir
}
@ -61,7 +61,7 @@ impl WholeStreamCommand for AutoenvUntrust {
let mut doc = String::new();
file.read_to_string(&mut doc)?;
let mut allowed: Trusted = toml::from_str(doc.as_str()).unwrap_or_else(|_| Trusted::new());
let mut allowed: Trusted = toml::from_str(&doc).unwrap_or_else(|_| Trusted::new());
let file_to_untrust = file_to_untrust.to_string_lossy().to_string();

View File

@ -1,5 +1,7 @@
use std::convert::TryInto;
use crate::prelude::*;
use nu_engine::{evaluate_baseline_expr, WholeStreamCommand};
use nu_engine::{evaluate_baseline_expr, EnvVar, WholeStreamCommand};
use nu_errors::ShellError;
use nu_protocol::{hir::CapturedBlock, hir::ClassifiedCommand, Signature, SyntaxShape};
@ -90,9 +92,7 @@ pub fn set_env(args: CommandArgs) -> Result<ActionStream, ShellError> {
ctx.scope.exit_scope();
let value = value?;
let value = value.as_string()?;
let value: EnvVar = value?.try_into()?;
let name = name.item;
// Note: this is a special case for setting the context from a command

View File
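Instead of flattening every environment value to a string with `as_string()`, `let-env` and `load-env` now convert the evaluated `Value` into the new `EnvVar` type via `TryInto`, so structured values survive until an external command needs them. A small sketch of that conversion; it assumes the conversion's error type is `ShellError` (or converts into it), which is what the `?` in these hunks implies:

```
use std::convert::TryInto;

use nu_engine::EnvVar;
use nu_errors::ShellError;
use nu_protocol::Value;

// Keep the evaluated value structured by converting it into EnvVar rather than
// stringifying it; the scope then stores the EnvVar as-is.
fn to_env_var(value: Value) -> Result<EnvVar, ShellError> {
    let env_var: EnvVar = value.try_into()?;
    Ok(env_var)
}
```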

@ -1,5 +1,7 @@
use std::convert::TryInto;
use crate::prelude::*;
use nu_engine::WholeStreamCommand;
use nu_engine::{EnvVar, WholeStreamCommand};
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, Value};
@ -60,14 +62,17 @@ fn load_env_from_table(
for (key, value) in value.row_entries() {
if key == "name" {
var_name = Some(value.as_string()?);
var_name = Some(value);
} else if key == "value" {
var_value = Some(value.as_string()?);
var_value = Some(value);
}
}
match (var_name, var_value) {
(Some(name), Some(value)) => ctx.scope.add_env_var(name, value),
(Some(name), Some(value)) => {
let env_var: EnvVar = value.try_into()?;
ctx.scope.add_env_var(name.as_string()?, env_var);
}
_ => {
return Err(ShellError::labeled_error(
r#"Expected each row in the table to have a "name" and "value" field."#,

Some files were not shown because too many files have changed in this diff.