Compare commits

...

1252 Commits

Author SHA1 Message Date
0522023d4c Bump to 0.16.0 (#2084) 2020-07-01 06:25:09 +12:00
9876169f5d Add dice subcommand to random command (#2082)
* Add dice subcommand to random command

* Update random dice test name

* Stream results of random dice

* Thanks Clippy!
2020-06-30 16:12:51 +12:00
ed10aafa6f Bubble errors even if pipeline isn't used (#2080) 2020-06-30 05:39:11 +12:00
bcddeb3c1f WIP (#2077) 2020-06-29 09:06:05 +12:00
3f170c7fb8 Cal improvements (#2074)
* .get() already checks for the argument, don't need to use .has() as well

* Supplying the month-names flag should also cause the months column to show up, it should not require the -m flag first
2020-06-29 05:16:10 +12:00
8d91d151bf added raw to date for string output (#2075) 2020-06-28 09:01:13 -05:00
821d44af54 Docs autoview pwd touch (#2068)
* [ADD] Add draft documentation for autoview

* [ADD] Add draft documentation for pwd

* [ADD] Add draft documentation for touch

* [MOD] Improve description and add examples

Add the use of `textview` and `binaryview`.
Add examples for single value, source file and binary file.
2020-06-28 14:22:26 +12:00
a30901ff7d added ability to request the date in a format (#2073) 2020-06-27 20:36:15 -05:00
94a1968a88 Add command for printing special characters (#2072) 2020-06-28 09:46:30 +12:00
dffc9c9b1c Properly redirect invocations (#2070)
* Properly redirect invocations

* Don't convert with-env yet, as there's a random test failure
2020-06-28 09:04:57 +12:00
8b3964f518 More ansi (#2067)
* added a few more options for ansi colors
not sure italic works, maybe it's font dependent

* fmt
2020-06-27 10:29:09 -05:00
7fed9992c9 Bump deps and touchup (#2066) 2020-06-27 19:54:31 +12:00
4e2a4236f8 Fix it expansion and add collect (#2065) 2020-06-27 17:38:19 +12:00
05781607f4 Configurable built-in prompts (#2064)
* Add ansi, do, and prompt customization

* Fix test

* Cleanups
2020-06-27 10:37:31 +12:00
6daec399e6 added match plugin and readme.txt for msi (#2063) 2020-06-26 08:39:19 -05:00
306dc89ede Add bool subcommand to random (#2061)
* Add bool subcommand to random

* Fix function name copy paste error

* Fix issue 2062: allow deserialization of a decimal

* Add bias flag to `random bool`
2020-06-26 16:51:05 +12:00
80ce8acf57 Add ThemedPalette (#1873)
* add theme module

* reorganize theme palette code

* improve tests

* move to newtype implementation for ThemeColor and fix Palette name.

* add dead code ignore for now

* fix allow dead code macro

* remove redundant import and unnecessary return

* fix ok_or clippy error

Co-authored-by: Kurtis Nusbaum <kcommiter@gmail.com>
2020-06-26 16:40:12 +12:00
8dfc90a322 Update release.yml 2020-06-26 15:55:18 +12:00
ad5e485594 Update release.yml 2020-06-26 15:24:45 +12:00
60ed40f8bd Update release.yml 2020-06-26 14:34:39 +12:00
a6228cab9e Update main.wxs (#2060) 2020-06-26 11:34:39 +12:00
1857ac69d1 updated wxs to have the right exes (#2059) 2020-06-25 17:38:32 -05:00
e33e80ab24 Update release.yml 2020-06-26 09:40:59 +12:00
d18bc78e7c Update release.yml 2020-06-26 09:28:09 +12:00
3b2a87b6d4 Update release.yml 2020-06-26 09:08:20 +12:00
62c76be7ca Update release.yml 2020-06-26 09:06:33 +12:00
733f93e673 update to make closer to volta's (#2058) 2020-06-26 08:08:59 +12:00
2c88b2fae7 Gh actions with wix (#2057)
* Added wix to gh workflow
Followed volta example

* added --nocapture to see more error detail

* move creation of wix to after we download less.exe

* moved create wix down
2020-06-26 07:30:07 +12:00
501da433d4 Gh actions with wix, added --nocapture (#2056)
* Added wix to gh workflow
Followed volta example

* added --nocapture to see more error detail
2020-06-26 06:26:48 +12:00
0e8a239ae1 Added wix to gh workflow (#2055)
Followed volta example
2020-06-26 05:51:50 +12:00
bb08a221e2 Wix addition for creating msi (#2054)
* WIP - wix

* Updated wxs to have less. Added less.exe.

* removed binary less.exe since we're downloading it
updated wxs to point to output/* for less.exe

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-26 05:50:45 +12:00
0dbe347f84 Update Cargo.lock (#2053) 2020-06-25 19:46:20 +12:00
72a21ad619 Adds random command with uuid subcommand (#2050) 2020-06-25 17:51:09 +12:00
6372d2a18c Made starship usage configurable (#2049)
Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-25 17:44:55 +12:00
4468947ad4 Add release automation with GitHub Actions (#2048) 2020-06-25 11:43:25 +12:00
93144a0132 Make mode subcommand: math mode (#2043)
* Update calculate to return a table when Value is a table

* impl mode subcommand for math

* add tests for math mode subcommand

* add table/row tests for math mode subcommand

* fix formatting
2020-06-25 05:57:27 +12:00
72f7406057 Fix cal command week-start example (#2040) 2020-06-24 17:15:19 +12:00
c56cbd0f6b Cleanup header to work like before (#2039) 2020-06-24 06:51:06 +12:00
1420cbafe4 Refactor out RunnableContext from calculate (#2037) 2020-06-24 06:24:33 +12:00
053bd926ec First pass at updating all documentation formatting and cleaning up output of examples (#2031) 2020-06-24 06:21:47 +12:00
d095cb91e4 bump which from 3 to 4.0.1 (#2035)
This release should work with reparse points like WinGet.
2020-06-22 12:58:10 -05:00
e8476d8fbb removed some comments (#2032)
removed comments that I shouldn't have left in
2020-06-22 08:30:43 -05:00
7532618bdc Add error for division by zero using "=" (#2002) (#2030)
* error for division by zero with =

* cargo fmt

* add helper zero_division_error

* tests for zero division error

* fix test names (zero div error)
2020-06-22 08:50:43 +12:00
e3e1e6f81b Adds a test case for changing drives using drive letter in Windows (#2022)
* added test case to test check switching drives using drive letter in windows

* modified test case to not use config, assert file exists
2020-06-22 07:02:58 +12:00
bce6f5a3e6 Uniq: --count flag to count occurrences (#2017)
* uniq: Add counting option (WIP!)

Usage:

fetch https://raw.githubusercontent.com/timbray/topfew/master/test/data/access-1k | lines | wrap item | uniq | sort-by count | last 10

* uniq: Add first test

* uniq: Re-enable the non-counting variant.

* uniq: Also handle primitive lines.

* uniq: Update documentation

* uniq: Final comment about error handling. Let's get some feedback

* uniq: Address review comments.

Not happy with the way I create a TypeError. There must be a cleaner
way. Anyway, good for shipping.

* uniq: Use Labeled_error as suggested by jturner in chat.

* uniq: Return error directly.

Co-authored-by: Christoph Siedentop <christoph@siedentop.name>
2020-06-21 12:22:06 +12:00
480600c465 Fix wrap of carriage returns in cells (#2027)
* Fix carriage returns in cells

* Fix carriage returns in cells
2020-06-21 09:33:58 +12:00
89c737f456 Finish move to nu-table (#2025) 2020-06-21 07:25:07 +12:00
4e83363dd3 Upgrade heim to 0.1.0-beta.3 (#2019) 2020-06-21 06:55:16 +12:00
de6d8738c4 Simplify textview match (#2024)
* simplify textview match code

* Math median tests and documentation additions (#2018)

* Add math median example and unit tests

* Update output of other all math ls command examples to keep consistent with math median output

* Fix output of math max example

* Update output of other math commands using pwd examples to keep data consistent
2020-06-20 12:16:36 -05:00
853d7e7120 Math median tests and documentation additions (#2018)
* Add math median example and unit tests

* Update output of other all math ls command examples to keep consistent with math median output

* Fix output of math max example

* Update output of other math commands using pwd examples to keep data consistent
2020-06-20 00:28:03 -05:00
b0c30098e4 Sort primitives explicitly. (#2016)
* Sort primitives explicitly.

* Write backing up test.
2020-06-19 23:34:36 -05:00
fcbaefed52 Nu table (#2015)
* WIP

* Get ready to land nu-table

* Remove unwrap
2020-06-20 15:41:53 +12:00
77e02ac1c1 Fixed grammar (#2012) 2020-06-19 20:54:25 -05:00
088901b24f Rename average to avg 2020-06-19 18:59:00 -05:00
ed7a62bca3 textview config docs (#2011)
* documentation for bat config changes

* renamed to textview, added fetch example

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-19 15:45:56 -05:00
6bfd8532e4 Bat config (#2010)
* WIP - changes to support bat config

* added bat configuration

* removed debug info

* clippy fix

* changed [bat] to [textview]

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-19 15:08:59 -05:00
bc9cc75c8a Minor Math Sum Additions (#2007)
* Move sum tests into math directory

* Move sum documentation over to math documentation

One sum example already existed in the math examples and a few of the others were outdated and didn't work, so I only moved one over, and updated their output

* Remove no-longer-in-use mod statement
2020-06-20 06:00:18 +12:00
53a6e9f0bd Convert sum command into subcommand of the math command (#2004)
* Convert sum command into subcommand of the math command

* Add bullet points to math.md documentation
2020-06-18 21:02:01 -05:00
5f9de80d9b Math#median - ability to compute median value. 2020-06-18 16:59:43 -05:00
353b33be1b Add support to allow the week day start in cal to be configured via a flag (#1996)
* Add support to allow the week day start in cal to be configurable

* Fix variable name

* Use a flag instead of a configuration setting for specifying the starting day of the week
2020-06-19 05:34:51 +12:00
96d58094cf Fix regression. skip-until 'skips' until condition is met. 2020-06-17 14:08:09 -05:00
94aac0e8dd Remove unused pattern matched tag fields. 2020-06-17 13:34:17 -05:00
9f54d238ba Refactoring and more split-by flexibility. 2020-06-17 13:34:17 -05:00
778e497903 Refactoring and more group-by flexibility. 2020-06-17 13:34:17 -05:00
6914099e28 Cat with wings (#1993)
* WIP - Modified textview to use bat crate

* use input_from_bytes_with_name instead of input_file

* removed old paging
added prettyprint on else blocks
duplicated  too much code
hard coded defaults

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-16 16:17:32 -05:00
1b6f94b46c Cal command code cleanup (#1990)
* Cal command code cleanup

* Reverting "index_map" back to "indexmap", since that is the convention used in the system
2020-06-17 08:00:49 +12:00
3d63901b3b Add 'every' command to select (or skip) every nth row (#1992)
* Add 'every' command

* Add --skip option to 'every' command

This option skips instead of selects every nth row

* Fix descriptions for 'every' command

* Add documentation for 'every' command

* Check actual filenames in 'every' command tests
2020-06-17 07:58:41 +12:00
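
As an aside, the select-versus-skip behaviour described in the entry above can be illustrated with a short Rust sketch (hypothetical code, not the actual `every` implementation):

    fn main() {
        let rows = vec!["a", "b", "c", "d", "e", "f"];
        let n = 2;

        // `every 2`: keep rows 0, 2, 4, ...
        let selected: Vec<_> = rows.iter().enumerate()
            .filter(|(i, _)| i % n == 0)
            .map(|(_, row)| *row)
            .collect();

        // `every 2 --skip`: drop rows 0, 2, 4, ... and keep the rest
        let skipped: Vec<_> = rows.iter().enumerate()
            .filter(|(i, _)| i % n != 0)
            .map(|(_, row)| *row)
            .collect();

        assert_eq!(selected, ["a", "c", "e"]);
        assert_eq!(skipped, ["b", "d", "f"]);
    }
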
eb1ada6115 issue1332 - Fix for yamls with unquoted double curly braces (#1988)
* Gnarly hardcoded fix

* Whoops remove println
2020-06-17 07:12:04 +12:00
831111edec Pass the borrow instead of clone. (#1986) 2020-06-16 05:35:24 +12:00
29ea29261d Bump to 0.15.1 (#1984) 2020-06-15 09:54:30 +12:00
ee835f75db Last batch of removing async_stream (#1983) 2020-06-15 09:00:42 +12:00
bd7ac0d48e Math Documentation (#1982)
* Adding math docs

* Add some comments to calculate

* Remove redundant message

Message already shows up in subcommands list

* Added not working example

Accidentally

* Remove table
2020-06-15 08:42:15 +12:00
d7b1480ad0 Another batch of removing async_stream (#1981) 2020-06-14 11:54:35 +12:00
86b316e930 Another batch of removing async_stream (#1979)
* Another batch of removing async_stream

* Another batch of removing async_stream

* Another batch of removing async_stream
2020-06-14 10:01:44 +12:00
a042f407c1 1882-Add min, max in addition to average for acting lists (#1969)
* Converting average.rs to math.rs

* Making some progress towards math

Examples and unit tests failing, also think I found a bug with passing in strings

* Fix typos

* Found issue with negative numbers

* Add some comments

* Split commands like in split and str_ but do not register?

* register commands in cli

* Address clippy warnings

* Fix bad examples

* Make the example failure message more helpful

* Replace unwraps

* Use compare_values to avoid coercion issues

* Remove unneeded code
2020-06-14 09:49:57 +12:00
40673e4599 Another batch of removing async_stream (#1978) 2020-06-14 07:13:36 +12:00
bcfb084d4c Remove async_stream from some commands (#1976) 2020-06-14 04:30:24 +12:00
a1fd5ad128 Updated Readme to include Roadmap project board. (#1975) 2020-06-13 08:24:30 -05:00
fe6d96e996 Another batch of converting commands away from async_stream (#1974)
* Another batch of removing async_stream

* merge master
2020-06-13 20:43:21 +12:00
e24e0242d1 Removing async_stream! from some more commands (#1973)
* Removing async_stream! from some more commands

* Fix await error

* Fix Clippy warnings
2020-06-13 20:03:13 +12:00
c959dc1ee3 Another batch of removing async_stream (#1972) 2020-06-13 16:03:39 +12:00
d82ce26b44 Another batch of removing async_stream (#1971) 2020-06-13 11:40:23 +12:00
935a5f6b9e Another batch of removing async_stream (#1970) 2020-06-12 20:34:41 +12:00
731aa6bbdd use encoding on open for #1939 (#1949)
* WIP - not compiling

* compiling but panicing

* still broken

* nearly working

* reverted deserializer_string changes
updated enter.rs and open.rs to use Option<Tagged<String>>
Accepted Clippy suggestions
Accepted fmt suggestions
Left original code from open.rs
 We may want to use some of it and only fallback to encoding.

* Don't exit when there is an unknown encoding.

* When encoding is unknown default to utf-8.

* only do encoding if the user says to it

* merged some conflicts on open

* made error messages consistent

* Updated unwrap with expect

* updated open test to pass with more descriptive err
updated enter test to not fail

* change _location to location

* changed _visitor to visitor

* Added a more verbose usage statement for encoding
Linked to docs.rs/encoding_rs for details

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-11 19:37:43 -05:00
a268e825aa Allow config to be readonly (#1967) 2020-06-12 05:50:57 +12:00
982f067d0e Proper precedence history in math (#1966) 2020-06-12 05:17:08 +12:00
a3e1a3f266 Bump start plugin to 0.15.0 (#1956) 2020-06-10 08:39:15 +12:00
e5a18eb3c2 Bump to 0.15.0 (#1955) 2020-06-10 05:33:59 +12:00
16ba274170 Bump heim dependency version. (#1954)
Most important change is a fix for processes CPU usage calculation (see https://github.com/heim-rs/heim/issues/246)
2020-06-10 04:34:05 +12:00
3bb2c9beed Rename env file to .nu-env (#1953) 2020-06-09 15:54:20 +12:00
2fa83b0bbe Pub expose InterruptibleStream and InputStream. (#1952)
This allows crate users to make sure their long-running
streams can be interrupted with ctrl-c.
2020-06-09 05:17:19 +12:00
bf459e09cb WIP: Per directory env-variables (#1943)
* Add args in .nurc file to environment

* Working dummy version

* Add add_nurc to sync_env command

* Parse .nurc file

* Delete env vars after leaving directory

* Removing vals not working, strangely

* Refactoring, add comment

* Debugging

* Debug by logging to file

* Add and remove env var behavior appears correct

However, it does not use existing code that well.

* Move work to cli.rs

* Parse config directories

* I am in a state of distress

* Rename .nurc to .nu

* Some notes for me

* Refactoring

* Removing vars works, but not done in a very nice fashion

* Refactor env_vars_to_delete

* Refactor env_vars_to_add()

* Move directory environment code to separate file

* Refactor from_config

* Restore env values

* Working?

* Working?

* Update comments and change var name

* Formatting

* Remove vars after leaving dir

* Remove notes I made

* Rename config function

* Clippy

* Cleanup and handle errors

* cargo fmt

* Better error messages, remove last (?) unwrap

* FORMAT PLZ

* Rename whitelisted_directories to allowed_directories

* Add comment to clarify how overwritten values are restored.
2020-06-08 19:55:25 +12:00
ec7ff5960d Remove async_stream! from some commands (#1951)
* Remove async_stream! from open.rs

* Ran rustfmt

* Fix Clippy warning

* Removed async_stream! from evaluate_by.rs

* Removed async_stream! from exit.rs

* Removed async_stream! from from_eml.rs

* Removed async_stream! from group_by_date.rs

* Removed async_stream! from group_by.rs

* Removed async_stream! from map_max.rs

* Removed async_stream! from to_sqlite.rs

* Removed async_stream! from to_md.rs

* Removed async_stream! from to_html.rs
2020-06-08 16:48:10 +12:00
545f70705e ISSUE-1907 Disallow invalid top level TOML (#1946)
* Do not allow invalid top-level toml

Move recursive toml conversion into a helper func

* Forgot to format

* Forgot to use helper inside collect values

Added some additional tests
2020-06-08 08:02:37 +12:00
48672f8e30 Assign variables when passed as an argument. (#1947) 2020-06-08 04:15:57 +12:00
160191e9f4 Cal updates (#1945)
* Clean up `use` statements

* Update cal code to be ready for future data coloring
2020-06-07 15:52:42 +12:00
bef9669b85 When nushell is located in a path that has a space in it, these tests break; this fixes it (#1944) 2020-06-07 15:50:52 +12:00
15e66ae065 Implement an option to show paths made of mkdir. (#1932) 2020-06-06 15:13:38 -04:00
ba6370621f Removing async_stream! from some commands (#1940)
* Removing async_stream! from some commands

* Revert row.rs code

* Simplify logic for first.rs and skip.rs
2020-06-06 19:42:06 +12:00
2a8ea88413 Bring back parse as built-in. 2020-06-04 15:21:13 -05:00
05959d6a61 Bump to latest rustyline (#1937) 2020-06-05 05:50:12 +12:00
012c99839c Moving some commands off of async stream (#1934)
* Remove async_stream from rm

* Remove async_stream from sort_by

* Remove async_stream from split_by

* Remove dbg!() statement

* Remove async_stream from uniq

* Remove async_stream from mkdir

* Don't change functions from private to public

* Clippy fixes

* Peer-review updates
2020-06-04 20:42:23 +12:00
5dd346094e Cut out a function to generate a phrase in the Flags section. (#1930) 2020-06-04 19:09:43 +12:00
b6f9d0ca58 Remove no_auto_pivot (#1931)
no_auto_pivot does not exist any longer. It was rolled into pivot_mode.
2020-06-03 12:27:54 -04:00
ae72593831 changed to-float to to-decimal (#1926)
* changed to-float to to-decimal

* changed to-float to to-decimal
2020-06-02 09:02:57 +12:00
ac22319a5d Update Cargo.lock (#1923) 2020-06-02 04:32:24 +12:00
7e8c84e394 Bump actions/checkout version from v1 to v2. (#1924) 2020-06-02 04:31:48 +12:00
ef4eefa96a Bump more deps (#1921) 2020-05-31 08:54:47 +12:00
2dc43775e3 Bump to latest heim (#1920)
* Bump to latest heim

* Fix pinning issue
2020-05-31 08:54:33 +12:00
4bdf27b173 Batch of moving commands off async_stream #3 (#1919)
* Batch of moving commands off async_stream #3

* remove commented-out section

* merge master
2020-05-31 06:31:50 +12:00
741d7b9f10 Add rm_always_trash option to config (#1869)
* Add `rm_always_trash` option to config

* Add `--permanent` flag to `rm`

* `rm`: error if both `-t` and `-p` are present

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-31 06:31:34 +12:00
ecb67fee40 ISSUE-1744-Glob support for start command (#1912)
* Possible implementation of globbing for start command

* Whoops forgot to remove Error used for debugging

* Use string lossy

* Run clippy

* Pin glob

* Better error messages

* Remove unneeded comment
2020-05-31 05:41:25 +12:00
ad43ef08e5 Support average for tables. 2020-05-30 10:33:09 -05:00
092ee127ee Batch of moving commands off async_stream (#1917) 2020-05-30 16:34:39 +12:00
b84ff99e7f Batch of moving commands off async_stream (#1916) 2020-05-30 11:36:04 +12:00
3a6a3d7409 Implement login for the fetch command (#1915) 2020-05-30 11:22:38 +12:00
48ee20782f Ensure end_filter plugin lifecycle stage gets called. 2020-05-29 04:03:25 -05:00
360e8340d1 Move run to be async (#1913) 2020-05-29 20:22:52 +12:00
fbc0dd57e9 Add plugin_dir to docs (#1911) 2020-05-29 08:46:27 +12:00
3f9871f60d Simplify plugin directory scanning (#1910) 2020-05-29 07:14:32 +12:00
fe01a223a4 Str plugin promoted to built-in Nu command. 2020-05-28 11:18:58 -05:00
0a6692ac44 Simplify parse plugin code. (#1904)
Primarily, instead of building a parse pattern enum, we just build the regex
directly, with the appropriate capture group names so that the column name
codepaths can be shared between simple and `--regex` patterns.

Also removed capture group count compared to column name count. I don't think
this codepath can possibly be reached with the regex we now use for the
simplified capture form.
2020-05-28 09:58:06 -04:00
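
For illustration, a rough sketch of that approach, assuming the `regex` crate and hypothetical helper names (this is not the plugin's actual code): a simple `{column}` pattern is compiled straight into a regex whose named capture groups carry the column names.

    use regex::Regex;

    // Hypothetical: turn "{key}={value}" into a regex with named capture groups,
    // so column names flow through the same codepath as a --regex pattern.
    fn build_regex(pattern: &str) -> String {
        let mut out = String::from(r"\A");
        let mut rest = pattern;
        while let Some(start) = rest.find('{') {
            let end = rest.find('}').expect("unclosed column name");
            out.push_str(&regex::escape(&rest[..start]));
            out.push_str(&format!("(?P<{}>.*?)", &rest[start + 1..end]));
            rest = &rest[end + 1..];
        }
        out.push_str(&regex::escape(rest));
        out.push_str(r"\z");
        out
    }

    fn main() {
        let re = Regex::new(&build_regex("{key}={value}")).unwrap();
        let caps = re.captures("PATH=/usr/bin").unwrap();
        assert_eq!(&caps["key"], "PATH");
        assert_eq!(&caps["value"], "/usr/bin");
    }
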
98a3d9fff6 Allow echo to iterate ranges (#1905) 2020-05-28 06:07:53 +12:00
e2dabecc0b Make it-expansion work when in a list (#1903) 2020-05-27 20:29:05 +12:00
49b0592e3c Implement ctrl+c for the du command (#1901)
* Implement ctrl+c for the du command

* Ignore the "too many arguments" Clippy warning
2020-05-27 16:52:20 +12:00
fa812849b8 Fix warnings and split Scope (#1902) 2020-05-27 16:50:26 +12:00
9567c1f564 Fix for inconsistency when quoted strings are used with with_env shorthand (#1900) 2020-05-26 15:03:55 -04:00
a915471b38 Cal documentation updates (#1895) 2020-05-26 07:21:36 -04:00
bf212a5a3a change the test to use the origin column (#1878) 2020-05-25 18:50:54 -04:00
f0fc9e1038 Merge pivot options (#1888) 2020-05-25 18:40:25 -04:00
cb6ccc3c5a Improve the simplified parse form. (#1875) 2020-05-25 14:19:49 -04:00
07996ea93d Remove as many of the typecasts as possible in the cal command (#1886)
* Remove as many of the typecasts as possible in the cal command

* Run rustfmt on cal.rs
2020-05-25 18:51:23 +12:00
c71510c65c Update CODE_OF_CONDUCT.md 2020-05-25 18:50:14 +12:00
9c9941cf97 Update CODE_OF_CONDUCT.md 2020-05-25 18:49:36 +12:00
005d76cf57 Fix broken ordering of args when parsing command with env vars. (#1841) 2020-05-24 19:26:27 -04:00
8a99d112fc Add --to-float to str plugin (#1872) 2020-05-24 18:11:49 -04:00
fb09d7d1a1 docs: add alias --save flag (#1874) 2020-05-24 13:42:20 -04:00
9c14fb6c02 Show error when trying to sort by invalid column (#1880)
* Show error when trying to sort by invalid column

* Added test for changes

* Addressed comments, updated test

* Removed unnecessary mutable keyword

* Changed split-column to split column after rebase from upstream
2020-05-25 05:37:08 +12:00
d488fddfe1 Add useful commands to CONTRIBUTING.md (#1865)
* Add useful commands to CONTRIBUTING.md

* Add some formatting commands
2020-05-25 05:34:26 +12:00
e1b598d478 added examples and explanation to trim (#1876)
* added examples and explanation to trim

inserted examples (taken from parse in cookbook) for lists and tables using trim

* Move `to-json` to `to json`

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-25 05:10:30 +12:00
edbecda14d Split split command to sub commands. 2020-05-24 02:02:44 -05:00
74c2daf665 Add completion for binaries on PATH (#1866) 2020-05-23 20:27:52 -04:00
aadbcf5ce8 Issue 1787 (#1827) 2020-05-23 20:08:39 -04:00
460daf029b Add space to bottom of table in 'light' mode (#1871) 2020-05-22 21:12:26 -04:00
9e6ab33fd7 Add --regex flag to parse (#1863) 2020-05-22 10:13:58 -04:00
5de30d0ae5 Tweak auto-rotate for single row output (#1861)
* added helper to convert data to strings
added ability to auto-rotate single row output
if row will be greater than terminal width

* Added pivot_to_fit config value

* Added ColumnPath to convert_to_string helper

* Figured out I had to run `cargo fmt --all -- --check`

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-05-22 04:30:58 +12:00
97b9c078b1 Fix completer handling of paths with spaces (#1858)
* Fix completer handling of paths with spaces

* Satisfy Clippy for completer

* Satisfy cargo fmt for completer
2020-05-21 08:32:21 +12:00
8dc5c34932 Save alias (#1852)
* figuring out error with lines

* make progress in printing of block

* support for external commands; fix some tiny bugs in formatting

* basic printing of block; going to experiment with bubbling raw input to the command itself to avoid potential edge cases

* remove fmt::Display impls for hir structs; bubbled raw_input to command args

* compiling checkpoint :)

* process raw input alias to remove save flag; no duplicates stored

* fix warnings; run clippy

* removed tmux log file

* fix bug in looking for same alias; changed unwraps to safe unwraps
2020-05-21 05:31:04 +12:00
3239e5055c Added a count column on the histogram command (#1853)
* Adding initial draft for the addition of the count column on the histogram command

* Update histogram documentation

* Add count column test to histogram command

* Fix error in histogram documentation
2020-05-20 18:02:36 +12:00
b22db39775 Progress readme (#1854)
* Add some progress indicators to the readme

* Add some progress indicators to the readme
2020-05-20 16:46:55 +12:00
7c61f427c6 Update minimum Rust version requirement in README (#1851)
a4c1b092ba introduced uses of `map_or` on `Result`, which was only stabilised in Rust 1.41.
2020-05-20 10:55:46 +12:00
ae8c864934 default history size to 100k (#1845) 2020-05-20 07:28:06 +12:00
ed80933806 String interpolation (#1849)
* Add string interpolation

* fix coloring

* A few more fixups + tests

* merge master again
2020-05-20 07:27:26 +12:00
ae87582cb6 Fix missing invocation errors (#1846) 2020-05-19 08:57:25 -04:00
b89976daef let format access variables also (#1842) 2020-05-19 16:20:09 +12:00
76b170cea0 Update test command (#1840) 2020-05-18 22:01:27 -04:00
Sam 3302586379 added documentation for no_auto_pivot (#1828) 2020-05-18 21:21:35 -04:00
3144dc7f93 add support for specifying max history size in config (#1829) (#1837) 2020-05-19 10:27:08 +12:00
6efabef8d3 Remove interpretation of Primitive::Nothing as the number 0. (#1836) 2020-05-18 15:18:46 -04:00
0743b69ad5 Move from language-reporting to codespan (#1825) 2020-05-19 06:44:27 +12:00
5f1136dcb0 Fix newly added examples. (#1830) 2020-05-18 11:40:44 -04:00
acf13a6fcf Add (near) automatic testing for command examples (#1777) 2020-05-18 08:56:01 -04:00
Sam 3fc4a9f142 added config check to disable auto pivoting of single row outputs (#1791)
* added config check to disable auto pivoting of single row outputs

* fixed change to use built-in boolean values
2020-05-18 20:42:22 +12:00
1d781e8797 Add docs for the shuffle command (#1824) 2020-05-18 19:13:03 +12:00
b6cdfb1b19 Remove the -n flag from shuffle (#1823) 2020-05-18 19:12:35 +12:00
334685af23 Add some examples (#1821)
* Adds some examples

* Run rustfmt

* Fixed a few descriptions in the examples
2020-05-18 19:11:37 +12:00
c475be2df8 Fix starship not getting the correct pwd (#1822) 2020-05-18 17:22:54 +12:00
6ec6eb5199 Call external correctly. 2020-05-17 23:32:55 -05:00
f18424a6f6 Remove test-bins feature. 2020-05-17 23:32:55 -05:00
d1b1438ce5 Check capture group count (#1814) 2020-05-17 14:52:17 -04:00
af6aff8ca3 Allow user to specify the indentation setting on the pretty flag for the to json command (#1818)
* Allow user to specify the indentation setting on the pretty flag for the to json command

* Use "JSON" over "json"
2020-05-18 06:48:58 +12:00
d4dd8284a6 create Palette trait (#1813)
* create Pallet trait

* correct spelling to palette

* move palette to it's own module
2020-05-18 05:48:57 +12:00
e728cecb4d add docs for start command (#1816) 2020-05-18 05:37:15 +12:00
41e1aef369 Fix the insert command (#1815) 2020-05-17 08:30:52 -04:00
e50db1a4de Adds support to pretty format the json in to json (#1812)
* Current work on the --pretty flag for to json

* Deleted notes that were pushed by accident

* Fixed some errors
2020-05-17 16:43:10 +12:00
41412bc577 Update issue templates 2020-05-17 11:30:09 +12:00
e12aa87abe Update issue templates 2020-05-17 11:28:14 +12:00
0abc94f0c6 Bump some of our dependencies (#1809) 2020-05-17 10:34:10 +12:00
48d06f40b1 Remove the old it-hacks from fetch and post (#1807) 2020-05-17 06:18:46 +12:00
f43ed23ed7 Fix parsing of invocations with a dot (#1804) 2020-05-16 19:25:18 +12:00
40ec8c41a0 Cal command updates (#1758)
* Calculate the quarter directly

* Group some data together, remove attribute to ignore Clippy warning

* Group items into structs and add methods

* Updates to cal command

* Update cal.rs

* Update cal.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-16 16:00:06 +12:00
076fde16dd Evaluation of command arguments (#1801)
* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* Finish adding the baseline refactors for argument invocation

* Finish cleanup and add test

* Add missing plugin references
2020-05-16 15:18:24 +12:00
Sam 822440d5ff added nothing value for incalculable dir sizes (#1789) 2020-05-15 12:53:18 -04:00
fc8ee8e4b9 Extracted grouping by date to it's own subcommand. (#1792) 2020-05-15 04:16:09 -05:00
5fbe5cf785 Use the directories crate instead of app_dirs (#1782)
The app_dirs crate has been abandoned for quite some time. Use the directories
crate instead, which is maintained and has more OS support.
2020-05-14 20:17:23 +12:00
f78536333a doc: add rename command page (#1781) 2020-05-14 12:36:08 +12:00
e7f08cb21d Allow external binary to register custom commands. (#1780)
This changeset contains everything that a separate binary needs to
register its own commands (including the new help function). It is
very possible that this commit misses other pub use exports, but
the contained ones work for our use cases so far.
2020-05-14 12:35:22 +12:00
f5a1d2f4fb Update README.md 2020-05-14 05:37:18 +12:00
8abbfd3ec9 Update README examples (#1779) 2020-05-14 05:36:24 +12:00
6826a9aeac doc: add more from command pages (#1778)
* doc: add from-url command page

* doc: add missing link to existing from-xml page.

* doc: add from-ini command page
2020-05-14 05:23:33 +12:00
e3b7e47515 cal: fix thursday typo (#1776) 2020-05-13 08:06:31 -04:00
196991ae1e Bump to 0.14.1 (#1772) 2020-05-13 20:03:45 +12:00
34b5e5c34e doc: add headers command page (#1775) 2020-05-13 20:03:29 +12:00
cb24a9c7ea doc: rename edit command to update (#1774) 2020-05-13 20:02:41 +12:00
9700b74407 Fix typo in config flag description (#1769) 2020-05-13 14:21:57 +12:00
803c6539eb doc: fix nth examples (#1768) 2020-05-12 16:47:45 -04:00
75edcbc0d0 Simplify mv in FilesystemShell (#1587) 2020-05-12 16:40:45 -04:00
b2eecfb110 Bump to 0.14 (#1766) 2020-05-13 04:32:51 +12:00
b0aa142542 Add examples for some more commands (#1765) 2020-05-13 03:54:29 +12:00
247d8b00f8 doc: fix prepend example definition (#1761)
It seems that the description was copy-pasted by mistake from the
append command.
2020-05-12 19:46:21 +12:00
0b520eeaf0 Add a batch of help examples (#1759) 2020-05-12 17:17:17 +12:00
c3535b5c67 It-expansion fixes (#1757)
* It-expansion fixes

* fix clippy
2020-05-12 15:58:16 +12:00
8b9a8daa1d Add a batch of help examples (#1755) 2020-05-12 13:00:55 +12:00
c5ea4a31bd Adding coloring to help examples (#1754) 2020-05-12 11:06:40 +12:00
2275575575 Improve list view and ranges (#1753) 2020-05-12 08:06:09 +12:00
c3a066eeb4 Add examples to commands (#1752)
* Pass &dyn WholeStreamCommand to get_help

* Add an optional example to the WholeStreamCommand trait

* Add an example to the alias command
2020-05-12 08:05:44 +12:00
42eb658c37 Add a simplified list view (#1749) 2020-05-11 14:47:27 +12:00
a2e9bbd358 Improve duration math and printing (#1748)
* Improve duration math and printing

* Fix test
2020-05-11 13:44:49 +12:00
8951a01e58 update cal documentation (#1746) 2020-05-11 13:19:14 +12:00
f702aae72f Don't include year and month by default, adds an option to display th… (#1745)
* Don't include year and month by default, adds an option to display the quarters of the year

* Add a test for cal that checks that year requested is in the return

* rustfmt the cal tests
2020-05-11 12:35:24 +12:00
f5e03aaf1c Add cal command (#1739)
* Add cal command

* Fix documentation to show commands on two lines

* Use bullet points on flag documentation for cal

* Dereference day before calling string method

* Silence Clippy warning regarding a function with too many arguments

* Update cal flag descriptions and documentation

* Add some tests for the cal command
2020-05-10 11:05:48 +12:00
0f0847e45b Move 'start' to use ShellError (#1743)
* Move 'start' to use ShellError

* Remove unnecessary changes

* Add missing macOS change

* Add default

* More fixed

* More fixed
2020-05-10 08:08:53 +12:00
ccd5d69fd1 Bug fix start (#1738)
* fix bug on linux; added start to the stable list

* add to stable and fix clippy lint
2020-05-10 05:28:57 +12:00
55374ee54f Fix help text for alias command. (#1742)
* Fix help text for alias command.

* Rust fmt
2020-05-09 12:16:14 -05:00
f93ff9ec33 Make grouping more flexible. (#1741) 2020-05-09 12:15:47 -05:00
9a94b3c656 start command in nushell (#1727)
* mvp for start command

* modified the signature of the start command

* parse filenames

* working model for macos is done

* refactored to read from pipes

* start command works well on macos; manual testing reveals need of --args flag support

* implemented start error; color printing of warning and errors

* ran clippy and fixed warnings

* fix a clippy lint that was caught in pipeline

* fix dead code clippy lint for windows

* add cfg annotation to import
2020-05-09 06:19:48 +12:00
e04b89f747 [Gitpod] Refactor Gitpod configuration and add full Debugging support (#1728) 2020-05-09 05:41:24 +12:00
180c1204f3 Use playground instead of depending on fixture format files. (#1726) 2020-05-07 06:58:35 -05:00
96e5fc05a3 Pick->Select rename. Integration tests changes. (#1725)
Pick->Select rename. Integration tests changes.
2020-05-07 06:03:43 -05:00
c3efdf2689 Rename edit command to update. (#1724)
Rename edit command to update.
2020-05-07 00:33:30 -05:00
27fdef5479 Read exit status before failing in failed read from stdout pipe (#1723) 2020-05-07 13:42:01 +12:00
7ce8026916 Ignore empty arguments passed to externals (#1722) 2020-05-07 09:18:56 +12:00
8a9fc6a721 Fix changing to a new Windows drive (#1721)
* Fix changing to a new Windows drive

* Update cli.rs
2020-05-07 05:51:03 +12:00
c06a692709 Bash-like shorthand with-env (#1718)
* Bash-like shorthand with-env

* fix clippy warning
2020-05-06 18:57:37 +12:00
b37e420c7c Add with-env command (#1717) 2020-05-06 15:56:31 +12:00
22e70478a4 docs/commands: add to.md, update subcommands (#1715)
This adds a top-level document for the new `to` command, with a list (of links) of all the subcommands.

All the to-* subcommands keep their filename, but the content is updated to use the new subcommand syntax.

Since not all subcommands have documentation, some items in the list are just text without a link. Also filled the list for the undocumented from* commands in the same style.

Fixes #1709
2020-05-05 20:05:23 +12:00
8ab2b92405 docs/commands: add from.md, update subcommands (#1712)
This adds a top-level document for the new `from` command, with a list of links of all the subcommands.

All the from-* subcommands keep their filename, but the content is updated to use the new subcommand syntax.

Needs matching update for to*

Ref #1709
2020-05-05 09:01:31 +12:00
3201c90647 Extend to/from usage text to indicate subcommands (#1711)
Both to and from without a subcommand only print the helptext. Expand the usage line a bit, so a glance at `help commands` indicates the existence of the subcommands and mentions some common formats.

Ref a9968046ed
Ref #1708
2020-05-05 09:00:29 +12:00
454f560eaa Properly deserialize history args (#1710) 2020-05-05 07:50:10 +12:00
d2ac506de3 Changes to allow plugins to be loaded in a multi-threaded manner (#1694)
* Changes to allow plugins to be loaded in a multi-threaded manner in order to decrease startup time.

* Ran rust fmt and clippy to find and fix first pass errors.
Updated launch.json to make debugging easier in vscode.
Also added tasks.json so tasks like clippy can be run easily.

* ran fmt again

* Delete launch.json

Remove IDE settings file

* Remove IDE settings file

* Ignore vscode IDE settings

* Cloned the context instead of Arc/Mutexing it.

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-05 06:15:24 +12:00
a9968046ed Add subcommands. Switch from-* and to-* to them (#1708) 2020-05-04 20:44:33 +12:00
453087248a Properly drain iterating pipe so we can see errors (#1707) 2020-05-04 15:29:32 +12:00
81ff598d6c Fix column bugs associated with previous refactoring (#1705)
* Fix: the symlink target column will only display if either the `full` or `with_symlink_targets` options are given

* If the metadata for every item in the size column is None, do not show the size column

* Do not show the symlink target column if the metadata is None for all the items in the table
2020-05-04 14:58:11 +12:00
d7d487de73 Basic documentation for the wrap command (#1704) 2020-05-04 04:54:21 +12:00
8d69c77989 Display either dir metadata size or dir apparent size in ls (#1696)
* Show dir size in ls command

* Add the option to show the apparent directory size via `ls --du`
2020-05-03 17:09:17 +12:00
0779a46179 docs/commands: add alias.md (#1697)
* docs/commands: add alias.md

* docs/commands/alias: drop reference to bash
2020-05-03 16:49:27 +12:00
ada92f41b4 Parse file size (byte) units in all mixes of case (#1693) 2020-05-02 14:14:07 -04:00
ef3049f5a1 Reduce some repetitive code in files.rs (#1692) 2020-05-02 11:14:29 -04:00
1dab82ffa1 Gitignore JetBrains' IDE items (#1691) 2020-05-01 09:43:53 +12:00
e9e3fac59d Remove bin.is_file() because it's expensive (#1689)
This one change takes the startup time from 2.8 seconds to 1.2 seconds in my testing on Windows.
2020-05-01 06:43:59 +12:00
7d403a6cc7 Escape some symbols in external args (#1687)
* Escape some symbols in external args

* Don't escape on Windows, which does its own

* fix warning
2020-04-30 16:54:07 +12:00
cf53264438 Table operating commands. (#1686)
* Table operating commands.

* Updated merge test for clarity.

* More clarity.

* Better like this..
2020-04-29 23:18:24 -05:00
d834708be8 Finish remove the last traces of per-item (#1685) 2020-04-30 14:23:40 +12:00
f8b4784891 Add .DS_Store to .gitignore (#1684) 2020-04-30 13:12:18 +12:00
789b28ac8a Convert if expression to match (#1683) 2020-04-30 12:59:50 +12:00
db8219e798 extend it-expansion to externals (#1682)
* extend it-expansion to externals

* trim the carriage return for external strings
2020-04-30 07:09:14 +12:00
73d5310c9c make it-expansion work through blocks when necessary (#1681) 2020-04-29 19:51:46 +12:00
8d197e1b2f Show symlink sizes in ls (#1680)
* Show symlink sizes in ls

* Reduce redundancy in the size code of ls
2020-04-29 19:23:26 +12:00
c704157bc8 Replace ichwh with which and some fixes for auto-cd (canonicalization) (#1672)
* fix: absolutize path against its parent if it was a symlink.

On Linux this happens because Rust calls readlink but doesn't canonicalize the resultant path.

* feat: playground function to create symlinks

* fix: use playground dirs

* feat: test for #1631, shift tests names

* Creation of FilesystemShell with custom location may fail

* Replace ichwh with which

* Creation of FilesystemShell with custom location may fail

* Replace ichwh with which

* fix: add ichwh again since it cannot be completely replaced

* fix: replace one more use of which
2020-04-28 05:49:53 +12:00
6abb9181d5 bump to 0.13.1 (#1670) 2020-04-27 18:50:14 +12:00
006171d669 Fix per-item run_alias (#1669)
* Fix per-item run_alias

* Fix 1609 also
2020-04-27 18:10:34 +12:00
8bd3cedce1 It expansion (#1668)
* First step in it-expansion

* Fix tests

* fix clippy warnings
2020-04-27 14:04:54 +12:00
6f2ef05195 Resolves https://github.com/nushell/nushell/issues/1658 (#1660)
For example, when running the following:

    crates/nu-cli/src

nushell currently parses this as an external command. Before running the command, we check to see if
it's a directory. If it is, we "auto cd" into that directory, otherwise we go through normal
external processing.

If we put a trailing slash on it though, shells typically interpret that as "user is explicitly
referencing directory". So

    crates/nu-cli/src/

should not be interpreted as "run an external command". We intercept a trailing slash in the head
position of a command in a pipeline as such, and inject a `cd` internal command.
2020-04-27 13:22:01 +12:00
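
In rough, hypothetical terms (a simplification, not the actual parser change), the head-of-pipeline decision described above looks like this:

    use std::path::Path;

    // Hypothetical helper: should the head of the pipeline be rewritten into an
    // internal `cd` instead of being run as an external command?
    fn should_auto_cd(head: &str) -> bool {
        // A trailing slash is an explicit directory reference; otherwise we only
        // auto-cd when the path really is an existing directory.
        head.ends_with('/') || Path::new(head).is_dir()
    }

    fn main() {
        println!("{}", should_auto_cd("crates/nu-cli/src/")); // true: trailing slash
        println!("{}", should_auto_cd("crates/nu-cli/src"));  // true only if the directory exists
    }
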
80025ea684 Rows and values can be checked for emptiness. Allows to set a value if desired. (#1665) 2020-04-26 12:30:52 -05:00
a62745eefb make trim apply to strings contained in tables (also at deeper nesting (#1664)
* make trim apply to strings contained in tables (also at deeper nesting
levels), not just top-level strings

* remove unnecessary clone (thanks clippy)
2020-04-27 05:26:02 +12:00
2ac501f88e Adds drop number of rows command (#1663)
* Fix access to columns with quoted names

* Add a drop number of rows from end of table
2020-04-26 18:34:45 +12:00
df90d9e4b6 Fix access to columns with quoted names (#1662) 2020-04-26 18:01:55 +12:00
ad7a3fd908 Add not-in: operator (#1661) 2020-04-26 17:32:17 +12:00
ad8ab5b04d Add from-eml command (#1656)
* from-eml initial ver

* Adding tests for `from-eml`

* Add eml to prepares_and_decorates_filesystem_source_files

* Sort the file order

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-04-26 16:26:35 +12:00
e7767ab7b3 suppress the parser error for an insufficient number of required arguments if the named keyword argument 'help' is given (#1659) 2020-04-26 04:45:45 +12:00
846a779516 Fix cd'ing to symlinked directories (#1651)
* fix: absolutize path against its parent if it was a symlink.

On Linux this happens because Rust calls readlink but doesn't canonicalize the resultant path.

* feat: playground function to create symlinks

* fix: use playground dirs

* feat: test for #1631, shift tests names
2020-04-25 18:09:00 +12:00
e7a4f31b38 Docs: Mention how to use str --find-replace in the docs. (#1653)
Co-authored-by: Christoph Siedentop <christoph@siedentop.name>
2020-04-25 18:07:38 +12:00
10768b6ecf str plugin can capitalize and trim strings. (#1652)
* Str plugin can capitalize.

* Str plugin can trim.
2020-04-24 16:37:58 -05:00
716c4def03 Add key_timeout config option (#1649) 2020-04-25 05:28:38 +12:00
0e510ad42b Values can be used as keys for grouping. (#1648) 2020-04-23 20:55:18 -05:00
3c222916c6 [docs]: How to run tests. (#1647)
Co-authored-by: Christoph Siedentop <christoph@siedentop.name>
2020-04-24 12:20:55 +12:00
6887554888 [docs/enter] Warn about enter opening multiple shells (#1645)
Opening a JSON with a top-level list opens one shell per list element. This can be extremely confusing to users who don't expect it.
2020-04-24 12:17:11 +12:00
d7bd77829f Bump rustyline (#1644)
* Bump rustyline

* Fix new clippy warnings

* Add pipeline command
2020-04-24 08:00:49 +12:00
9e8434326d Update azure-pipelines.yml 2020-04-24 07:23:56 +12:00
27bff35c79 Update azure-pipelines.yml 2020-04-24 07:21:27 +12:00
e2fae63a42 Update azure-pipelines.yml 2020-04-24 07:04:10 +12:00
701711eada Be more resilient with startup lines (#1642) 2020-04-23 17:22:01 +12:00
9ec2aca86f Support completion for paths with multiple dots (#1640)
* refactor: expand_path and expand_ndots now work for any string.

* refactor: refactor test and add new ones.

* refactor: convert expanded to owned string

* feat: pub export of expand_ndots

* feat: add completion for ndots in fs-shell
2020-04-23 16:17:38 +12:00
818171cc2c Make default completion mode OS dependent (#1636) 2020-04-23 10:53:40 +12:00
b3c623396f Fix to-csv command (#1635)
* Fix --headerless option of to-csv and to-tsv

Before to-csv --headerless split the "headerfull" output on lines,
skipped the first, and then concatenated them together. That meant
that all remaining output would be put on the same line, without
any delimiter, making it unusable. Now we replace the range of the
first line with the empty string, maintaining the rest of the
output unchanged.

* Remove extra space in indentation of to_delimited_data

* Add --separator <string> argument to to-csv

This functionality was already present, but wasn't exposed
to the command.
2020-04-23 05:08:53 +12:00
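
The gist of the --headerless fix above can be sketched in a few lines (hypothetical helper, not the actual to_delimited_data code): drop only the header line and leave the remaining output, including its line delimiters, untouched.

    // Hypothetical: remove the first (header) line, keep everything else as-is.
    fn strip_header_line(output: &str) -> &str {
        match output.find('\n') {
            Some(idx) => &output[idx + 1..],
            None => "",
        }
    }

    fn main() {
        let csv = "name,age\nalice,30\nbob,25\n";
        assert_eq!(strip_header_line(csv), "alice,30\nbob,25\n");
    }
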
88f06c81b2 Update Cargo.toml 2020-04-22 06:34:32 +12:00
e6b315f05b Bump ichwh to version 0.3.4 (#1627)
Fixes #1626 and [this ichwh issue](https://gitlab.com/avandesa/ichwh-rs/-/issues/3)
2020-04-22 05:37:14 +12:00
01ef6b0732 Add config to disable table index column (#1623) 2020-04-21 18:09:22 +12:00
c7e11a5a28 bump to 0.13.0 (#1625) 2020-04-21 17:01:03 +12:00
ce0231049e Fix the panic running a bad alias (#1624) 2020-04-21 16:11:34 +12:00
0f7b270740 Add exit code verification (#1622) 2020-04-21 15:14:18 +12:00
72cf57dd99 Add booleans and fix semicolon shortcircuit (#1620) 2020-04-21 12:30:01 +12:00
e4fdb36511 External vars (#1615)
* fix empty table and missing spans

* wip

* WIP

* WIP

* working version with vars

* tidying

* WIP

* Fix external quoting issue
2020-04-21 09:45:11 +12:00
2ffb14c7d0 fix empty table and missing spans (#1614) 2020-04-20 19:44:19 +12:00
eec94e4016 Semicolon (#1613)
* WIP on blocks

* Getting further

* add some tests
2020-04-20 18:41:51 +12:00
6412bfd58d Move alias arguments to optional arguments (#1612)
This will allow people to use more variables to capture more, if necessary, without having to implement required/optional/rest
2020-04-20 16:04:40 +12:00
522a828687 Move external closer to internal (#1611)
* Refactor InputStream and affected commands.

First, making `values` private and leaning on the `Stream` implementation makes
consumes of `InputStream` less likely to have to change in the future, if we
change what an `InputStream` is internally.

Second, we're dropping `Option<InputStream>` as the input to pipelines,
internals, and externals. Instead, `InputStream.is_empty` can be used to check
for "emptiness". Empty streams are typically only ever used as the first input
to a pipeline.

* Add run_external internal command.

We want to push external commands closer to internal commands, eventually
eliminating the concept of "external" completely. This means we can consolidate
a couple of things:

- Variable evaluation (for example, `$it`, `$nu`, alias vars)
- Behaviour of whole stream vs per-item external execution

It should also make it easier for us to start introducing argument signatures
for external commands,

* Update run_external.rs

* Update run_external.rs

* Update run_external.rs

* Update run_external.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-04-20 15:30:44 +12:00
6b8c6dec0e Bump ichwh minimum version to 0.3.3 (#1607)
The new version includes a fix for [this issue]

[this issue]: https://gitlab.com/avandesa/ichwh-rs/-/issues/2
2020-04-19 13:38:14 +12:00
2b0212880e Simplify cp command and allow multiple recursive copying (#1580)
* Better message error

* Use custom canonicalize in FileStructure build

* Better glob error in ls

* Use custom canonicalize, remove some duplicate code in cd.

* Enable recursive copying with patterns.

* Change test to fit new error message

* Test recursive with glob pattern

* Show that not matches were found in cp

* Fix typo in message error

* Change old canonicalize usage, follow newest changes
2020-04-19 11:05:24 +12:00
a16a91ede8 Fix precedence parsing of parens. Limit use (#1606) 2020-04-19 06:39:06 +12:00
c2a9bc3bf4 Add better docs to parser (#1604) 2020-04-19 05:30:40 +12:00
e5a79d09df Fix the versions up a bit to prevent breakage (#1602)
* Fix the versions up a bit to prevent breakage

* fix up the quickcheck test to not fail on parse error
2020-04-18 15:31:57 +12:00
7974e09eeb Math operators (#1601)
* Add some math operations

* WIP for adding compound expressions

* precedence parsing

* paren expressions

* better lhs handling

* add compound comparisons and shorthand lefthand parsing

* Add or comparison and shorthand paths
2020-04-18 13:50:58 +12:00
52d2d2b888 A random set of fixes (#1600) 2020-04-17 18:19:49 +12:00
ee778d2b03 WIP fix for the error bubbling (#1597) 2020-04-16 16:25:24 +12:00
928188b18e Redesign custom canonicalize, enable multiple dots expansion earlier. (#1588)
* Expand n dots early where tilde was also expanded.

* Remove normalize, not needed.
New function absolutize doesn't follow links nor check existence.
Renamed canonicalize_existing to canonicalize, works as expected.

* Remove normalize usages, change canonicalize.

* Treat strings as paths
2020-04-16 11:29:22 +12:00
59d516064c Add alias support to scripts and -c (#1593) 2020-04-16 05:50:35 +12:00
bd5836e25d Aliases (#1589)
* WIP getting scopes right

* finish adding initial support

* Finish with alias and add startup commands
2020-04-15 17:43:23 +12:00
e3da037b80 Canonical expr block (#1584)
* Add the canonical form for an expression block

* Remove commented out code
2020-04-14 06:59:33 +12:00
08a09e2273 Pipeline blocks (#1579)
* Making Commands match what UntaggedValue needs

* WIP

* WIP

* WIP

* Moved to expressions for conditions

* Add 'each' command to use command blocks

* More cleanup

* Add test for 'each'

* Instead use an expression block
2020-04-13 19:59:57 +12:00
85d6b24be3 Add more cmd builtins (#1583) 2020-04-13 13:46:59 +12:00
ed583bd79b Bump to latest ichwh (#1582) 2020-04-13 08:01:23 +12:00
e0fc09ac52 move rustyline to 6.1.1 to fix windows crash (#1581) 2020-04-13 07:01:44 +12:00
38b2846024 Split canonicalize function in two for missing and existing behavior (#1576)
* Split allow missing logic in two functions

* Replace use of old canonicalize
2020-04-12 20:33:38 +12:00
57c62de66f Remove erronous GPL license header (#1578) 2020-04-12 20:16:50 +12:00
dd4935fb23 Add quickcheck (#1574)
* Move uptime to being a duration value

* Adds our first quickcheck test
2020-04-12 07:05:59 +12:00
18dd009ca8 Unified path expansion under new module and better canonicalize (#1571)
* New 'path' module under nu-cli.
Added normalize and canonicalize method.
Added some unit tests.

* Replace old usages of normalize and canonicalize.

* Fix reading symlinks and existence logic.

* Better explained
2020-04-12 07:05:29 +12:00
c0dda36217 Update CONTRIBUTING.md 2020-04-12 06:54:16 +12:00
75b72f844e Create CONTRIBUTING.md (#998)
* Create CONTRIBUTING.md

* mention gitpod

* Update CONTRIBUTING.md

* mention community supports

* fix capitalization issues
2020-04-12 06:52:53 +12:00
fbddc12c02 Move uptime to being a duration value (#1573) 2020-04-11 19:40:56 +12:00
8e7e8c17e1 Make trash support optional (#1572) 2020-04-11 18:53:53 +12:00
8ac9d781fd Remove source text where not needed (#1567) 2020-04-10 19:56:48 +12:00
c86cf31aac some minor improvements and removing dead code (#1563) 2020-04-10 07:48:10 +12:00
2c513d1883 More dep bumps (#1562) 2020-04-09 10:28:20 +12:00
04702530a3 Bump a lot of deps (#1560) 2020-04-07 19:51:17 +12:00
c9f424977e actually bump version (#1559) 2020-04-07 07:18:47 +12:00
183c8407de fix nu variable. tweak shells (#1558) 2020-04-07 05:30:54 +12:00
d0618b0b32 Enable the use of multiple dots in FS Shell commands (#1547)
Every dot after `..` means another parent directory.
2020-04-06 07:28:56 -04:00
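
The rule above ("every dot after `..` means another parent directory") amounts to an expansion like the following hypothetical sketch (the real logic lives in the nu-cli path-expansion code):

    // Hypothetical n-dots expansion: "..." -> "../..", "...." -> "../../.."
    fn expand_ndots(arg: &str) -> String {
        if arg.len() > 2 && arg.chars().all(|c| c == '.') {
            vec![".."; arg.len() - 1].join("/")
        } else {
            arg.to_string()
        }
    }

    fn main() {
        assert_eq!(expand_ndots("..."), "../..");
        assert_eq!(expand_ndots("...."), "../../..");
        assert_eq!(expand_ndots("./src"), "./src"); // anything else is left alone
    }
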
c4daa2e40f Add experimental new parser (#1554)
Move to an experimental new parser
2020-04-06 19:16:14 +12:00
0a198b9bd0 Have FilesystemShell#rm correctly delete symlinks (#1550) 2020-04-05 17:17:52 -04:00
6a604491f5 ci(docker-publish): force remove non-executable (#1540) 2020-04-02 04:20:18 +13:00
791f7dd9c3 Bump to 0.12.0 (#1538) 2020-04-01 06:25:21 +13:00
a4c1b092ba Add configurations for table headers (#1537)
* Add configurations for table headers

* lint

Co-authored-by: Amanita-Muscaria <nope>
2020-03-31 12:19:48 +13:00
6e71c1008d Change get to remove blanks (#1534)
Remove blank values when getting a column of values
2020-03-30 15:36:21 +13:00
906d0b920f A few improvements to du implementation: (#1533)
1. Fixed a bug where `--all` wasn't showing files at the root directories.
2. More use of `Result`'s `map` and `map_err` methods.
3. Making tables be homogeneous so one can, for example, `get directories`.
2020-03-29 21:16:09 -04:00
efbf4f48c6 Fix poor message for executable that user doesn't have permissi… (#1535)
Previously, if the user didn't have the appropriate permissions to execute the
binary/script, they would see "command not found", which is confusing.

This commit eliminates the `which` crate in favour of `ichwh`, which deals
better with permissions by not dealing with them at all! This is closer to the
behaviour of `which` in many shells. Permission checks are then left up to the
caller to deal with.
2020-03-29 21:15:55 -04:00
2ddab3e8ce Some small improvements to du readability.
Mostly, making more use of `map` and `map_err` in `Result`. One benefit
is that at least one location had duplicated logic for how to map the
error, which is no longer the case after this commit.
2020-03-29 17:03:01 -04:00
35dc7438a5 Make use of interruptible stream in various places 2020-03-29 17:03:01 -04:00
2a54ee0c54 Introduce InterruptibleStream type.
An interruptible stream can query an `AtomicBool`. If that bool is true,
the stream will no longer produce any values.

Also introducing the `Interruptible` trait, which extends any `Stream`
with the `interruptible` function, to simplify the construction and
allow chaining.
2020-03-29 17:03:01 -04:00
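
The idea reads roughly like the following sketch, assuming the `futures` crate (illustrative only, not the actual InterruptibleStream/Interruptible code): wrap any stream so it stops yielding once a shared flag, for example one set by a ctrl-c handler, becomes true.

    use std::sync::atomic::{AtomicBool, Ordering};
    use std::sync::Arc;
    use futures::stream::{Stream, StreamExt};

    // Hypothetical wrapper: stop producing values once the interrupt flag is set.
    fn interruptible<S: Stream>(input: S, ctrl_c: Arc<AtomicBool>) -> impl Stream<Item = S::Item> {
        input.take_while(move |_| {
            let interrupted = ctrl_c.load(Ordering::SeqCst);
            futures::future::ready(!interrupted)
        })
    }

    fn main() {
        let ctrl_c = Arc::new(AtomicBool::new(false));
        let nums = futures::stream::iter(1..=5);
        let collected: Vec<_> =
            futures::executor::block_on(interruptible(nums, ctrl_c).collect::<Vec<_>>());
        assert_eq!(collected, vec![1, 2, 3, 4, 5]); // the flag was never set
    }
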
cad2741e9e Split input and output streams into separate modules 2020-03-29 17:03:01 -04:00
ae5f3c8210 WIP: 1486/first row as headers (#1530)
* headers plugin

* Remove plugin

* Add non-functioning headers command

* Add ability to extract headers from first row

* Refactor header extraction

* Rebuild indexmap with proper headers

* Rebuild result properly

* Compiling, probably wrapped too much?

* Refactoring

* Deal with case of empty header cell

* Deal with case of empty header cell

* Fix formatting

* Fix linting, attempt 2.

* Move whole_stream_command(Headers) to more appropriate section

* ... more linting

* Return Err(ShellError...) instead of panic, yield each row instead of entire table

* Insert Column[index] if no header info is found.

* Update error description

* Add initial test

* Add tests for headers command

* Lint test cases in headers

* Change ShellError for headers, Add sample_headers file to utils.rs

* Add empty sheet to test file

* Revert "Add empty sheet to test file"

This reverts commit a4bf38a31d.

* Show error message when given empty table
2020-03-29 15:05:57 +13:00
a5e97ca549 Respect CARGO_TARGET_DIR when set (#1528)
This makes the `binaries` function respect the `CARGO_TARGET_DIR` environment variable when set. If it's not present it falls back to the regular target directory used by Cargo.
2020-03-27 17:13:59 -04:00
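
That fallback is small enough to show inline; a hypothetical sketch of the lookup (the real change is inside the test-support `binaries` function):

    use std::env;
    use std::path::PathBuf;

    // Prefer CARGO_TARGET_DIR when it is set, otherwise fall back to Cargo's
    // default ./target directory.
    fn target_dir() -> PathBuf {
        env::var_os("CARGO_TARGET_DIR")
            .map(PathBuf::from)
            .unwrap_or_else(|| PathBuf::from("target"))
    }

    fn main() {
        println!("built binaries expected under {}", target_dir().join("debug").display());
    }
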
06f87cfbe8 Add support for removing multiple files at once (#1526) 2020-03-25 16:19:01 -04:00
d4e78c6f47 Improve the rotated row wrap (#1524) 2020-03-25 06:27:16 +13:00
3653400ebc testing fix to matrix to define all variables (#1522)
there is currently a bug with invalid syntax for some of the
docker build steps, and I think this is because there are build
variables in the matrix that are not defined. This PR will
attempt to resolve this issue by defining all missing variables
for each row in the matrix.

Signed-off-by: vsoch <vsochat@stanford.edu>
2020-03-24 16:20:39 +13:00
81a48d6d0e Fix '/' and '..' not being valid mv targets (#1519)
* Fix '/' and '..' not being valid mv targets

If `/` or `../` is specified as the destination for `mv`, it will fail with an error message saying it's not a valid destination. This fixes it to account for the fact that `Path::file_name` returns `None` when the file name evaluates to `/` or `..`. It will only take the slow(er) path if `Path::file_name` returns `None` in its initial check.

Fixes #1291

* Add test
2020-03-24 14:00:48 +13:00
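
The standard-library behaviour the fix above works around is easy to see in isolation (shown for illustration):

    use std::ffi::OsStr;
    use std::path::Path;

    fn main() {
        // Path::file_name() is None when the path is the root or terminates in "..",
        // which is why "/" and "../" needed special handling as mv destinations.
        assert_eq!(Path::new("/").file_name(), None);
        assert_eq!(Path::new("../").file_name(), None);
        assert_eq!(Path::new("docs/mv.md").file_name(), Some(OsStr::new("mv.md")));
    }
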
f030ab3f12 Add experimental auto-rotate (#1516) 2020-03-23 09:55:30 +13:00
0dc0c6a10a Add quickstart option to Docker section in README (#1515) 2020-03-23 09:18:50 +13:00
53c8185af3 Fixes the crash for ps --full in Windows (#1514)
* Fixes the crash for `ps --full` in Windows

* Update ps.rs
2020-03-23 08:28:02 +13:00
36b5d063c1 Simplify and improve listing for which. (#1510)
* Simplified implementation
* Show executables, even if the current user doesn't have permissions to
  execute them.
2020-03-22 09:11:39 -04:00
a7ec00a037 Add documentation for from-ics and from-vcf (#1509) 2020-03-21 14:50:13 +13:00
918822ae0d Fix numeric comparison with nothing (#1508) 2020-03-21 11:02:49 +13:00
ab5e24a0e7 WIP: Add vcard/ical support (#1504)
* Initial from-ical implementation

* Initial from-vcard implementation

* Rename from-ics and from-vcf for autoconvert

* Remove redundant clones

* Add from-vcf and from-ics tests

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-03-21 08:35:09 +13:00
b5ea522f0e Add a --full mode to ps (#1507)
* Add a --full mode to ps

* Use a slightly older heim
2020-03-20 20:53:49 +13:00
afa963fd50 Add is_dir check to auto-cd (#1506)
* Add markdown output

* Add is_dir() check
2020-03-20 16:57:36 +13:00
1e343ff00c Add markdown output (#1503) 2020-03-20 08:18:24 +13:00
21a543a901 Make sum plugin as internal command. (#1501) 2020-03-18 18:46:00 -05:00
390deb4ff7 Windows needs to remember auto-cd paths when changing drives (#1500)
* Windows needs to remember auto-cd paths when changing drives

* Windows needs to remember auto-cd paths when changing drives
2020-03-18 15:10:45 +13:00
1c4cb30d64 Add documentation for skip and skip-while (#1499) 2020-03-18 14:22:35 +13:00
1ec2ec72b5 Add automatic change directory (#1496)
* Allow automatic cd in cli mode

* Set correct priority for auto-cd and add test
2020-03-18 07:13:38 +13:00
0d244a9701 Open fails silently, fix #1493 (#1495)
* Fix #1493

The error was wrongfully discarded

* Run cargo fmt
2020-03-17 17:40:04 +13:00
b36d21e76f Infer types from regular delimited plain text unstructured files. (#1494)
* Infer types from regular delimited plain text unstructured files.

* Nothing resolves to an empty string.
2020-03-16 15:50:45 -05:00
d8c4565413 Csv errors (#1490)
* Add error message for csv parsing failures

* Add csv error prettyfier

* Improve readability of the error

Line 2: error is easier to understand than:
Line 2, error

* Remove unnecessary use of the format! macro

Replacing it with .to_string() fixes a clippy warning

* Improve consistency with JSON parsing errors
2020-03-16 12:32:02 -05:00
22ba4c2a2f Add svg support to to-html (#1492) 2020-03-16 20:19:18 +13:00
8d19b21b9f Custom canonicalize method on Filesystem Shell. (#1485)
* Custom canonicalize method for FilesystemShell.

* Use custom canonicalize method.
Fixed missing import.

* Move function body to already impl body.

* Create test that aims to resolve.
2020-03-16 19:28:18 +13:00
45a3afdc79 Update Cargo.toml 2020-03-16 06:12:28 +13:00
2d078849cb Add simple to-html output and bump version (#1487) 2020-03-15 16:04:44 +13:00
b6363f3ce1 Added new flag '--all/-a' for Ls, also refactor some code (#1483)
* Utility function to detect hidden folders.
Implemented for Unix and Windows.

* Rename function argument.

* Revert "Rename function argument."

This reverts commit e7ab70f0f0.

* Add flag '--all/-a' to Ls

* Rename function argument.

* Check if flag '--all/-a' is present and path is hidden.
Replace match with map_err for glob result.
Remove redundancy in stream body.
Included comments on new stream body.
Replace async_stream::stream with async_stream::try_stream.
Minor tweaks to is_empty_dir.
Fix and refactor is_hidden_dir.

* Fix "implicit" bool coerse

* Fixed clippy errors
2020-03-14 06:27:04 +13:00
5ca9e12b7f Fix whitespace and typos (#1481)
* Remove EOL whitespace in files other than docs

* Break paragraphs into lines

See http://rhodesmill.org/brandon/2012/one-sentence-per-line/ for the rationale

* Fix various typos

* Remove EOL whitespace in docs/commands/*.md
2020-03-14 06:23:41 +13:00
5b0b2f1ddd Fixes #1204: sys | get host.users displays the same user (#1480)
account twice while only one exists (macOS)

- renamed host.users to host.sessions
2020-03-12 14:01:55 +13:00
3afb53b8ce fix typo in calc command documentation (#1477)
minimumum -> minimum
2020-03-11 11:20:22 -04:00
b40d16310c More relaxed file modes for now. (#1476) 2020-03-11 13:19:15 +13:00
d3718d00db Merge shuffle nu plugin as core command. (#1475) 2020-03-10 17:00:08 -05:00
f716f61fc1 Update Cargo.lock 2020-03-11 08:53:26 +13:00
b2ce669791 Update Cargo.toml 2020-03-11 08:51:53 +13:00
cd155f63e1 Update Cargo.toml 2020-03-11 08:51:17 +13:00
9eaa6877f3 Update Cargo.toml 2020-03-11 08:50:51 +13:00
a6b6afbca9 Update Cargo.toml 2020-03-11 08:08:13 +13:00
62666bebc9 Bump to 0.11.0 (#1474) 2020-03-11 06:34:19 +13:00
d1fcce0cd3 Fixes #1427: Prints help message with -h switch (#1454)
For some commands like `which` -h flag would trigger an error asking for
missing required parameters instead of printing the help message as it
does with --help. This commit adds a check in the command parser to
avoid that.
2020-03-11 05:59:50 +13:00
a2443fbe02 Remove unused parsing logic. (#1473)
* Remove unused parsing logic.

* Run tokens iteration tests baseline.

* Pass lint.

* lifetimes can be elided without being explicit.
2020-03-10 04:31:42 -05:00
db16b56fe1 Columnpath support when passing fields for formatting. (#1472) 2020-03-10 01:55:03 -05:00
54bf671a50 Fix deleting / showing ls named pipes and other fs objects no… (#1461)
* Fix deleting named pipes
* Use std::os::unix::fs::FileTypeExt to show correct type for unix-specific fs objects; Fix formatting

Co-authored-by: Linards Kalvāns <linards.kalvans@twino.eu>
2020-03-09 09:02:53 -04:00
755d0e648b Eliminate some compiler warnings (#1468)
- Unnecessary parentheses
- Deprecated `description()` method
2020-03-09 08:19:07 +13:00
e440d8c939 Bump some deps (#1467) 2020-03-09 08:18:44 +13:00
01dd358a18 Don't emit a newline in autoview. (#1466)
The extra newline character makes it hard to use nu as part of an
external processing pipeline, since the extra character could taint the
results. For example:

```
$ nu -c 'echo test | xxd'
00000000: 7465 7374                                test
```

versus

```
nu -c 'echo test' | xxd
00000000: 7465 7374 0a                             test.
```
2020-03-09 08:18:24 +13:00
50fb97f6b7 Merge env into $nu and simplify table/get (#1463) 2020-03-08 18:33:30 +13:00
ebf139f5e5 Auto-detect string / binary in save command (#1459)
* Auto-detect string / binary in save command

* Linter
2020-03-08 07:33:29 +13:00
8925ca5da3 Move to bytes/string hybrid codec (#1457)
* WIP: move to bytes codec

* Progress on adding collect helpers

* Progress on adding collect helpers

* Add in line splitting back to lines

* Lines outputting line primitives

* Close to ready?

* Finish fixing lines

* clippy fixes

* fmt fixes

* removed unused code

* Cleanup a few bits

* Cleanup a few bits

* Cleanup a few more bits

* Fix failing test with corrected test case
2020-03-07 05:06:39 +13:00
287652573b Fix and refactor cd for Filesystem Shell. (#1452)
* Fix and refactor cd for Filesystem Shell.
Reorder check conditions, don't check existence twice.
If building for unix check exec bit on folder.

* Import PermissionsExt only on unix target.

* It seems that this is the correct way?
2020-03-06 20:13:47 +13:00
db24ad8f36 Add --num parameter to limit the number of output lines (#1455)
Add `--num` parameter to limit the number of returned elements
2020-03-05 05:26:46 -05:00
f88674f353 Nu internals are logged under nu filter. (#1451) 2020-03-05 05:18:53 -05:00
59cb0ba381 Color commands appropriately. (#1453) 2020-03-04 23:22:42 -05:00
c4cfab5e16 Make feature options available downstream to nu-cli subcrate. (#1450) 2020-03-04 15:31:12 -05:00
b2c5af457e Move most of the root package into a subcrate. (#1445)
This improves incremental build time when working on what was previously
the root package. For example, previously all plugins would be rebuilt
with a change to `src/commands/classified/external.rs`, but now only
`nu-cli` will have to be rebuilt (and anything that depends on it).
2020-03-04 13:58:20 -05:00
c731a5b628 Columns can be renamed. (#1447) 2020-03-03 16:01:24 -05:00
f97f9d4af3 Update deps lockfile (#1446)
Update deps lockfile
2020-03-03 15:34:22 -05:00
ed7d3fed66 Add shuffle plugin (#1443)
* Add shuffle plugin

see #1437

* Change plugin to integrate into nu structure and build system
2020-03-03 08:44:12 +13:00
7304d06c0b Use threads to avoid blocking reads/writes in externals. (#1440)
In particular, one thing that we can't (properly) do before this commit
is consuming an infinite input stream. For example:

```
yes | grep y | head -n10
```

will give 10 "y"s in most shells, but blocks indefinitely in nu. This PR
resolves that by doing blocking I/O in threads, and reducing the `await`
calls we currently have in our pipeline code.
2020-03-02 06:19:09 +13:00
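A rough sketch of the idea, assuming only the standard library and a Unix-like environment where `yes` is available; this is not the nushell implementation, just an illustration of moving blocking reads onto a thread so a consumer can stop early, as `head -n10` does:

```
use std::io::{BufRead, BufReader};
use std::process::{Command, Stdio};
use std::sync::mpsc;
use std::thread;

fn main() -> std::io::Result<()> {
    let mut child = Command::new("yes").stdout(Stdio::piped()).spawn()?;
    let stdout = child.stdout.take().expect("stdout was piped");
    let (tx, rx) = mpsc::channel();

    // All blocking reads happen on this thread, so the rest of the pipeline never stalls.
    thread::spawn(move || {
        for line in BufReader::new(stdout).lines().flatten() {
            if tx.send(line).is_err() {
                break; // the receiver is gone; stop reading
            }
        }
    });

    // The consumer takes only what it needs, like `head -n10`.
    for line in rx.iter().take(10) {
        println!("{}", line);
    }
    drop(rx);
    let _ = child.kill();
    Ok(())
}
```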
ca615d9389 Bump to 0.10.1 (#1442) 2020-03-01 20:59:13 +13:00
6d096206b6 Add support for compound shorthand flags (#1414)
* Break multicharacter shorthand flags into single character flags

* Remove shorthand flag test
2020-03-01 13:20:42 +13:00
2a8cb24309 Add support for downloading unsupported mime types (#1441) 2020-03-01 13:14:36 +13:00
8d38743e27 Add docs for debug (#1438)
* Add docs for `debug`

* Put debug docs in right folder
Also fixed minor spacing problem
2020-03-01 04:09:28 +13:00
eabfa2de54 Let ls ignore permission errors (#1435)
* Create a function to create an empty directory entry

* Print an empty directory entry if permission is denied

* Fix rustfmt whitespace issues.

* Made metadata optional for `dir_entry_dict`.

Removed `empty_dir_entry_dict` as it's not needed anymore.
2020-02-29 14:33:52 +13:00
a86a0abb90 Plugin documentation (#1431)
* Add very basic documentation. Need to play with rest of the api to figure out what it does

* Add some documentation to more of the Plugin API methods

* fmt
2020-02-24 15:28:46 +13:00
adcda450d5 Update LICENSE 2020-02-21 10:49:46 +13:00
147b9d4436 Add Better-TOML (#1417) 2020-02-19 16:59:42 -05:00
c43a58d9d6 Fix incorrect display for zero-size files (#1422) 2020-02-19 09:57:58 -05:00
e38442782e Command documentation for du (#1416) 2020-02-19 09:55:22 +13:00
b98f893217 add a touch command (#1399) 2020-02-19 09:54:32 +13:00
bd6556eee1 Use proper file extension for uniq command docs (#1411) 2020-02-18 09:37:46 -05:00
18d988d4c8 Restrict short-hand flag detection to exact match. (#1406) 2020-02-18 01:58:30 -05:00
0f7c723672 Bump version to 0.10.0 (#1403) 2020-02-18 16:56:09 +13:00
afce2fd0f9 Revert "Display rows in the same table regardless of their column order given they are equal. (#1392)" (#1401)
This reverts commit 4fd9974204.
2020-02-17 17:34:37 -08:00
4fd9974204 Display rows in the same table regardless of their column order given they are equal. (#1392) 2020-02-16 20:35:01 -05:00
71615f77a7 Fix minor typo in calc command error (#1395) 2020-02-16 16:02:41 -05:00
9bc5022c9c Force a \n at the end of a stdout stream (#1391)
* Force a \n at the end of a stdout stream

* clippy
2020-02-14 18:15:32 -08:00
552848b8b9 Leave raw mode correctly. (#1388)
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-02-14 17:31:21 -05:00
8ae8ebd107 Add support for multiline script files (#1386)
* Add support for multiline script files

* clippy
2020-02-13 21:24:18 -08:00
473e9f9422 Tiny improvement to sys (#1385) 2020-02-13 08:33:55 -08:00
96985aa692 Fix invalid shorthand flag (#1384) 2020-02-13 07:47:34 -08:00
0961da406d Add string to datetime to str plugin (#1381)
* Add string to datetime to str plugin

* Test string to date/time conversion
2020-02-13 07:47:04 -08:00
84927d52b5 Refuse internal command execution given unexpected arguments. (#1383) 2020-02-13 02:34:43 -05:00
73312b506f Finer grained parsing and coloring command tail. (#1382) 2020-02-12 20:20:19 -05:00
c1bec3b443 Return error on a divide by zero (#1376)
Return error on a divide by zero
2020-02-12 08:38:04 -05:00
c0be02a434 Short-hand flags (#1378)
* typo fixes

* Change signature to take in short-hand flags

* update help information

* Parse short-hand flags as their long counterparts

* lints

* Modified a couple tests to use shorthand flags
2020-02-11 18:24:31 -08:00
2ab8d035e6 External it and nu variable column path fetch support. (#1379) 2020-02-11 18:25:56 -05:00
24094acee9 Allow switch flags anywhere in the pipeline. (#1375) 2020-02-11 03:49:00 -05:00
0b2be52bb5 Only add quotes if not in Windows (which adds its own?) (#1374)
* Only add quotes if not in Windows (which adds its own?)

* Only add quotes if not in Windows (which adds its own?)
2020-02-10 23:07:44 -08:00
6a371802b4 Add block size to du (#1341)
* Add block size to du

* Change blocks to physical size

* Use path instead of strings for file/directory names

* Why don't I just use paths instead of strings anyway?

* shorten physical size and apparent size to physical and apparent resp.
2020-02-10 12:32:18 -08:00
29ccb9f5cd Ensure stable plugins get installed. (#1373) 2020-02-10 15:32:10 -05:00
20ab125861 bump version (#1370) 2020-02-10 09:18:00 -08:00
fb532f3f4e Prototype shebang support (#1368)
* Add shebang support to nu.

* Move test file

* Add test for scripts

Co-authored-by: Jason Gedge <jason.gedge@shopify.com>
2020-02-10 08:49:45 -08:00
a29d52158e Do not panic when failing to decode lines from external stdout (#1364) 2020-02-10 07:37:48 -08:00
dc50e61f26 Switch stdin redirect to manual. Add test (#1367) 2020-02-09 22:55:07 -08:00
a2668e3327 Add some nu_source docs for meta.rs (#1366)
* Add some docs for meta.rs

* add better explanation for Span merging

* Add some doc tests - not sure how to get them to run

* get rid of doc comments for the temporary method

* add doc test for is_unknown

* fmt
2020-02-09 18:08:14 -08:00
e606407d79 Add error codes to -c (#1361) 2020-02-08 20:04:53 -08:00
5f4fae5b06 Pipeline sink refactor (#1359)
* Refactor pipeline ahead of block changes. Add '-c' commandline option

* Update pipelining an error value

* Fmt

* Clippy

* Add stdin redirect for -c flag

* Add stdin redirect for -c flag
2020-02-08 18:24:33 -08:00
3687603799 Only spawn external once when no $it argument (#1358) 2020-02-08 17:57:05 -08:00
643b532537 Fixed mv not throwing error when the source path was invalid (#1351)
* Fixed mv not throwing error when the source path was invalid

* Fixed failing test

* Fixed another lint error

* Fix $PATH conflicts in .gitpod.Dockerfile (#1349)

- Use the correct user for gitpod Dockerfile.
- Remove unneeded packages (curl, rustc) from gitpod Dockerfile.

* Added test to check for the error

* Fixed linting error

* Fixed mv not moving files on Windows. (#1342)

Move files correctly in windows.

* Fixed mv not throwing error when the source path was invalid

* Fixed failing test

* Fixed another lint error

* Added test to check for the error

* Fixed linting error

* Changed error message

* Typo and fixed test

Co-authored-by: Sean Hellum <seanhellum45@gmail.com>
2020-02-07 12:40:48 -05:00
ed86b1fbe8 Fixed mv not moving files on Windows. (#1342)
Move files correctly in windows.
2020-02-07 11:24:01 -05:00
44a114111e Fix $PATH conflicts in .gitpod.Dockerfile (#1349)
- Use the correct user for gitpod Dockerfile.
- Remove unneeded packages (curl, rustc) from gitpod Dockerfile.
2020-02-06 15:20:18 -05:00
812a76d588 Update more futures-preview to futures (#1346) 2020-02-05 20:28:42 -08:00
e3be849c2a Futures v0.3 upgrade (#1344)
* Upgrade futures, async-stream, and futures_codec

These were the last three dependencies on futures-preview. `nu` itself
is now fully dependent on `futures@0.3`, as opposed to `futures-preview`
alpha.

Because the update to `futures` from `0.3.0-alpha.19` to `0.3.0` removed
the `Stream` implementation of `VecDeque` ([changelog][changelog]), most
commands that convert a `VecDeque` to an `OutputStream` broke and had to
be fixed.

The current solution is to now convert `VecDeque`s to a `Stream` via
`futures::stream::iter`. However, it may be useful for `futures` to
create an `IntoStream` trait, implemented on the `std::collections` types (or
really any `IntoIterator`). If something like this happens, it may be
worthwhile to update the trait implementations on `OutputStream` and
refactor these commands again.

While upgrading `futures_codec`, we remove a custom implementation of
`LinesCodec`, as one has been added to the library. There's also a small
refactor to make the stream output more idiomatic.

[changelog]: https://github.com/rust-lang/futures-rs/blob/master/CHANGELOG.md#030---2019-11-5

* Upgrade sys & ps plugin dependencies

They were previously dependent on `futures-preview`, and `nu_plugin_ps`
was dependent on an old version of `futures-timer`.

* Remove dependency on futures-timer from nu

* Update Cargo.lock

* Fix formatting

* Revert fmt regressions

CI is still on 1.40.0, but the latest rustfmt v1.41.0 has changes to the
`val @ pattern` syntax, causing the linting job to fail.

* Fix clippy warnings
2020-02-05 19:46:48 -08:00
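A minimal sketch of the conversion the commit above settles on, assuming `futures = "0.3"`; the surrounding setup is illustrative, not nushell's `OutputStream` code:

```
use std::collections::VecDeque;

use futures::executor::block_on;
use futures::stream::{self, StreamExt};

fn main() {
    // futures 0.3 dropped `impl Stream for VecDeque`, so wrap the queue explicitly.
    let queue: VecDeque<i32> = (1..=3).collect();
    let stream = stream::iter(queue); // any IntoIterator works here

    let collected: Vec<i32> = block_on(stream.collect());
    assert_eq!(collected, vec![1, 2, 3]);
}
```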
ba1b67c072 Attempt rustup update on each PR (#1345)
* Attempt update on each PR

* Update fmt
2020-02-05 19:28:49 -08:00
fa910b95b7 Have from-ssv not fail for header-only inputs (#1334) 2020-02-05 11:54:14 -08:00
427bde83f7 Allow cp to overwrite existing files (#1339) 2020-02-05 01:54:05 -05:00
7a0bc6bc46 Opt-out unused heim features from sys/ps plugins. (#1335) 2020-02-04 01:51:14 -05:00
c6da56949c Add support for plugin names containing numbers (#1321)
* Add ability to have numbers in plugin name. Plugin must start with alphabetic char

* remove the first character as alphabetic requirement

* Update cli.rs

Going ahead and changing to plus to prevent issue notryanb found

* Update cli.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-02-01 22:08:38 -08:00
5b398d2ed2 Adding cross-platform kill command (#1326)
* Adding kill command, unclean code

* Removing old comments

* Added quiet option, supports variable number of ids

* Made it per_item_command, calling commands directly without the shell
2020-02-01 10:46:28 -08:00
dcdfa2a866 Improve tests and labeling in FilesystemShell (#1305)
Additional `ls` command tests and better FilesystemShell error and label messages.
2020-02-01 03:34:34 -05:00
9474fa1ea5 Improved code in du command (#1320)
Made the code a little easier to read
2020-02-01 03:32:06 -05:00
49a1385543 Make tests work from directory names with spaces (#1325) 2020-01-31 22:12:56 -08:00
6427ea2331 Update Cargo.lock for ichwh fix (#1312)
`ichwh@0.3.1` fixes a bug that causes path searches to fail. We update
`Cargo.lock` to fix this.

Resolves #1207
2020-01-31 22:11:42 -08:00
3610baa227 Default plugins are independent and called from Nu. (#1322) 2020-01-31 17:45:33 -05:00
4e201d20ca Paths from Nu config take priority over external paths. (#1319) 2020-01-31 14:19:47 -05:00
1fa21ff056 Exclude images to reduce crate by 3MB (#1316)
Maybe there are more candidates for exclusion, but 'images/'
seemed obviously unnecessary.

Something I started realizing lately is that cargo puts most
of the root directory into the crate archive, causing huge
crates to appear on crates.io.

Now that I am in China, I do seem to notice every kilobyte.
2020-01-31 10:38:26 -05:00
0bbd12e37f Improve the default help message (#1313) 2020-01-30 20:13:14 -08:00
7df8fdfb28 Rename the now-deprecated add command docs into insert comm… (#1307) 2020-01-30 08:15:20 -05:00
6a39cd8546 Add docs for the calc command (#1290) 2020-01-29 08:34:54 -05:00
dc3370b103 Make a calc command (#1280) 2020-01-29 08:34:36 -05:00
ac5ad45783 Pretty Nu print default, pretty print regular secondary as raw flag. (#1302) 2020-01-29 02:46:54 -05:00
8ef5c47515 Update cargo flags (#1295)
* Update cargo flags

See https://github.com/nushell/nushell.github.io/issues/29

* Update to suggested flag
2020-01-28 22:44:49 -08:00
5b19bebe7d Isolate environment state changes into Host. (#1296)
Moves the state changes for setting and removing environment variables
into the context's host as opposed to calling `std::env::*` directly
from anywhere else.

Introduced FakeHost to prevent environment state changes leaking
between unit tests and causing random test failures.
2020-01-29 00:40:06 -05:00
2c529cd849 Fix bug where --with-symlink-targets would not display the targets column (#1300) 2020-01-28 21:36:20 -08:00
407f36af29 Remove unused dep (#1298) 2020-01-29 16:44:03 +13:00
763fcbc137 Bump to 0.9.0 (#1297) 2020-01-29 15:17:02 +13:00
7061af712e ls will return an error if no files/folders match the path/pattern (#1286)
* `ls` will return an error if no files/folders match the path/pattern

* Revert changes to src/data/files.rs

* Add a name_only flag to dir_entry_dict

Add name_only flag to indicate if the caller only cares about filenames
or wants the whole path

* Update ls changes from feedback

* Little cleanup

* Resolve merge conflicts

* lints
2020-01-29 05:58:31 +13:00
9b4ba09c95 Nu env vars from config have higher priority. (#1294) 2020-01-28 02:10:15 -05:00
9ec6d0c90e Add --with-symlink-targets option for the ls command that displays a new column for the target files of symlinks (#1292)
* Add `--with-symlink-targets` option for the `ls` command that displays a new column for the target files of symlinks

* Fix clippy warning

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-01-28 19:48:41 +13:00
f20a4a42e8 Lets the expander look for tokens from the start. (#1293) 2020-01-28 01:03:28 -05:00
caa6830184 Baseline environment and configuration work. (#1287) 2020-01-27 22:13:22 -05:00
f8be1becf2 Updated rustyline to 6.0.0. Added completion_mode config (#1289)
* Updated rustyline to 6.0.0. Added completion_mode config

* Formatted completion_mode config
2020-01-27 16:41:17 +13:00
af51a0e6f0 Update motto 2020-01-27 16:32:02 +13:00
23d11d5e84 Nu source overview (#1282)
* add some notes into README for more elaboration

* rewrite the overview

* remove unused first line

* add last part about tracing and debugging

* change the wording to make it easier to read

* Add example of metadata system

* Add contact information as other helpful links
2020-01-27 15:55:02 +13:00
6da9e2aced Upgrade crossterm (#1288)
* WIP

* Finish porting to new crossterm

* Fmt
2020-01-27 15:51:46 +13:00
32dfb32741 Switch from subprocess crate to the builtin std::process (#1284)
* Switch from subprocess crate to the builtin std::process

* Update external.rs

* Update external.rs

* Update external.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-01-26 16:03:21 +13:00
d48f99cb0e compute directory sizes from contained files and directories (#1250)
* compute directory sizes from contained files and directories

* De-lint

* Revert "De-lint"

This reverts commit 9df9fc07d777014fef8f5749a84b4e52e1ee652a.

* Revert "compute directory sizes from contained files and directories"

This reverts commit d43583e9aa20438bd613f78a36e641c9fd48cae3.

* Nu du command

* Nu du for you

* Add async support

* Lints

* so much bug fixing
2020-01-26 15:43:29 +13:00
35359cbc22 Buffer tables until a timeout or threshold is met (#1283) 2020-01-26 09:09:51 +13:00
b52dbcc8ef Separate dissimilar tables into separate tables (#1281)
* Allow the table command to stream

* Next part of table view refactor
2020-01-26 07:10:20 +13:00
4429a75e17 Make ls show only the file name (#1276)
* Make ls show only the file name

* Refactor and remove unwraps

* Put functionality in separate flag
2020-01-26 05:20:33 +13:00
583f27dc41 Added attributes to from-xml command (#1272)
* Added attributes to from-xml command

* Added attributes as their own rows

* Removed unnecessary lifetime declarations

* from-xml now has children and attributes side by side

* Fixed tests and linting

* Fixed lint-problem
2020-01-26 05:16:40 +13:00
83db5c34c3 Add docs for the from-ods and from-xlsx commands (#1279) 2020-01-26 04:31:20 +13:00
cdbfdf282f Allow the table command to stream (#1278) 2020-01-25 16:13:12 +13:00
a5e1372bc2 RM error on bad filename (#1244)
* rm error on bad filename

* De-lint

* Fix error message in test
2020-01-25 08:16:41 +13:00
798a24eda5 Soften restrictions for external parameters (#1277)
* Soften restrictions for external parameters

* Add test
2020-01-25 08:14:49 +13:00
a2bb23d78c Update README.md 2020-01-25 06:51:24 +13:00
d38a63473b Improve shelling out (#1273)
Improvements to shelling out
2020-01-24 08:24:31 +13:00
2b37ae3e81 Switch to using subprocess::shell (#1264)
* Switch to using `shell`

Switch to using the shell for subprocess to enable more natural shelling out.

* Update external.rs

* This is a test with .shell() for external

* El pollo loco's PR

* co co co

* Attempt to fix windows

* Fmt

* Less is more?

Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2020-01-24 05:21:05 +13:00
bc5a969562 [Gitpod] Add some VSCode extensions (#1268)
VSCode extensions for productive work.
2020-01-23 00:51:08 -05:00
fe4ad5f77e Color named type help special case. (#1263)
Refactored out help named type as switch.
2020-01-22 19:36:48 -05:00
07191754bf Update ichwh to 3.0 (#1267)
The [newest update for ichwh][changes] introduced some breaking changes.
This PR bumps the version and refactors the `which` command to take
these into account.

[changes]: https://gitlab.com/avandesa/ichwh-rs/blob/master/CHANGELOG.md#030-2020-01-22
2020-01-23 12:26:49 +13:00
66bd331ba9 Make futures-timer a non-optional dependency (#1265)
Originally, it was only brought in with the `ps` feature enabled.
However, commit #ba7a17 started using the crate unconditionally in
`src/commands/classified/external.rs`, causing the build
to fail when built without the `ps` feature.

This commit fixes the problem by making it a non-optional dependency.
2020-01-23 10:56:29 +13:00
762c798670 It ls test setup rewrite. (#1260) 2020-01-21 22:56:12 -05:00
3c01526869 Test binaries no longer belong to stable or default features. (#1259) 2020-01-21 22:00:27 -05:00
7efb31a4e4 Restructure and streamline token expansion (#1123)
Restructure and streamline token expansion

The purpose of this commit is to streamline the token expansion code, by
removing aspects of the code that are no longer relevant, removing
pointless duplication, and eliminating the need to pass the same
arguments to `expand_syntax`.

The first big-picture change in this commit is that instead of a handful
of `expand_` functions, which take a TokensIterator and ExpandContext, a
smaller number of methods on the `TokensIterator` do the same job.

The second big-picture change in this commit is fully eliminating the
coloring traits, making coloring a responsibility of the base expansion
implementations. This also means that the coloring tracer is merged into
the expansion tracer, so you can follow a single expansion and see how
the expansion process produced colored tokens.

One side effect of this change is that the expander itself is marginally
more error-correcting. The error correction works by switching from
structured expansion to `BackoffColoringMode` when an unexpected token
is found, which guarantees that all spans of the source are colored, but
may not be the most optimal error recovery strategy.

That said, because `BackoffColoringMode` only extends as far as a
closing delimiter (`)`, `]`, `}`) or pipe (`|`), it does result in
a fairly granular correction strategy.

The current code still produces an `Err` (plus a complete list of
colored shapes) from the parsing process if any errors are encountered,
but this could easily be addressed now that the underlying expansion is
error-correcting.

This commit also colors any spans that are syntax errors in red, and
causes the parser to include some additional information about what
tokens were expected at any given point where an error was encountered,
so that completions and hinting could be more robust in the future.

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2020-01-21 17:45:03 -05:00
c8dd7838a8 Bump Pipeline images (#1255)
* Bump Pipeline images

* Update azure-pipelines.yml

* Update azure-pipelines.yml
2020-01-22 10:39:31 +13:00
3b57ee5dda Add internal clear command (#1249)
* Add clear.rs

* update

* update

* cross-platformify

* update

* fix

* format

* fix warnings

* update implementation

* remove return

* remove semicolon

* change from `.output()` to `.status()`

* format
2020-01-20 20:05:32 +13:00
fb977ab941 add automated setup badge and add .gitpod.yml patch (#1246)
* add automated setup badge and add .gitpod.yml patch

* Update .gitpod.yml
2020-01-20 14:40:04 +13:00
e059c74a06 Add support for primitive values to sort-by (#1241)
* Remove redundant clone

* Add support for primitive values to sort-by #1238
2020-01-20 08:08:36 +13:00
47d987d37f Add ctrl_c to RunnablePerItemContext. (#1239)
Also, this commit makes `ls` a per-item command.

A command that processes things item by item may still take some time to stream
out the results from a single item. For example, `ls` on a directory with a lot
of files could be interrupted in the middle of showing all of these files.
2020-01-19 15:25:07 +13:00
3abfefc025 More docs and random fixes (#1237) 2020-01-19 08:42:36 +13:00
a5c5b4e711 Add --help for commands (#1226)
* WIP --help works for PerItemCommands.

* De-linting

* Add more comments (#1228)

* Add some more docs

* More docs

* More docs

* More docs (#1229)

* Add some more docs

* More docs

* More docs

* Add more docs

* External commands: wrap values that contain spaces in quotes (#1214) (#1220)

* External commands: wrap values that contain spaces in quotes (#1214)

* Add fn's argument_contains_whitespace & add_quotes (#1214)

*  Fix formatting with cargo fmt

* Don't wrap argument in quotes when $it is already quoted (#1214)

* Implement --help for internal commands

* Externals now spawn independently. (#1230)

This commit changes the way we shell out externals when using the `"$it"` argument. Also pipes per row to an external's stdin if no `"$it"` argument is present for external commands. 

Further separation of logic (preparing the external's command arguments, getting the data for piping, emitting values, spawning processes) will give us a better picture of the lower-level details of external commands until we can find the right abstractions for making them more generic and unifying them within the pipeline-calling logic of Nu's internals and externals.

* Poll externals quicker. (#1231)

* WIP --help works for PerItemCommands.

* De-linting

* Implement --help for internal commands

* Make having --help the default

* Update test to include new default switch

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Koenraad Verheyden <mail@koenraadverheyden.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2020-01-18 11:46:18 +13:00
ba9cb753d5 Bump some of our dependencies (#1234) 2020-01-18 09:35:48 +13:00
ba7a1752db Poll externals quicker. (#1231) 2020-01-16 06:27:12 -05:00
29431e73c2 Externals now spawn independently. (#1230)
This commit changes the way we shell out externals when using the `"$it"` argument. Also pipes per row to an external's stdin if no `"$it"` argument is present for external commands. 

Further separation of logic (preparing the external's command arguments, getting the data for piping, emitting values, spawning processes) will give us a better picture of the lower-level details of external commands until we can find the right abstractions for making them more generic and unifying them within the pipeline-calling logic of Nu's internals and externals.
2020-01-16 04:05:53 -05:00
d29fe6f6de External commands: wrap values that contain spaces in quotes (#1214) (#1220)
* External commands: wrap values that contain spaces in quotes (#1214)

* Add fn's argument_contains_whitespace & add_quotes (#1214)

*  Fix formatting with cargo fmt

* Don't wrap argument in quotes when $it is already quoted (#1214)
2020-01-16 13:38:16 +13:00
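An illustrative sketch of the two helpers named in the commit above; the real signatures in nushell may differ, so treat this as an assumption-laden outline rather than the actual code:

```
// Hypothetical forms of the helpers mentioned in the commit.
fn argument_contains_whitespace(argument: &str) -> bool {
    argument.chars().any(|c| c.is_whitespace())
}

fn add_quotes(argument: &str) -> String {
    format!("\"{}\"", argument)
}

fn main() {
    let value = "My Documents";
    // Only quote when needed, and leave already-quoted $it values untouched.
    let arg = if argument_contains_whitespace(value) && !value.starts_with('"') {
        add_quotes(value)
    } else {
        value.to_string()
    };
    assert_eq!(arg, "\"My Documents\"");
}
```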
e2e9abab0a More docs (#1229)
* Add some more docs

* More docs

* More docs

* Add more docs
2020-01-16 07:32:46 +13:00
2956b0b087 Add more comments (#1228)
* Add some more docs

* More docs

* More docs
2020-01-16 05:28:31 +13:00
b32eceffb3 Add some comments (#1225) 2020-01-14 20:38:56 +13:00
3adf52b1c4 update .gitignore to exclude target directories in the crate directory (#1221) 2020-01-14 20:14:24 +13:00
78a644da2b Restrict Nu with a cleaned environment. (#1222) 2020-01-13 23:17:20 -05:00
98028433ad $it: add conversion from Int for external commands (#1218) 2020-01-13 18:57:44 -05:00
2ab5803f00 $it: add conversion from Path for external commands (#1210)
* $it: add conversion from Path for external commands (#1203)

* Replace PathBuf::to_str with to_string_lossy
2020-01-14 05:41:18 +13:00
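A small sketch of the swap described above, using only the standard library; the path is made up for illustration:

```
use std::path::PathBuf;

fn main() {
    let path = PathBuf::from("/tmp/données.txt"); // hypothetical path
    // `to_str()` returns None for non-UTF-8 paths; `to_string_lossy()` always
    // yields a usable string, replacing invalid bytes with U+FFFD.
    let arg = path.to_string_lossy().into_owned();
    println!("passing {:?} to the external command", arg);
}
```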
65980c7beb Revert 8cadc5a4 (#1211) 2020-01-13 19:38:58 +13:00
29fd8b55fb Keep dummies in default features for convenience. (#1212) 2020-01-13 01:17:56 -05:00
2f039b3abc Fix crash when attempting to enter help shell (#1201)
`enter help` would result in a crash
2020-01-13 17:27:00 +13:00
d3dae05714 Groundwork for coverage with Nu internals. (#1205) 2020-01-12 16:44:22 -05:00
5fd3191d91 Fix randomly failing test (#1200)
* Fix randomly failing test

* Fix randomly failing test
2020-01-13 06:03:28 +13:00
0dcd90cb8f Silence stdout for test runs. (#1198) 2020-01-12 04:14:10 -05:00
02d0a4107e A few ls improvements. New welcome message (#1195) 2020-01-12 09:49:20 +13:00
63885c4ee6 Change black to other colors (#1194) 2020-01-12 06:21:59 +13:00
147bfefd7e Sort ls case-insensitively by default (#1192) 2020-01-11 20:59:55 +13:00
60043df917 Allow ColumnPaths when picking tables. (#1191) 2020-01-11 01:45:09 -05:00
6d3a30772d Get error message improvements. (#1185)
More especific "get" command error messages + Test refactoring.
2020-01-10 10:44:24 -05:00
347f91ab53 Have internal/external/pipelines taken an optional InputStream. (#1182)
Primarily, this fixes an issue where we always open a stdin pipe for
external commands, which can break various interactive commands (e.g.,
editors).
2020-01-09 22:31:44 -08:00
5692a08e7f Update README.md 2020-01-10 09:40:30 +13:00
515a3b33f8 Thin-lines for tables for better rendering (#1181)
The thick lines are pretty subtle and some fonts have issues with it. Seems keeping the lines consistent works better across fonts.
2020-01-09 12:33:02 -08:00
c3e466e464 Make debug command always prettty-print (Resolves #1178) (#1180) 2020-01-09 11:24:21 -08:00
00c0327031 Support more Values to plain string. (#1169)
* Support more Values to plain string.

* Continue converting to delimited data for simple values.
2020-01-08 06:12:59 -05:00
7451414b9e Eliminate ClassifiedInputStream in favour of InputStream. (#1056) 2020-01-07 13:00:01 -08:00
41ebc6b42d Bump to 0.8.0 (#1166) 2020-01-07 20:08:31 +13:00
b574dc6365 Add the from-ods command (#1161)
* Put a sample_data.ods file for testing

This is a copy of the sample_data.xlsx file but in ods format

* Add the from-ods command

Most of the work was doing `rg xlsx` and then copy/paste with light editing

* Add tests for the from-ods command

* Fix failing test

The problem was improper filename sorting in the test `prepares_and_decorates_filesystem_source_files`
2020-01-07 19:35:00 +13:00
4af9e1de41 Resolves #750 (#1164)
Pick now produces an error when none of the columns are found
2020-01-07 17:06:48 +13:00
77d856fd53 Last unwraps (#1160)
* Work through most of the last unwraps

* Finish removing unwraps
2020-01-04 19:44:17 +13:00
6dceabf389 Isolate data processing helpers. (#1159)
Isolate data processing helpers. Remove unwraps and down to zero unwraps.
2020-01-03 23:00:39 -05:00
5919c6c433 Remove unwraps (#1153)
* Remove a batch of unwraps

* finish another batch
2020-01-04 10:11:21 +13:00
339a2de0eb More ununwraps (#1152)
* More ununwraps

* More ununwraps

* Update completer.rs

* Update completer.rs
2020-01-03 06:51:20 +13:00
3e3cb15f3d Yet more ununwraps (#1150) 2020-01-02 20:07:17 +13:00
5e31851070 A couple more (#1149) 2020-01-02 18:24:41 +13:00
0f626dd076 Another batch of un-unwrapping (#1148)
Another batch of un-unwrappings
2020-01-02 17:02:46 +13:00
aa577bf9bf Clean up some unwraps (#1147) 2020-01-02 09:45:32 +13:00
25298d35e4 Bump rustyline (#1146)
* Slightly improve new which command

* Bump rustyline
2020-01-02 06:54:25 +13:00
78016446dc Slightly improve new which command (#1145) 2020-01-01 20:47:25 +13:00
b304de8199 Rewrite which (#1144)
* Detect built-in commands passed as args to `which`

This expands the built-in `which` command to detect nushell commands
that may have the same name as a binary in the path.

* Allow which to interpret multiple arguments

Previously, it would discard any argument besides the first. This allows
`which` to process multiple arguments. It also makes the output a stream
of rows.

* Use map to build the output

* Add boolean column for builtins

* Use macros for entry creation shortcuts

* Process command args and use async_stream

In order to use `ichwh`, I'll need to use async_stream. But in order to
avoid lifetime errors with that, I have to process the command args
before using them. I'll admit I don't fully understand what is going on
with the `args.process(...)` function, but it works.

* Use `ichwh` for path searching

This commit transitions from `which` to `ichwh`. The path search is now
done asynchronously.

* Enable the `--all` flag on `which`

* Make `which` respect external commands

Escaped commands passed to which (e.g., `which "^ls"`) are now searched
before builtins.

* Fix clippy warnings

This commit resolves two warnings from clippy, in light of #1142.

* Update Cargo.lock to get new `ichwh` version

`ichwh@0.2.1` has support for local paths.

* Add documentation for command
2020-01-01 19:45:27 +13:00
72838cc083 Move to using clippy (#1142)
* Clippy fixes

* Finish converting to use clippy

* fix warnings in new master

* fix windows

* fix windows

Co-authored-by: Artem Vorotnikov <artem@vorotnikov.me>
2019-12-31 20:36:08 +13:00
8093612cac Allow moving in text with Ctrl+ArrowLeft, Ctrl+ArrowRight (#1141)
* Allow moving in text with Ctrl+ArrowLeft, Ctrl+ArrowRight

* Document changes

* Format
2019-12-31 17:06:36 +13:00
f37f29b441 Add uniq command (#1132)
* start playing with ways to use the uniq command

* WIP

* Got uniq working, but still need to figure out args issue and add tests

* Add some tests for uniq

* fmt

* remove commented out code

* Add documentation and some additional tests showing uniq values and rows. Also removed args TODO

* add changes that didn't get committed

* whoops, I didn't save the docs correctly...

* fmt

* Add a test for uniq with nested json

* Add another test

* Fix unique-ness when json keys are out of order and make the test json more complicated
2019-12-31 17:05:02 +13:00
dba82ac530 handle single quoted external command args (#1139)
fixes #1138
2019-12-31 06:47:14 +13:00
0615adac94 Inc refactoring, Value helper test method extractions, and more integration helpers. (#1135)
* Manifests check. Ignore doctests for now.

* We continue with refactorings towards the separation of concerns between
crates. The common test-helper usage in `nu_plugin_inc` and `nu_plugin_str`
has been refactored into `nu-plugin` value test helpers.

Inc also uses the new API for integration tests.
2019-12-29 00:17:24 -05:00
21e508009f Refactor struct names for old commands (ls, cd, pwd) (#1133) 2019-12-29 10:33:31 +13:00
a9317d939f Update README.md 2019-12-28 15:27:51 +13:00
65d843c2a1 Merge pull request #1128 from andrasio/nu-plugin-extract
Extract nu-plugin crate.
2019-12-27 09:16:18 -05:00
f6c62bf121 Nu plugins now depend on nu-plugin crate. 2019-12-27 08:52:15 -05:00
b4bc5fe9af Merge pull request #1126 from jonathandturner/utf8_fix
UTF8 fix for twitter-reported issue
2019-12-27 19:48:42 +13:00
10368d7060 UTF8 fix for twitter-reported issue 2019-12-27 19:25:44 +13:00
68a314b5cb UTF8 fix for twitter-reported issue 2019-12-27 19:03:00 +13:00
3c7633ae9f Merge pull request #1125 from notryanb/update-readme
update readme to reflect >= 0.7.2 $nu variables
2019-12-27 15:42:25 +13:00
dba347ad00 update readme to show >= 0.7 nu path 2019-12-26 20:08:30 -05:00
bfba2c57f8 Merge pull request #1124 from quebin31/master
Fix positional macro on crate nu-macros
2019-12-27 07:16:47 +13:00
c69bf9f46f Merge branch 'master' of https://github.com/nushell/nushell 2019-12-26 12:32:28 -05:00
7ce1ddc6fd Fixed optional and required argument in signature.
This fixes issues like #1117
2019-12-26 12:29:41 -05:00
e7ce6f2fcd Merge pull request #1113 from jonathandturner/bump_0_7_2
Bump to 0.7.2
2019-12-24 14:51:58 +13:00
0c786bb890 Bump to 0.7.2 2019-12-24 14:51:10 +13:00
8d31c32bda Merge pull request #1112 from jonathandturner/assorted_fixes
Fix an assortment of issues
2019-12-24 14:45:15 +13:00
e7fb15be59 Fix an assortment of issues 2019-12-24 14:26:47 +13:00
be7550822c Merge pull request #1109 from nushell/ctrl_l_clear
Move to git rustyline to fix Ctrl-L
2019-12-24 05:48:42 +13:00
0ce216eec4 Move to git rustyline to fix Ctrl+L 2019-12-24 05:26:30 +13:00
1fe85cb91e Merge pull request #1108 from thegedge/faster-pipelines
Wait for process instead of polling its status.
2019-12-23 07:06:16 +13:00
8cadc5a4ac Wait for process instead of polling its status.
This provides a huge performance boost for pipelines that end in an
external command. Rough testing shows an improvement from roughly 400ms
to 30ms when `cat`-ing a large file.
2019-12-22 14:14:03 -03:30
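A minimal standard-library sketch of the difference, assuming a hypothetical external command; `wait` blocks until the child exits instead of repeatedly polling `try_wait`:

```
use std::process::Command;

fn main() -> std::io::Result<()> {
    // Hypothetical external invocation; any long-running command illustrates the point.
    let mut child = Command::new("cat").arg("large-file.txt").spawn()?;

    // Block until the child exits rather than looping on `try_wait()` and sleeping.
    let status = child.wait()?;
    println!("external exited with {}", status);
    Ok(())
}
```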
f9da7f7d58 Merge pull request #1102 from jonathandturner/bump_nu
Bump nu version
2019-12-20 10:54:30 +13:00
367f11a62e Bump nu version 2019-12-20 09:03:54 +13:00
8a45ca9cc3 Merge pull request #1100 from nushell/fix-stable
Fix the stable plugins to correct list
2019-12-20 06:37:55 +13:00
e336930fd8 Update Cargo.toml 2019-12-20 06:18:06 +13:00
172ccc910e Fix the stable plugins to correct list 2019-12-20 06:01:42 +13:00
a8425daf14 Merge pull request #1097 from jonathandturner/fix_workspace
Fix the workspace I commented out
2019-12-18 10:14:16 -08:00
b629136528 Fix the workspace I commented out 2019-12-19 06:58:23 +13:00
91ebb7f718 Merge pull request #1096 from jonathandturner/copy_core_plugins
Copy core plugins back so we can publish
2019-12-18 08:54:31 -08:00
96484161c0 Copy core plugins back so we can publish 2019-12-19 05:35:17 +13:00
d21ddeeae6 Merge pull request #1094 from jonathandturner/rename_test_support
Rename test-support to nu-test-support
2019-12-17 11:08:24 -08:00
4322d373e6 More renames 2019-12-18 07:54:39 +13:00
08571392e6 Rename test-support to nu-test-support 2019-12-18 07:41:47 +13:00
f52235b1c1 Merge pull request #1093 from jonathandturner/fix_asset
Try to fix asset building
2019-12-17 10:28:15 -08:00
a66147da47 Try to fix asset building 2019-12-18 07:09:38 +13:00
df778afd1f Try to fix asset building 2019-12-18 07:05:12 +13:00
d7ddaa376b Merge pull request #1092 from jonathandturner/oops
More oops
2019-12-17 09:11:52 -08:00
2ce892c6f0 More oops 2019-12-18 06:11:14 +13:00
28179ef450 Merge pull request #1091 from jonathandturner/add_descs
Oops
2019-12-17 09:09:30 -08:00
2c6336c806 Oops 2019-12-18 06:08:45 +13:00
761fc9ae73 Merge pull request #1090 from jonathandturner/add_descs
Add missing descriptions and licenses to subcrates
2019-12-17 09:07:36 -08:00
314c3c4a97 Add missing descriptions and licenses to subcrates 2019-12-18 06:07:00 +13:00
f7f1fba94f Merge pull request #1089 from jonathandturner/bump
Bump Nu version
2019-12-17 08:54:02 -08:00
14817ef229 Subcrate versions 2019-12-18 05:18:10 +13:00
98233dcec1 Subcrate versions 2019-12-18 05:09:53 +13:00
6540509911 Bump Nu version 2019-12-18 04:55:49 +13:00
594eae1cbc Merge pull request #1085 from andrasio/externals-line
$it can contain a string line or plain string data.
2019-12-16 17:42:49 -05:00
5e961815fc $it can contain a string line or plain string data. 2019-12-16 17:27:36 -05:00
fa9329c8e3 Merge pull request #1082 from sebastian-xyz/update-book-links
update links to books
2019-12-15 14:34:38 -08:00
6c577e18ca Merge pull request #1081 from andrasio/test-extract
Start test organization facelift.
2019-12-15 11:46:58 -05:00
4034129dba This commit is the continuing phase of extracting functionality to subcrates. We extract test helpers and begin to change Nu shell's test organization along with it. 2019-12-15 11:34:58 -05:00
52cf65c19e Merge pull request #1080 from andrasio/command-refactor
Separate internal and external command definitions.
2019-12-15 08:49:45 -05:00
cbbb246a6d update links to books 2019-12-15 13:56:26 +01:00
87cc6d6f01 Separate internal and external command definitions. 2019-12-15 01:24:31 -05:00
4b9ef5a9d0 Merge pull request #1079 from jonathandturner/bump_some_deps
Bump heim and necessary deps
2019-12-14 09:32:23 -08:00
31c703891a Bump heim and necessary deps 2019-12-15 02:27:14 +13:00
550bda477b Merge pull request #1060 from naufraghi/issues-972-expand-tilde-as-home-in-external-commands
Expand tilde as home in external commands
2019-12-13 08:46:08 -08:00
219b7e64cd Use shellexpand to expand ~ in external commands
Add tests for ~tilde expansion:

- test that "~" is expanded (no more "~" in output)
- ensure that "1~1" is not expanded to "1/home/user1" as it was
  before

Fixes #972

Note: the first test does not check the literal expansion because
the path on Windows is expanded as a Linux path, but the correct
expansion may come for free once `shellexpand` uses the `dirs`
crate too (https://github.com/netvl/shellexpand/issues/3).
2019-12-13 11:54:41 +01:00
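A tiny sketch of the behaviour the tests above describe, assuming the `shellexpand` crate and a home directory being set; the wrapper function is illustrative:

```
// Hypothetical wrapper around the crate call used by the commit above.
fn expand_arg(arg: &str) -> String {
    shellexpand::tilde(arg).into_owned()
}

fn main() {
    // A leading "~" expands to the home directory (assuming one is set)...
    assert_ne!(expand_arg("~"), "~");
    // ...while an embedded tilde such as "1~1" is left untouched.
    assert_eq!(expand_arg("1~1"), "1~1");
}
```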
98c59f77b2 Merge pull request #1078 from nushell/enable_coloring_in_tokens
Remove the coloring_in_tokens feature flag
2019-12-12 13:08:35 -08:00
e8800fdd0c Remove the coloring_in_tokens feature flag
Stabilize and enable
2019-12-12 11:34:43 -08:00
09f903c37a Merge pull request #1077 from nushell/implement-signature-syntax
Add Range and start Signature support
2019-12-11 21:58:09 -08:00
57af9b5040 Add Range and start Signature support
This commit contains two improvements:

- Support for a Range syntax (and a corresponding Range value)
- Work towards a signature syntax

Implementing the Range syntax resulted in cleaning up how operators in
the core syntax works. There are now two kinds of infix operators

- tight operators (`.` and `..`)
- loose operators

Tight operators may not be interspersed (`$it.left..$it.right` is a
syntax error). Loose operators require whitespace on both sides of the
operator, and can be arbitrarily interspersed. Precedence is left to
right in the core syntax.

Note that delimited syntax (like `( ... )` or `[ ... ]`) is a single
token node in the core syntax. A single token node can be parsed from
beginning to end in a context-free manner.

The rule for `.` is `<token node>.<member>`. The rule for `..` is
`<token node>..<token node>`.

Loose operators all have the same syntactic rule: `<token
node><space><loose op><space><token node>`.

The second aspect of this pull request is the beginning of support for a
signature syntax. Before implementing signatures, a necessary
prerequisite is for the core syntax to support multi-line programs.

That work establishes a few things:

- `;` and newlines are handled in the core grammar, and both count as
  "separators"
- line comments begin with `#` and continue until the end of the line

In this commit, multi-token productions in the core grammar can use
separators interchangeably with spaces. However, I think we will
ultimately want a different rule preventing separators from occurring
before an infix operator, so that the end of a line is always
unambiguous. This would avoid gratuitous differences between modules and
repl usage.

We already effectively have this rule, because otherwise `x<newline> |
y` would be a single pipeline, but of course that wouldn't work.
2019-12-11 16:41:07 -08:00
16272b1b20 Merge pull request #1076 from jonathandturner/finish_plugin_refactor
Trying this as a workaround to the [[bin]] issue
2019-12-09 20:20:51 -08:00
1dcbd89a89 Trying this as a workaround to the [[bin]] issue 2019-12-10 16:57:55 +13:00
eb6ef02ad1 Merge pull request #1075 from jonathandturner/finish_plugin_refactor
Finish plugin refactor
2019-12-09 18:34:26 -08:00
17586bdfbd Fix missing dep 2019-12-10 15:13:22 +13:00
0e98cf3f1e Merge branch 'finish_plugin_refactor' of github.com:jonathandturner/nushell into finish_plugin_refactor 2019-12-10 13:59:44 +13:00
e2a95c3e1d Move str and inc to core plugins 2019-12-10 13:59:13 +13:00
5cb7df57fc Update azure-pipelines.yml 2019-12-10 13:09:25 +13:00
88f899d341 Move some plugins back to being core shippable plugins 2019-12-10 13:05:40 +13:00
7d70b5feda Try to fix CI with new subcrates 2019-12-10 08:14:58 +13:00
fd6ee03391 Remove old ValueExt 2019-12-10 07:52:01 +13:00
9f702fe01a Move the remainder of the plugins to crates 2019-12-10 07:39:51 +13:00
c9d9eec7f8 Merge pull request #1073 from jonathandturner/docker_wrap
Remove partial docker plugin. Embed->wrap
2019-12-08 21:08:03 -08:00
38cbfdb8a9 Remove partial docker plugin. Embed->wrap 2019-12-09 17:41:09 +13:00
f9b7376949 Merge pull request #1072 from jonathandturner/format_parse
Move format/parse to core commands
2019-12-08 18:26:35 -08:00
e98ed1b43d Move format/parse to core commands 2019-12-09 15:04:13 +13:00
251c3e103d Move format/parse to core commands 2019-12-09 14:57:53 +13:00
d26e938436 Merge pull request #1071 from jonathandturner/fix_1068
Fix 1068
2019-12-08 12:38:10 -08:00
dbadf9499e Fix 1068 2019-12-09 08:15:14 +13:00
28df1559ea Merge pull request #1070 from jonathandturner/upgrade_some_deps
Upgrade some dependencies
2019-12-08 10:19:39 -08:00
91784218c0 Upgrade some dependencies 2019-12-09 06:56:21 +13:00
eeec5e10c3 Merge pull request #1069 from jonathandturner/param_complete
Named param completion
2019-12-08 08:55:13 -08:00
0515ed976c Fix panic 2019-12-09 05:36:24 +13:00
f653992b4a A little cleanup 2019-12-08 19:42:43 +13:00
b5f8c1cc50 param completions work now 2019-12-08 19:23:31 +13:00
f9a46ce1e7 WIP param completions 2019-12-08 19:04:23 +13:00
b6ba7f97fd WIP param completions 2019-12-08 18:58:53 +13:00
7a47905f11 Merge pull request #1066 from thibran/fix-more-clippy-warnings
Fix more Clippy warnings
2019-12-07 16:10:36 -08:00
683f4c35d9 Fix more Clippy warnings
cargo clippy -- -W clippy::correctness
2019-12-07 21:04:58 +01:00
dfa5173cf4 Merge pull request #1064 from thibran/split-table-from-list
split format/table::from_list into multiple functions
2019-12-07 09:00:14 -08:00
04b214bef6 split format/table::from_list into multiple functions 2019-12-07 14:52:52 +01:00
37cb7fec77 Merge pull request #1063 from jonathandturner/unused_deps
Remove some unused deps
2019-12-06 23:44:52 -08:00
8833969e4a Remove some unused deps 2019-12-07 20:23:29 +13:00
bda238267c Merge pull request #1062 from jonathandturner/fetch_post
Fetch/post as plugins
2019-12-06 22:46:30 -08:00
d07dc57537 Add missing fallback case 2019-12-07 19:24:58 +13:00
d0a2888e88 Finish adding makeshift support for to fetch/post plugins 2019-12-07 17:23:59 +13:00
cec2eff933 Merge branch 'master' into fetch_post 2019-12-07 16:53:50 +13:00
38b7a3e32b WIP move post/fetch to plugins 2019-12-07 16:46:05 +13:00
9dfb6c023f Merge pull request #1061 from thibran/fix-most-clippy-warnings
Fix most Clippy performance warnings
2019-12-06 19:26:20 -08:00
cde92a9fb9 Fix most Clippy performance warnings
command used: cargo clippy -- -W clippy::perf
2019-12-06 23:25:47 +01:00
5622bbdd48 Merge pull request #1059 from coolshaurya/patch-1
Fix minor error in reject command docs
2019-12-06 08:13:55 -08:00
3d79a9c37a Fix minor error in reject command docs 2019-12-06 17:27:14 +05:30
a2a5b30568 Merge pull request #1058 from jonathandturner/edit_insert_core
Move edit and insert to core
2019-12-05 12:42:19 -08:00
768adb84a4 Remove commented out region 2019-12-06 09:19:24 +13:00
26b0250e22 Remove commented out region 2019-12-06 09:18:16 +13:00
6893850fce Move edit and insert to core 2019-12-06 09:15:41 +13:00
8834e6905e Merge pull request #1055 from jonathandturner/ps_sys_crates
Extract ps and sys subcrates. Move helper methods to UntaggedValue
2019-12-04 12:24:45 -08:00
1d5f13ddca formatting 2019-12-05 08:57:03 +13:00
d12c16a331 Extract ps and sys subcrates. Move helper methods to UntaggedValue 2019-12-05 08:52:31 +13:00
ecf47bb3ab Merge pull request #1054 from jonathandturner/binaryview_crate
Move binaryview to a sub-crate
2019-12-04 10:17:01 -08:00
a4bb5d4ff5 Move binaryview to a sub-crate 2019-12-05 06:51:20 +13:00
e9ee7bda46 Merge pull request #1052 from jonathandturner/fix_textview
Re-enable the textview plugin, now its own crate
2019-12-04 08:49:40 -08:00
1d196394f6 Merge pull request #1045 from sebastian-xyz/range
add range command
2019-12-04 08:37:03 -08:00
cfda67ff82 Finish making the textview plugin optional 2019-12-05 05:28:48 +13:00
59510a85d1 fix build warnings 2019-12-04 17:13:21 +01:00
35edf22ac3 Test all subcrates 2019-12-04 19:53:06 +13:00
871fc72892 Test all subcrates 2019-12-04 19:49:38 +13:00
1fcf671ca4 Re-enable the textview plugin, now its own crate 2019-12-04 19:38:40 +13:00
ecebe1314a update to new crates structure 2019-12-03 20:56:39 +01:00
bda5db59c8 Merge remote-tracking branch 'upstream/master' into range 2019-12-03 20:23:49 +01:00
4526d757b6 Merge pull request #1049 from andrasio/embed-list
embed as column when embedding a list
2019-12-03 02:51:58 -05:00
e5405d7f5c embed as column when embedding a list 2019-12-03 02:26:01 -05:00
201506a5ad add tests for range + run rustfmt 2019-12-03 08:24:49 +01:00
49f9253ca2 Merge pull request #1047 from jonathandturner/new_lines
Add new line primitive, bump version, allow bare filepaths
2019-12-02 23:14:08 -08:00
efc879b955 Add new line primitive, bump version, allow bare filepaths 2019-12-03 19:44:59 +13:00
3fa03eb7a4 Merge pull request #1046 from nushell/fix-external-words
Clean up expansion of external words
2019-12-02 17:12:50 -08:00
24bad78607 Clean up expansion of external words
Previously, external words accidentally used
ExpansionRule::new().allow_external_command(), when it should have been
ExpansionRule::new().allow_external_word().

External words are the broadest category in the parser, and are the
appropriate category for external arguments. This was just a mistake.
2019-12-02 16:34:33 -08:00
8de4c9dbb7 Merge pull request #1044 from nushell/protocol-extraction
Extract into crates
2019-12-02 14:29:04 -08:00
f858e854bf Fix a rebase mistake 2019-12-02 13:48:34 -08:00
87dbd3d5ac Extract build.rs 2019-12-02 13:14:51 -08:00
fe66b4c8ea Merge remote-tracking branch 'origin/master' into protocol-extraction 2019-12-02 11:16:00 -08:00
8390cc97e1 add range command 2019-12-02 20:15:14 +01:00
c0a7d4e2a7 Update .gitpod.yml 2019-12-02 11:02:59 -08:00
ce23a672d9 add documentation for compact command 2019-12-02 11:02:59 -08:00
9851317aeb add documentation for default command 2019-12-02 11:02:59 -08:00
3fb4a5d6e6 add documentation for format 2019-12-02 11:02:59 -08:00
340e701124 fix error in save.md 2019-12-02 11:02:59 -08:00
36938a4407 add documentation for save, config 2019-12-02 11:02:59 -08:00
6a6589a357 Update where.md 2019-12-02 11:02:59 -08:00
b94a32e523 add documentation for from-json, from-yaml, history, split-row 2019-12-02 11:02:59 -08:00
7db3c69984 update histogram, nth documentation 2019-12-02 11:02:59 -08:00
5406450c42 Add documentation for histogram, split-column 2019-12-02 11:02:59 -08:00
d6a6e16d21 Switch to the new Cargo.lock format
This was achieved by deleting Cargo.lock
and letting a recent Cargo nightly re-create
it. Support for the format was already
introduced in Rust 1.38, but currently,
stable releases of Cargo only retain it
if encountered but don't generate such
files by default.

The new format is smaller, better suited to
prevent merge conflicts and generates smaller
diffs at dependency updates, leading to
smaller git history.

You can read more about it in this PR: https://github.com/rust-lang/cargo/pull/7070
2019-12-02 11:02:59 -08:00
ea1b65916d Update Cargo.toml 2019-12-02 11:02:59 -08:00
cd9d9ad50b improve duration print 2019-12-02 11:02:58 -08:00
552272b37e replace and find-replace str plugin additions. 2019-12-02 11:02:58 -08:00
388ce738e3 expand tilde in externals 2019-12-02 11:02:58 -08:00
ef7fbcbe9f Update README.md 2019-12-02 11:02:58 -08:00
80941ace37 Add 0.6.1 release 2019-12-02 11:02:58 -08:00
f317500873 Update from-yaml.md 2019-12-02 11:02:58 -08:00
911414a190 Update config.md 2019-12-02 11:02:58 -08:00
cca6360bcc add documentation for from-tsv, from-xml 2019-12-02 11:02:58 -08:00
f68503fa21 add documentation for get, ps 2019-12-02 11:02:58 -08:00
911b69dff0 Update some command docs 2019-12-02 11:02:58 -08:00
4115634bfc Try to re-apply #1039 2019-12-02 11:02:58 -08:00
8a0bdde17a Remove env var from starship 2019-12-02 11:02:58 -08:00
a1e21828d6 Fix tests 2019-12-02 11:02:57 -08:00
0f193c2337 Update histogram.rs 2019-12-02 11:02:57 -08:00
526d94d862 improve duration print
original commit: ddb9d3a864
2019-12-02 11:02:57 -08:00
2fdafa52b1 replace and find-replace str plugin additions. 2019-12-02 11:02:57 -08:00
f52c0655c7 expand tilde in externals
original: 9f42d7693f
2019-12-02 11:02:57 -08:00
97331c7b25 Update README 2019-12-02 11:02:57 -08:00
1fb5a419a7 Bump the release version 2019-12-02 11:02:57 -08:00
4e9afd6698 Refactor classified.rs into separate modules.
Adds modules for internal, external, and dynamic commands, as well as
the pipeline functionality. These are exported as their old names from
the classified module so as to keep its "interface" the same.
2019-12-02 11:02:57 -08:00
8f9dd6516e Add =~ and !~ operators on strings
`left =~ right` returns true if left contains right, using Rust's
`String::contains`. `!~` is the negated version.

A new `apply_operator` function is added which decouples evaluation from
`Value::compare`. This returns a `Value` and opens the door to
implementing `+` for example, though it wouldn't be useful immediately.

The `operator!` macro had to be changed slightly as it would choke on
`~` in arguments.
2019-12-02 11:02:57 -08:00
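A minimal illustrative sketch (not the commit's actual code) of the contains/not-contains check described above, decoupled into an `apply_operator`-style function built on Rust's `String::contains`; the `Operator` enum and function names here are assumptions for illustration only.

```rust
// Illustrative sketch only; the real nushell types and operators differ.
#[derive(Debug, Clone, Copy)]
enum Operator {
    Contains,    // =~
    NotContains, // !~
}

// Decouple operator evaluation from plain comparison: return a value
// instead of only an ordering, leaving room for operators like `+` later.
fn apply_operator(op: Operator, left: &str, right: &str) -> bool {
    match op {
        Operator::Contains => left.contains(right),
        Operator::NotContains => !left.contains(right),
    }
}

fn main() {
    assert!(apply_operator(Operator::Contains, "nushell", "shell"));
    assert!(apply_operator(Operator::NotContains, "nushell", "zsh"));
    println!("=~ / !~ sketch ok");
}
```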
e4226def16 Extract core stuff into own crates
This commit extracts five new crates:

- nu-source, which contains the core source-code handling logic in Nu,
  including Text, Span, and also the pretty.rs-based debug logic
- nu-parser, which is the parser and expander logic
- nu-protocol, which is the bulk of the types and basic conveniences
  used by plugins
- nu-errors, which contains ShellError, ParseError and error handling
  conveniences
- nu-textview, which is the textview plugin extracted into a crate

One of the major consequences of this refactor is that it's no longer
possible to `impl X for Spanned<Y>` outside of the `nu-source` crate, so
a lot of types became more concrete (Value became a concrete type
instead of Spanned<Value>, for example).

This also turned a number of inherent methods in the main nu crate into
plain functions (impl Value {} became a bunch of functions in the
`value` namespace in `crate::data::value`).
2019-12-02 10:54:12 -08:00
c199a84dbb Merge pull request #1039 from thegedge/move-pipeline-execution-out-of-cli
Move pipeline execution code into classified::Pipeline
2019-12-01 19:47:34 -08:00
5a4ca11362 Merge pull request #1043 from JesterOrNot/master
install all features for nushell for gitpod
2019-12-01 18:32:15 -08:00
f2968c8385 Update .gitpod.yml 2019-12-01 17:16:53 -06:00
8d01b019f4 Merge pull request #1041 from tchak/docs-compact-default
document compact and default
2019-12-01 09:01:50 -08:00
bf87330d6e add documentation for compact command 2019-12-01 17:44:43 +01:00
2bb85bdbd4 add documentation for default command 2019-12-01 17:39:09 +01:00
8f34c6eeda Merge pull request #1032 from sebastian-xyz/doc
add documentation for save, config, get, ps, from-tsv, from-xml
2019-11-30 18:15:39 -08:00
ac5543bad9 Move pipeline execution code into classified::Pipeline 2019-11-30 16:12:34 -05:00
e4c56a25c6 Merge remote-tracking branch 'refs/remotes/origin/doc' into doc 2019-11-30 21:21:15 +01:00
11ff8190b1 add documentation for format 2019-11-30 21:15:12 +01:00
9bd25d7427 fix error in save.md 2019-11-30 21:07:43 +01:00
5676713b1f Update README.md 2019-12-01 07:12:14 +13:00
b59231d32b Merge pull request #1035 from jonathandturner/bump_to_0_6_1
Add 0.6.1 release
2019-11-30 10:11:31 -08:00
e530cf0a9d Add 0.6.1 release 2019-12-01 07:10:51 +13:00
6bfb4207c4 Update from-yaml.md 2019-12-01 07:00:36 +13:00
c63ad610f5 Update config.md 2019-12-01 06:59:53 +13:00
e38a4323b4 add documentation for from-tsv, from-xml 2019-11-30 13:38:52 +01:00
d40aea5d0a add documentation for get, ps 2019-11-30 12:48:23 +01:00
1ba69e4b11 Merge pull request #1030 from jonathandturner/more_doc_updates
Update some command docs
2019-11-29 17:53:02 -08:00
f10390b1be Update some command docs 2019-11-30 14:24:39 +13:00
c2b1908644 Merge pull request #1029 from jonathandturner/fix_starship_env_var
Remove env var from starship
2019-11-29 12:00:10 -08:00
0a93335f6d Remove env var from starship 2019-11-30 08:38:44 +13:00
fbb65cde44 add documentation for save, config 2019-11-29 18:15:51 +01:00
8e7acd1094 Update where.md 2019-11-29 08:41:27 +13:00
c6ee6273db Merge pull request #1015 from sebastian-xyz/doc
Add documentation for histogram, split-column
2019-11-28 11:19:36 -08:00
c77059f891 add documentation for from-json, from-yaml, history, split-row 2019-11-28 19:33:17 +01:00
5bdda06ca6 update histogram, nth documentation 2019-11-28 19:32:31 +01:00
d8303dd6d6 Merge pull request #1026 from est31/new-cargo-lock
Switch to the new Cargo.lock format
2019-11-27 19:11:54 -08:00
60ec68b097 Switch to the new Cargo.lock format
This was achieved by deleting Cargo.lock
and letting a recent Cargo nightly re-create
it. Support for the format was already
introduced in Rust 1.38, but currently,
stable releases of Cargo only retain it
if encountered but don't generate such
files by default.

The new format is smaller, better suited to
prevent merge conflicts and generates smaller
diffs at dependency updates, leading to
smaller git history.

You can read more about it in this PR: https://github.com/rust-lang/cargo/pull/7070
2019-11-28 03:27:37 +01:00
deae66c194 Cargo update
This performs a cargo update to allow the upcoming commit
that switches to the new Cargo.lock format to be only about
that format change.
2019-11-28 03:17:31 +01:00
0bdb6e735a Update Cargo.toml 2019-11-27 17:14:45 +13:00
7933e01e77 Update Cargo.toml 2019-11-27 15:55:02 +13:00
b443a2d713 Merge pull request #1017 from jonathandturner/better_duration
improve duration print
2019-11-27 15:32:17 +13:00
7a28ababd1 Update histogram.rs 2019-11-27 15:32:05 +13:00
ddb9d3a864 improve duration print 2019-11-27 15:07:55 +13:00
186b75a848 Merge pull request #1016 from andrasio/str
replace and find-replace str plugin additions.
2019-11-26 19:29:16 -05:00
8cedd2ee5b replace and find-replace str plugin additions. 2019-11-26 19:03:22 -05:00
0845572878 Add documentation for histogram, split-column 2019-11-26 20:47:34 +01:00
2e4b0b0b17 Merge pull request #1014 from jonathandturner/fix_1013
expand tilde in externals
2019-11-27 06:52:30 +13:00
9f42d7693f expand tilde in externals 2019-11-27 06:34:02 +13:00
3424334ce5 Merge pull request #1012 from jonathandturner/bump_release_version
Bump release version
2019-11-26 21:21:33 +13:00
c68d236fd7 Update README 2019-11-26 21:00:34 +13:00
7c6e82c990 Bump the release version 2019-11-26 20:59:43 +13:00
eb5d0d295b Merge pull request #1009 from nushell/cleanup-wip
Extract nu_source into a crate
2019-11-25 19:54:22 -08:00
2eae5a2a89 Merge remote-tracking branch 'origin/master' into cleanup-wip 2019-11-25 19:25:12 -08:00
595c9f2999 Merge branch 'master' into cleanup-wip 2019-11-25 18:32:24 -08:00
70d63e34e9 Merge pull request #1008 from thegedge/move-pipeline-to-classified
Move pipeline code from cli to classified
2019-11-25 18:21:07 -05:00
83ac65ced3 Merge pull request #997 from bndbsh/operator-contains
Add `=~` and `!~` operators on strings
2019-11-25 18:19:58 -05:00
be140382cf Merge pull request #1011 from andrasio/nth-checks
nth can select more than one row at a time.
2019-11-25 17:55:33 -05:00
d320ffe742 nth can select more than one row at a time. 2019-11-25 17:16:58 -05:00
fbc6f01cfb Add =~ and !~ operators on strings
`left =~ right` returns true if left contains right, using Rust's
`String::contains`. `!~` is the negated version.

A new `apply_operator` function is added which decouples evaluation from
`Value::compare`. This returns a `Value` and opens the door to
implementing `+` for example, though it wouldn't be useful immediately.

The `operator!` macro had to be changed slightly as it would choke on
`~` in arguments.
2019-11-25 15:06:11 -05:00
3008434c0f Eliminate repetitive code and fix Unix failure 2019-11-25 11:09:59 -08:00
5fbea31d15 Remove unused Display implementations
After the previous commit, nushell uses PrettyDebug and
PrettyDebugWithSource for our pretty-printed display output.

PrettyDebug produces a structured `pretty.rs` document rather than
writing directly into a fmt::Formatter, and types that implement
`PrettyDebug` have a convenience `display` method that produces a string
(to be used in situations where `Display` is needed for compatibility
with other traits, or where simple rendering is appropriate).
2019-11-25 10:07:20 -08:00
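A hedged, simplified sketch of the convenience described above: a pretty-debug style trait with a default `display` helper that renders to a `String` for places that still need `Display`-like output. The real trait builds a structured pretty.rs document; the names and signatures below are illustrative assumptions.

```rust
use std::fmt::Write;

// Illustrative only: a trait that renders structured debug output, plus a
// convenience method that produces a String where Display-style output is needed.
trait PrettyDebug {
    fn pretty(&self, out: &mut String);

    // Convenience: render to a String for compatibility with other traits.
    fn display(&self) -> String {
        let mut out = String::new();
        self.pretty(&mut out);
        out
    }
}

struct Span {
    start: usize,
    end: usize,
}

impl PrettyDebug for Span {
    fn pretty(&self, out: &mut String) {
        let _ = write!(out, "span[{}..{}]", self.start, self.end);
    }
}

fn main() {
    println!("{}", Span { start: 3, end: 9 }.display());
}
```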
f70c6d5d48 Extract nu_source into a crate
This commit extracts Tag, Span, Text, as well as source-related debug
facilities into a new crate called nu_source.

This change is much bigger than one might have expected because the
previous code relied heavily on implementing inherent methods on
`Tagged<T>` and `Spanned<T>`, which is no longer possible.

As a result, this change creates more concrete types instead of using
`Tagged<T>`. One notable example: Tagged<Value> became Value, and Value
became UntaggedValue.

This change clarifies the intent of the code in many places, but it does
make it a big change.
2019-11-25 07:37:33 -08:00
71e7eb7cfc Move all pipeline execution code from cli to classified::pipeline 2019-11-24 22:52:37 -05:00
339ec46961 Refactor classified.rs into separate modules.
Adds modules for internal, external, and dynamic commands, as well as
the pipeline functionality. These are exported as their old names from
the classified module so as to keep its "interface" the same.
2019-11-24 17:19:12 -05:00
fe53c37654 Merge pull request #1006 from andrasio/additions
Default.
2019-11-24 04:55:12 -05:00
06857fbc52 Take all rows having the column present. 2019-11-24 04:35:36 -05:00
1c830b5c95 default command introduced. 2019-11-24 04:20:08 -05:00
a74145961e Always check the row's columns. 2019-11-24 01:25:41 -05:00
91698b2657 Merge pull request #1003 from andrasio/compact
Compact.
2019-11-23 22:03:20 -05:00
40fd8070a9 Merge pull request #1004 from jonathandturner/revert_some_table_changes
Revert some of the recent styled string changes
2019-11-24 14:28:58 +13:00
4d5f1f6023 Revert some of the recent styled string changes 2019-11-24 13:56:19 +13:00
bc2d65cd2e Remove raw data debugging. 2019-11-23 19:16:25 -05:00
1a0b339897 compact command introduced. 2019-11-23 19:05:44 -05:00
8d3a937413 Display raw debugging data (rust representation). 2019-11-23 18:53:50 -05:00
e85e1b2c9e Merge pull request #986 from nushell/int-columns
Integer columns and better debug infra
2019-11-22 09:07:03 -08:00
c8aa8cb842 debug command facelift. 2019-11-22 03:31:58 -05:00
88c4473283 Remove fuzzysearch. 2019-11-22 03:25:09 -05:00
f4d9975dab Clean up feature build flags. 2019-11-22 03:11:36 -05:00
6e8b768d79 Requiring at least one member is no longer necessary. 2019-11-22 01:18:06 -05:00
cdb0eeafa2 --no-edit 2019-11-21 14:22:32 -08:00
388fc24191 Merge pull request #990 from drmason13/combine-csv-and-tsv
combine functions behind to/from-c/tsv commands
2019-11-19 11:29:33 -05:00
b3c021899c combine functions behind to/from-c/tsv commands
fixes #969, admittedly without a --delimiter alias

moves from_structured_data.rs to from_delimited_data.rs to better
identify its scope, and adds to_delimited_data.rs. Now csv and tsv both
use the same code; tsv passes in a fixed '\t' argument, while csv passes
in the value of --separator
2019-11-19 16:02:35 +00:00
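A hedged sketch of the shared-delimiter idea described in the commit above: one generic routine for delimited data, with csv forwarding the user's `--separator` and tsv passing a fixed `'\t'`. Function names and signatures here are illustrative, not the actual `from_delimited_data.rs` API.

```rust
// Illustrative only: a single delimited-data parser reused by csv and tsv.
fn from_delimited_data(input: &str, separator: char) -> Vec<Vec<String>> {
    input
        .lines()
        .map(|line| line.split(separator).map(|s| s.trim().to_string()).collect())
        .collect()
}

fn from_csv(input: &str, separator: char) -> Vec<Vec<String>> {
    // csv forwards whatever --separator the user supplied
    from_delimited_data(input, separator)
}

fn from_tsv(input: &str) -> Vec<Vec<String>> {
    // tsv always uses a fixed tab
    from_delimited_data(input, '\t')
}

fn main() {
    println!("{:?}", from_csv("a;b\n1;2", ';'));
    println!("{:?}", from_tsv("a\tb\n1\t2"));
}
```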
bff50c6987 Merge pull request #988 from jonathandturner/umask
Add umask to unix --full list
2019-11-19 21:10:15 +13:00
111fcf188e Add umask to unix --full list 2019-11-19 18:46:47 +13:00
015693aea7 Update README.md 2019-11-19 03:41:16 +13:00
03a52f1988 Merge pull request #984 from nushell/latest_nightly
Fix build errors on latest nightly
2019-11-18 16:33:46 +13:00
372f6c16b3 Fix build errors on latest nightly 2019-11-18 16:12:37 +13:00
c04da4c232 Merge pull request #982 from Aloso/master
Format durations nicely
2019-11-18 11:49:58 +13:00
a070cb8154 Format durations nicely 2019-11-17 22:51:56 +01:00
bf4273776f Merge pull request #980 from jonathandturner/remove_fuzzy_search
Remove fuzzy search because of compat issues
2019-11-18 08:22:45 +13:00
95ca3ed4fa Remove fuzzy search because of compat issues 2019-11-18 08:01:17 +13:00
54c0603263 Merge pull request #979 from jonathandturner/abbrev_ls
Abbreviate ls by default, add --full flag
2019-11-18 07:06:19 +13:00
c598cd4255 Fix tests 2019-11-18 06:38:44 +13:00
2bb03d9813 Abbreviate ls by default, add --full flag 2019-11-18 06:10:50 +13:00
9c41f581a9 Merge pull request #978 from jonathandturner/duration_primitive
Make duration its own primitive
2019-11-17 19:07:51 +13:00
6231367bc8 Make duration its own primitive 2019-11-17 18:48:48 +13:00
a7d7098b1a Merge pull request #977 from jonathandturner/from_xls
Add from-xlsx for importing excel files
2019-11-17 16:36:22 +13:00
90aeb700ea Add from_xlsx for importing excel files 2019-11-17 16:18:41 +13:00
9dfc647386 Merge pull request #976 from bndbsh/save-error
Improve error messages for save
2019-11-17 14:58:55 +13:00
f992f5de95 Update save.rs 2019-11-17 14:13:52 +13:00
946f7256e4 Improve error messages for save
`save` attempts to convert input based on the target filename extension,
and expects a stream of text otherwise. However, the error message was
unclear and provided little guidance; hopefully this is less confusing
to new users.

It might be worthwhile to also add a hint about adding an extension,
though I'm not sure if it's possible to emit multiple diagnostics.
2019-11-16 19:08:38 -05:00
57d425d929 Merge pull request #975 from jonathandturner/process_prompt_once
Process prompts once rather than twice
2019-11-17 10:22:49 +13:00
dd36bf07f4 Process prompts once rather than twice 2019-11-17 09:42:35 +13:00
406fb8d1d9 Merge pull request #973 from jonathandturner/fix_windows_starship
Give rustyline non-ansi to begin with. Fixes starship in Windows
2019-11-17 09:25:45 +13:00
2d4a225e2a Fix formatting 2019-11-17 09:06:00 +13:00
db218e06dc Give rustyline non-ansi to begin with. Fixes Windows 2019-11-17 09:02:26 +13:00
17e8a5ce38 Merge pull request #970 from jonathandturner/starship-prompt
Starship prompt
2019-11-17 06:43:59 +13:00
07db14f72e Merge master 2019-11-17 06:17:05 +13:00
412831cb9c Merge pull request #968 from sebastian-xyz/patch-4
add group-by command documentation
2019-11-17 05:59:41 +13:00
f4dc79f4ba add group-by command documentation 2019-11-16 15:31:28 +01:00
9cb573b3b4 Merge pull request #967 from jonathandturner/fix_warning
Fix build warnings
2019-11-16 22:05:28 +13:00
ce106bfda9 Fix build warnings 2019-11-16 21:23:04 +13:00
a3ffc4baf0 Merge pull request #966 from jonathandturner/duration_comparison
Add comparison between dates
2019-11-16 15:03:12 +13:00
3c3637b674 Add comparison between dates 2019-11-16 14:36:51 +13:00
bcecd08825 Merge pull request #965 from sebastian-xyz/patch-3
Add prepend command documentation
2019-11-16 06:19:14 +13:00
55f99073ad Merge pull request #964 from sebastian-xyz/patch-1
Add append command documentation
2019-11-16 06:18:35 +13:00
008c60651c Merge pull request #961 from rtlechow/patch-1
Document pivot command
2019-11-16 06:14:43 +13:00
63667d9e46 Add prepend command documentation 2019-11-15 15:53:58 +01:00
08b770719c Add append command documentation 2019-11-15 15:37:41 +01:00
e0d27ebf84 Merge pull request #960 from uma0317/master
Fix move file to different partition on Windows
2019-11-15 00:39:00 -05:00
0756145caf Fix move file to different partition on Windows 2019-11-15 11:52:51 +09:00
036860770b Document pivot command
Part of https://github.com/nushell/nushell/issues/711
2019-11-14 16:59:39 -05:00
aa1ef39da3 Merge pull request #916 from t-hart/pr/from-tsv-csv-headerless
Make --headerless treat first row as data
2019-11-14 05:34:49 +13:00
7c8969d4ea Merge pull request #957 from nushell/futures-codec-dgrade
Downgrade futures-codec.
2019-11-12 14:49:24 -05:00
87d58535ff Downgrade futures-codec. 2019-11-12 14:04:53 -05:00
1060ba2206 Fixes --headerless functionality for from-ssv.
Squashed commit of the following:

commit fc59d47a2291461d84e0587fc0fe63af0dc26f9f
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 15:39:38 2019 +0100

    Fixes inconsistencies in output.

commit da4084e9fdd983557b101207b381e333a443e551
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 13:04:10 2019 +0100

    remove unused enum.

commit 7f6a105879c8746786b99fb19bb9f0860c41796a
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 12:58:41 2019 +0100

    Starts refactoring from_ssv.

commit b70ddd169ef0c900e03fb590cb171cc7181528db
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 11:34:06 2019 +0100

    Fixes --headerless for non-aligned columns.

commit 6332778dd26de8d07be77b291124115141479892
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 10:27:35 2019 +0100

    Fixes from-ssv headerless aligned-columns logic.

commit 747d8c812e06349b4a15b8c130721881d86fff98
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 23:53:59 2019 +0100

    fixes unit tests for ssv.

commit c77cb451623b37a7a9742c791a4fc38cad053d3d
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 22:49:21 2019 +0100

    it compiles! one broken test.

commit 08a05964f56cf92507c255057d0aaf2b6dbb6f45
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 18:52:54 2019 +0100

    Backed into a corner. Help.

commit c95ab683025a8007b8a6f8e1659f021a002df584
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 17:30:54 2019 +0100

    broken but on the way
2019-11-12 16:04:55 +01:00
0401087175 Refactors out structured parsing logic to a separate module. 2019-11-12 16:04:55 +01:00
f8dc06ef49 Changes implementation of --headerless for from-tsv. 2019-11-12 16:04:55 +01:00
282cb46ff1 Implements --headerless for from-csv 2019-11-12 16:04:55 +01:00
a3ff5f1246 Updates tests for from tsv, csv, and ssv.
With the proposed changes, these tests now become invalid. If the first line is
to be counted as data, then converting the headers to ints will fail. Removing
the headers and instead treating the first line as data, however, reflects the
new, desired mode of operation.
2019-11-12 16:04:55 +01:00
5bb822dcd4 Merge pull request #954 from andrasio/reduce
Expose histogram and split-by command.
2019-11-12 04:10:37 -05:00
00b3c2036a This is part of ongoing work on capabilities for working with
tables and being able to use them for data processing and viewing
purposes. At the moment, among the ways to process such tables, we
are able to view a histogram of a given column.

As usage matures, we may find certain core commands that could
be used ergonomically when working with tables in Nu.
2019-11-12 03:39:30 -05:00
3163b0d362 Data processing mvp histogram. 2019-11-12 02:08:28 -05:00
21f48577ae Reductions placeholder. 2019-11-12 02:08:28 -05:00
11e4410d1c Merge pull request #806 from DrSensor/ci/github/quay.io
ci(github): replace docker.pkg.github.com with quay.io
2019-11-12 12:03:00 +13:00
27a950d28e Merge pull request #952 from JesterOrNot/master
edit install cmd
2019-11-11 14:40:09 -08:00
f3d056110a DOCKER_USER should come from secrets 2019-11-11 13:33:52 -05:00
b39c2e2f75 edit install cmd 2019-11-11 18:17:55 +00:00
7cf3c6eb95 Move env declaration to jobs.docker 2019-11-11 07:51:41 +07:00
cdec0254ec Merge pull request #951 from jonathandturner/bump_deps
Bump dep versions
2019-11-10 10:15:18 -08:00
02f3330812 Merge pull request #950 from coolshaurya/docs-size
Make documentation for size command
2019-11-10 09:52:43 -08:00
6f013d0225 Merge pull request #949 from coolshaurya/docs-count
Add docs for the count command
2019-11-10 09:51:25 -08:00
1f06f57de3 Merge pull request #948 from coolshaurya/docs-pick-reject
Add documentation for the pick and reject command
2019-11-10 09:50:29 -08:00
0f405f24c7 Bump dep versions 2019-11-11 06:48:49 +13:00
5a8128dd30 Make documentation for size command 2019-11-10 14:41:23 +05:30
50616cc62c Add docs for the count command
Partial fix of issue #711
2019-11-10 14:12:59 +05:30
9d345cab07 Add documentation for the pick and reject command
Partial fix of issue #711
2019-11-10 12:37:27 +05:30
59ab11e932 Merge pull request #947 from jonathandturner/bump_and_plugin_load
Bump Nu version and change plugin load logic for debug
2019-11-09 21:29:09 -08:00
df302d4bac Bump Nu version and change plugin load logic for debug 2019-11-10 16:44:05 +13:00
6bbfd0f4f6 Merge pull request #945 from jonathandturner/format
Format
2019-11-09 16:39:51 -08:00
943e0045e7 Update readme 2019-11-10 13:16:52 +13:00
62a5250554 Add format command 2019-11-10 13:14:59 +13:00
9043970e97 Merge pull request #943 from drmason13/from_csv-add-separator-arg
Add --separator argument to from_csv
2019-11-09 15:09:32 -08:00
d32c9ce1b6 Merge pull request #944 from jonathandturner/read_to_parse
Read to parse
2019-11-09 15:07:40 -08:00
73d8478678 Update readme 2019-11-10 11:27:56 +13:00
bab58576b4 Rename read to parse 2019-11-10 11:26:44 +13:00
41212c1ad1 Merge pull request #942 from BurNiinTRee/master
removed the requirement on the 'regex' feature for the match plugin
2019-11-08 09:11:36 -08:00
4a6122905b fmt: cargo fmt --all 2019-11-08 15:27:29 +00:00
15986c598a Add --separator command to from_csv
The command takes a string, checks that it is a single character, and then
passes it to csv::ReaderBuilder via the .delimiter() method as a u8.
2019-11-08 15:06:33 +00:00
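A minimal sketch of the flag handling described in the commit above, assuming a dependency on the `csv` crate: validate that the supplied separator string is a single character, then hand it to `csv::ReaderBuilder::delimiter` as a `u8`. Error handling and the surrounding command plumbing are simplified here.

```rust
use csv::ReaderBuilder;

// Sketch only: turn a user-supplied --separator string into a csv delimiter.
fn parse_with_separator(data: &str, separator: &str) -> Result<Vec<Vec<String>>, String> {
    let mut chars = separator.chars();
    let delim = match (chars.next(), chars.next()) {
        // accept exactly one ASCII character, since delimiter() wants a u8
        (Some(c), None) if c.is_ascii() => c as u8,
        _ => return Err(format!("expected a single character, got {:?}", separator)),
    };

    let mut reader = ReaderBuilder::new()
        .has_headers(false)
        .delimiter(delim)
        .from_reader(data.as_bytes());

    let mut rows = Vec::new();
    for record in reader.records() {
        let record = record.map_err(|e| e.to_string())?;
        rows.push(record.iter().map(|s| s.to_string()).collect());
    }
    Ok(rows)
}

fn main() {
    let rows = parse_with_separator("a;b\n1;2", ";").unwrap();
    println!("{:?}", rows);
}
```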
078342442d removed the requirement on the 'regex' feature for the match plugin
The nu_plugin_match binary wasn't built anymore
after the regex dependency was made non-optional in
https://github.com/nushell/nushell/pull/889, causing
the removal of the regex feature, which nu_plugin_match
depended on.
2019-11-08 13:33:28 +01:00
8855c54391 Update README.md 2019-11-08 08:19:41 +13:00
5dfc81a157 Merge pull request #940 from jonathandturner/stable_v2
Second attempt to remove rust-toolchain
2019-11-07 11:18:50 -08:00
c42d97fb97 try again 2019-11-08 08:00:46 +13:00
13314ad1e7 try again 2019-11-08 07:54:52 +13:00
ff6026ca79 try again 2019-11-08 07:47:43 +13:00
c6c6c0f295 try again 2019-11-08 07:44:34 +13:00
1cca5557b1 Second attempt to remove rust-toolchain 2019-11-08 07:27:39 +13:00
76208110b9 Update README.md 2019-11-08 07:17:12 +13:00
56dd0282f0 Merge pull request #938 from nushell/jonathandturner-patch-1
Update docker to stable
2019-11-07 09:59:53 -08:00
c01b602b86 Update docker to stable 2019-11-08 06:34:53 +13:00
d6f46236e9 Merge pull request #937 from nushell/jonathandturner-patch-1
Move azure pipeline to stable
2019-11-07 09:32:54 -08:00
104b30142f Move azure pipeline to stable 2019-11-08 06:13:39 +13:00
f3a885d920 Merge pull request #936 from nushell/move-to-stable
Move Nu to the stable Rust 1.39 release
2019-11-07 09:11:18 -08:00
60445b0559 Move Nu to the stable Rust 1.39 release 2019-11-08 05:51:21 +13:00
01d6287a8f Update README.md 2019-11-06 18:25:23 +13:00
0462b2db80 Merge pull request #925 from jonathandturner/bump_to_0_5_0
Bump version to 0.5.0
2019-11-06 18:24:45 +13:00
4cb399ed70 Bump version to 0.5.0 2019-11-06 18:24:04 +13:00
7ef9f7702f Merge pull request #924 from jonathandturner/help_flags_last
Move flags help to last
2019-11-06 15:50:25 +13:00
44a1686a76 Move flags help to last 2019-11-06 15:28:26 +13:00
15c6d24178 Merge pull request #919 from JesterOrNot/master
Update .gitpod.yml to install nu rather than just build!
2019-11-05 07:24:20 +13:00
3b84e3ccfe Update .gitpod.yml 2019-11-04 11:44:56 -06:00
da7d6beb22 Merge pull request #917 from thegedge/eliminate-is-first-command
Eliminate is_first_command by defaulting to Value::nothing()
2019-11-05 06:34:33 +13:00
f012eb7bdd Eliminate is_first_command by defaulting to Value::nothing() 2019-11-03 20:06:59 -05:00
f966394b63 Merge pull request #888 from andrasio/data-primitives
WIP [data processing]
2019-11-03 16:52:21 -05:00
889d2bb378 Isolate feature. 2019-11-03 16:36:47 -05:00
a2c4e485ba Merge pull request #914 from andrasio/column_path-semantic-fix
ColumnPaths should expect members encapsulated as members.
2019-11-03 06:56:19 -05:00
8860d8de8d At the moment, ColumnPaths represent a set of Members (e.g. package.authors is a column path of two members).
The functions for retrieving, replacing, and inserting values into values all assumed they receive the complete
column path as regular tagged strings. This commit changes them to accept tagged values instead. Basically,
it means we can have column paths containing strings and numbers (e.g. package.authors.1).

Unfortunately, for the moment, all members parsed and deserialized for a command that expects column paths
of tagged values will arrive as tagged values (encapsulating Members) that are strings only.

This makes it impossible to determine whether package.authors.1 or package.authors."1" (meaning the "number" 1) is
a string member or a number member, and thus prevents us from knowing, and enforcing for the user, that paths enclosed in double
quotes mean "retrieve the column with this name from the table" and that numbers are for retrieving a particular row number
from a table.

This commit puts in place the infrastructure needed for when integer members land; in the meantime, the workaround
is to convert the tagged values passed from the column paths back to strings.
2019-11-03 06:30:32 -05:00
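A hedged sketch of the member idea described in the commit above: a column path is a sequence of members that can be either a column name or a row number (e.g. `package.authors.1` versus `package.authors."1"`). The `Member` enum and the parser below are illustrative assumptions, not nushell's actual types.

```rust
// Illustrative sketch only: column-path members that are names or row numbers.
#[derive(Debug)]
enum Member {
    String(String), // a column name, e.g. "authors", or the quoted "1"
    Int(u64),       // a row index, e.g. the 1 in package.authors.1
}

// Hypothetical parser: double-quoted segments stay column names,
// bare numbers become row indices.
fn parse_column_path(path: &str) -> Vec<Member> {
    path.split('.')
        .map(|part| {
            if part.starts_with('"') && part.ends_with('"') && part.len() >= 2 {
                Member::String(part[1..part.len() - 1].to_string())
            } else if let Ok(n) = part.parse::<u64>() {
                Member::Int(n)
            } else {
                Member::String(part.to_string())
            }
        })
        .collect()
}

fn main() {
    // package.authors.1   -> row 1 of the authors table
    // package.authors."1" -> the column literally named 1
    println!("{:?}", parse_column_path("package.authors.1"));
    println!("{:?}", parse_column_path(r#"package.authors."1""#));
}
```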
d7b768ee9f Fallback internally to String primitives until Member int serialization lands. 2019-11-03 05:38:47 -05:00
6ea8e42331 Move column paths to support broader value types. 2019-11-03 05:38:47 -05:00
1b784cb77a Merge pull request #913 from andrasio/tests-builtins
`get` preserves anchored inputs.
2019-11-03 05:11:09 -05:00
4a0ec1207c Preserve anchored meta data for all get queries in the pipeline 2019-11-03 03:49:06 -05:00
ffb2fedca9 Update README.md 2019-11-03 18:24:11 +13:00
382b1ba85f Merge pull request #912 from jonathandturner/fix_910
Make column logic in from-ssv optional
2019-11-03 17:31:15 +13:00
3b42655b51 Make column logic in from-ssv optional 2019-11-03 17:04:59 +13:00
e43e906f86 Update README.md 2019-11-03 16:13:00 +13:00
e51d9d0935 Update README.md 2019-11-03 16:12:36 +13:00
f57489ed92 get command tests already present and move to their own. 2019-11-02 21:05:27 -05:00
503e521820 Merge pull request #909 from jonathandturner/config_set_into
Add support for config --set_into
2019-11-03 13:06:58 +13:00
c317094947 Add support for config --set_into 2019-11-03 12:43:15 +13:00
243df63978 Move config to async_stream 2019-11-03 12:22:30 +13:00
05ff102e09 Merge pull request #908 from jonathandturner/fix_907
Fix 907 and improve substring
2019-11-03 09:14:13 +13:00
cd30fac050 Approach fix differently 2019-11-03 08:57:28 +13:00
f589d3c795 Fix 907 and improve substring 2019-11-03 07:49:28 +13:00
51879d022e Merge pull request #895 from Flare576/substring
Adds new substring function to str plugin
2019-11-02 17:42:45 +13:00
2260b3dda3 Update str.rs 2019-11-02 17:25:20 +13:00
aa64442453 Merge pull request #906 from jonathandturner/nu_env_vars
Add initial support for env vars
2019-11-02 17:12:13 +13:00
129ee45944 Add initial support for env vars 2019-11-02 16:41:58 +13:00
2fe7d105b0 Merge pull request #905 from jonathandturner/add_to_insert
Rename add to insert
2019-11-02 15:07:41 +13:00
136c8acba6 Update README 2019-11-02 14:48:18 +13:00
e92d4b2ccb Rename add to insert 2019-11-02 14:47:14 +13:00
6e91c96dd7 Merge pull request #904 from jonathandturner/plugin_nu_path
Use nu:path for plugin loading
2019-11-02 14:12:51 +13:00
7801c03e2d plugin_nu_path 2019-11-02 13:36:21 +13:00
763bbe1c01 Updated Doc, error on bad input 2019-11-01 17:25:08 -05:00
0f67569cc3 Merge branch 'master' of github.com:nushell/nushell 2019-11-01 13:35:20 -07:00
0ea3527544 Update issue templates 2019-11-02 09:21:29 +13:00
20dfca073f Merge pull request #902 from jonathandturner/updated_echo
Make echo more flexible with data types
2019-11-02 08:34:20 +13:00
a3679f0f4e Make echo more flexible with data types 2019-11-02 08:15:53 +13:00
e75fdc2865 Merge pull request #897 from nushell/modernize_external_tokens
Modernize external tokens
2019-11-02 06:18:38 +13:00
4be88ff572 Modernize external parse and improve trace
The original purpose of this PR was to modernize the external parser to
use the new Shape system.

This commit does include some of that change, but a more important
aspect of this change is an improvement to the expansion trace.

A previous commit (6a7c00ea) added trace infrastructure to the syntax coloring
feature. This commit adds tracing to the expander.

The bulk of that work, in addition to the tree builder logic, was an
overhaul of the formatter traits to make them more general purpose, and
more structured.

Some highlights:

- `ToDebug` was split into two traits (`ToDebug` and `DebugFormat`)
  because implementations needed to become objects, but a convenience
  method on `ToDebug` didn't qualify
- `DebugFormat`'s `fmt_debug` method now takes a `DebugFormatter` rather
  than a standard formatter, and `DebugFormatter` has a new (but still
  limited) facility for structured formatting.
- Implementations of `ExpandSyntax` need to produce output that
  implements `DebugFormat`.

Unlike the highlighter changes, these changes are fairly focused in the
trace output, so these changes aren't behind a flag.
2019-11-01 08:45:45 -07:00
992789af26 Merge pull request #899 from loksonarius/document-tags-command
Add documentation for tags command
2019-11-01 21:25:55 +13:00
b822e13f12 Add documentation for tags command 2019-11-01 00:08:24 -04:00
cd058db046 Substring option for str plugin
Adds new substr function to str plugin with tests and documentation

The function takes a start/end location as a string in the form "##,##"; both sides of the comma are optional, and it
behaves like Rust's own index operator [##..##].
2019-10-31 19:49:17 -05:00
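A minimal sketch of the start/end parsing described in the commit above: either side of the comma may be omitted, and the result behaves like slicing with `[start..end]`. Names are illustrative, error handling is simplified, and ASCII input is assumed for the byte-index slicing.

```rust
// Sketch only: parse "start,end" (either side optional) and slice like [start..end].
// Assumes ASCII input so byte indices line up with character positions.
fn substring(s: &str, range: &str) -> Result<String, String> {
    let (start_str, end_str) = match range.split_once(',') {
        Some(parts) => parts,
        None => return Err("expected a range like \"0,5\", \"3,\" or \",5\"".to_string()),
    };

    let start: usize = if start_str.is_empty() {
        0
    } else {
        start_str.parse().map_err(|_| "bad start".to_string())?
    };
    let end: usize = if end_str.is_empty() {
        s.len()
    } else {
        end_str.parse().map_err(|_| "bad end".to_string())?
    };

    if start > end || end > s.len() {
        return Err("range out of bounds".to_string());
    }
    Ok(s[start..end].to_string())
}

fn main() {
    assert_eq!(substring("nushell", "0,2").unwrap(), "nu");
    assert_eq!(substring("nushell", "2,").unwrap(), "shell");
    assert_eq!(substring("nushell", ",2").unwrap(), "nu");
    println!("substring sketch ok");
}
```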
1b3143d3d4 Merge pull request #898 from andrasio/numbers-are-valid-column-names
get :: support fetching rows from tables using column paths named as numbers.
2019-10-31 14:43:41 -05:00
e31ed66610 get :: support fetching rows using numbers in column path. 2019-10-31 14:20:22 -05:00
7f18ff10b2 Merge pull request #892 from andrasio/column_path-fetch-table
Value operations and error handling separation.
2019-10-31 05:31:16 -05:00
65ae24fbf1 suite in place. 2019-10-31 04:42:18 -05:00
b54ce921dd Better error messages. 2019-10-31 04:36:08 -05:00
4935129c5a Merge branch 'master' of github.com:nushell/nushell 2019-10-30 18:40:34 -07:00
7614ce4b49 Allow handling errors with failure callbacks. 2019-10-30 17:46:40 -05:00
9d34ec9153 Merge pull request #891 from nushell/jonathandturner-patch-1
Move rustyline dep back to crates
2019-10-31 09:30:30 +13:00
fd92271884 Move rustyline dep back to crates 2019-10-31 09:14:47 +13:00
cea8fab307 "Integers" in column paths fetch a row from a table. 2019-10-30 05:55:26 -05:00
2d44b7d296 Update README.md 2019-10-30 20:22:01 +13:00
faccb0627f Merge pull request #890 from jonathandturner/append_prepend
Add prepend and append commands
2019-10-30 20:20:06 +13:00
a9cd6b4f7a Format files 2019-10-30 20:04:39 +13:00
81691e07c6 Add prepend and append commands 2019-10-30 19:54:06 +13:00
26f40dcabc Merge pull request #889 from jonathandturner/read_plugin
Add a simple read/parse plugin to better handle text data
2019-10-30 12:08:28 +13:00
3820fef801 Add a simple read/parse plugin to better handle text data 2019-10-30 11:33:36 +13:00
392ff286b2 This commit is ongoing work toward making working with data processing
in Nu a joy. Fundamentally, we embrace functional programming principles for
transforming the dataset from any format picked up by Nu. These table
processing "primitive" commands will build up and make pipelines
composable with data processing capabilities, allowing us to evaluate,
reduce, and map the tables, as far as even composing this declaratively.

In this regard, `split-by` expects some table with grouped data and we
can use it further in interesting ways (e.g. collecting labels for
visualizing the data in charts and/or suiting it to a particular chart
of interest).
2019-10-29 16:04:31 -05:00
b6824d8b88 Merge pull request #886 from notryanb/fetch-from-variable
WIP fetch command - support reading url from variable
2019-10-29 13:52:35 +13:00
e09160e80d add ability to create PathBuf from string to avoid type mismatch 2019-10-28 20:22:51 -04:00
8ba5388438 Merge pull request #885 from jonathandturner/update_path
Allow updating PATH in config
2019-10-29 11:38:53 +13:00
30b6eac03d Allow updating path in config 2019-10-29 10:22:31 +13:00
17ad07ce27 Merge pull request #884 from jonathandturner/nu_path_var
Add support for $nu:path
2019-10-29 08:23:02 +13:00
53911ebecd Add support for $nu:path 2019-10-29 07:40:34 +13:00
bc309705a9 Merge pull request #883 from jonathandturner/magic_env_vars
Add support for $nu:config and $nu:env
2019-10-29 07:22:44 +13:00
1de80aeac3 Add support for $nu:config and $nu:env 2019-10-29 06:51:08 +13:00
1eaaf368ee Merge pull request #879 from andrasio/tilde-pattern
Expand tilde in patterns.
2019-10-28 12:09:02 -05:00
36e40ebb85 Merge pull request #882 from jonathandturner/arg_descs
Add descriptions to arguments
2019-10-28 18:47:56 +13:00
3f600c5b82 Fix build issues 2019-10-28 18:30:14 +13:00
fbd980f8b0 Add descriptions to arguments 2019-10-28 18:15:35 +13:00
7d383421c6 Merge pull request #881 from jonathandturner/history
Always save history, add history command
2019-10-28 06:36:54 +13:00
aed386b3cd Always save history, add history command 2019-10-28 05:58:39 +13:00
540cc4016e Expand tilde in patterns. 2019-10-27 03:55:30 -05:00
1b3a09495d Merge pull request #874 from andrasio/move-out-tag
Move out tags when parsing and building tree nodes.
2019-10-25 22:09:39 -05:00
b7af34371b Merge pull request #871 from oknozor/master
Create docs for from-csv command
2019-10-25 21:36:38 -05:00
105762e1c3 Merge pull request #873 from oknozor/doc/from-toml
Create docs for from-toml command
2019-10-25 21:35:28 -05:00
2706ae076d Move out tags when parsing and building tree nodes. 2019-10-25 18:31:25 -05:00
07ceec3e0b Create docs for from-toml command
Partial fix of issue nushell#711
2019-10-25 20:47:00 +02:00
72fd1b047f Create docs for from-csv command
Partial fix of issue nushell#711
2019-10-25 20:40:51 +02:00
178b6d4d8d Merge pull request #870 from jonathandturner/rusty
rustyline git and add plus for filenames
2019-10-26 06:15:45 +13:00
d160e834eb rustyline git and add plus for filenames 2019-10-26 05:43:31 +13:00
3e8b9e7e8b Merge pull request #867 from nushell/bump
Bump version
2019-10-23 21:15:53 +13:00
c34ebfe739 Bump version
Bump version so we can tell the difference between what has been released and what's in master.
2019-10-23 20:57:04 +13:00
571b33a11c Merge pull request #857 from andrasio/group-by
Can group rows by given column name.
2019-10-23 18:25:52 +13:00
07b90f4b4b Merge pull request #866 from andrasio/color-external
color escaped external command.
2019-10-22 20:16:03 -05:00
f1630da2cc Suggest a column name in case one unknown column is supplied. 2019-10-22 20:10:42 -05:00
16751b5dee color escaped external command. 2019-10-22 19:29:45 -05:00
29ec9a436a Merge pull request #864 from nushell/coloring_in_tokens
Coloring in tokens
2019-10-22 16:38:16 -07:00
6a7c00eaef Finish the job of moving shapes into the stream
This commit should finish the `coloring_in_tokens` feature, which moves
the shape accumulator into the token stream. This allows rollbacks of
the token stream to also roll back any shapes that were added.

This commit also adds a much nicer syntax highlighter trace, which shows
all of the paths the highlighter took to arrive at a particular coloring
output. This change is fairly substantial, but really improves the
understandability of the flow. I intend to update the normal parser with
a similar tracing view.

In general, this change also fleshes out the concept of "atomic" token
stream operations.

A good next step would be to try to make the parser more
error-correcting, using the coloring infrastructure. A follow-up step
would involve merging the parser and highlighter shapes themselves.
2019-10-22 16:19:22 -07:00
82b24d9beb Merge pull request #863 from andrasio/cov-enter
Cover failure not found files cases.
2019-10-22 08:24:48 -05:00
a317072e4e Cover failure not found files cases. 2019-10-22 08:08:24 -05:00
5b701cd197 Merge pull request #862 from Detegr/master
Fix `enter` crashing on nonexistent file
2019-10-22 07:40:23 -05:00
8f035616a0 Fix enter crashing on nonexistent file
Fixes #839
2019-10-22 15:22:47 +03:00
81f8ba9e4c Merge pull request #861 from andrasio/from_xml-cov
baseline coverage for xml parsing.
2019-10-22 04:10:36 -05:00
380ab19910 Merge pull request #858 from Charles-Schleich/master
added Docs for sort-by command
2019-10-22 03:49:18 -05:00
4329629ee9 baseline coverage for xml parsing. 2019-10-22 03:47:59 -05:00
39fde52d8e added Docs for sort-by command 2019-10-21 17:59:20 +02:00
0611f56776 Can group cells by given column name. 2019-10-20 18:42:07 -05:00
8923e91e39 Merge pull request #856 from andrasio/value-improvements
Improvements to Value mutable operations.
2019-10-21 06:57:36 +13:00
d6e6811bb9 Merge pull request #854 from jdvr/master
#194 Connect `rm` command to platform's recycle bin
2019-10-21 05:16:48 +13:00
f24bc5c826 Improvements to Value mutable operations. 2019-10-20 06:55:56 -05:00
c209d0d487 194 Fixed file format 2019-10-19 22:52:39 +02:00
74dddc880d "#194 Added trash switch checked before normal rm command action" 2019-10-19 22:31:18 +02:00
f3c41bbdf1 Merge pull request #851 from t-hart/pr/remove-unwrap-unit
Deletes impl From<&str> for Unit
2019-10-20 07:29:07 +13:00
c45ddc8f22 Merge pull request #848 from andrasio/column_path-inc
Inc plugin increments appropriately given a table containing a version.
2019-10-20 07:27:47 +13:00
84a98995bf Merge pull request #845 from t-hart/from-ssv/headers-as-markers
`from-ssv` logic updated
2019-10-20 07:26:04 +13:00
ed83449514 Merge pull request #808 from notryanb/plugin-average
Average Plugin
2019-10-20 07:23:22 +13:00
9eda573a43 filter out the files that have the same size on multiple operating systems 2019-10-18 20:43:37 -04:00
4f91d2512a add a test to calculate average of bytes 2019-10-18 20:43:37 -04:00
2f5eeab567 fix typos and incorrect commands 2019-10-18 20:43:37 -04:00
f9fbb0eb3c add docs for average and give more specific examples for sum 2019-10-18 20:43:37 -04:00
43fbf4345d remove comment and add test for averaging integers 2019-10-18 20:43:37 -04:00
8262c2dd33 add support for average on byte columns and fmt the code 2019-10-18 20:43:37 -04:00
0e86430ea3 get very basic average working 2019-10-18 20:43:37 -04:00
fc1301c92d #194 Added trash crate and send files to the trash using a flag 2019-10-19 00:41:24 +02:00
e913e26c01 Deletes impl From<&str>
The code still compiles, so this doesn't seem to break anything. That also means
it's not critical to fix it, but having dead code around isn't great either.
2019-10-18 20:02:24 +02:00
5ce4b12cc1 Inc plugin increments appropriately given a table containing a version in it. 2019-10-18 07:30:36 -05:00
94429d781f Merge pull request #847 from Detegr/master
Fix size comparison in 'where size'
2019-10-18 09:27:02 +13:00
321629a693 Fix size comparison in 'where size'
Fixes #840
2019-10-17 22:57:02 +03:00
f21405399c Formats file. 2019-10-17 09:56:06 +02:00
305ca11eb5 Changes the parsing to use the full value of the final column.
Previously it would split the last column on the first separator value found
between the start of the column and the end of the row. Changing this to use
everything from the start of the column to the end of the string makes it behave
more similarly to the other columns, making it less surprising.
2019-10-17 09:40:00 +02:00
9b1ff9b566 Updates the table creation logic.
The table parsing/creation logic has changed from treating every line the same
to processing each line in context of the column header's placement. Previously,
lines on separate rows would go towards the same column as long as they were the
same index based on separator alone. Now, each item's index is based on vertical
alignment to the column header.

This may seem brittle, but it solves the problem of some tables operating with
empty cells that would cause remaining values to be paired with the wrong
column.

Based on kubernetes output (get pods, events), the new method has shown to have
much greater success rates for parsing.
2019-10-17 00:25:43 +02:00
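A hedged, simplified sketch of the alignment idea described in the commit above: record where each column header starts, then assign every value on later lines to the header whose horizontal span it falls under, so empty cells don't shift values into the wrong column. The parsing below is an illustration, not the actual from-ssv implementation.

```rust
// Sketch only: align values to headers by horizontal position, not split index.
fn parse_aligned(text: &str) -> Vec<Vec<(String, String)>> {
    let mut lines = text.lines();
    let header_line = match lines.next() {
        Some(l) => l,
        None => return Vec::new(),
    };

    // Record where each header starts so later rows can be matched by position.
    let mut headers: Vec<(usize, String)> = Vec::new();
    let mut offset = 0;
    for chunk in header_line.split_whitespace() {
        let start = header_line[offset..].find(chunk).unwrap() + offset;
        headers.push((start, chunk.to_string()));
        offset = start + chunk.len();
    }

    lines
        .map(|line| {
            headers
                .iter()
                .enumerate()
                .map(|(i, (start, name))| {
                    // A cell spans from this header's start to the next header's start.
                    let end = headers.get(i + 1).map(|(s, _)| *s).unwrap_or(line.len());
                    let end = end.min(line.len());
                    let start = (*start).min(line.len());
                    (name.clone(), line[start..end].trim().to_string())
                })
                .collect()
        })
        .collect()
}

fn main() {
    // The READY cell is empty on the second row but Pending still lands under STATUS.
    let table = "NAME   READY   STATUS\npod-a  1/1     Running\npod-b          Pending";
    println!("{:#?}", parse_aligned(table));
}
```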
a0ed6ea3c8 Adds new tests and updates old ones.
New tests are added to test for additional cases that might be trickier to
handle with the new logic.

Old tests are updated where their expectations are no longer expected to hold true.
For instance: previously, lines would be treated separately, allowing any index
offset between columns on different rows, as long as they had the same row index
as decided by a separator. When this is no longer the case, some things need to
be adjusted.
2019-10-17 00:17:58 +02:00
4a6529973e Merge pull request #844 from nushell/unknown-value
Rename <unknown> to <value>
2019-10-17 08:24:15 +13:00
9a02fac0e5 Rename <unknown> to <value> 2019-10-17 07:28:49 +13:00
2c6a9e9e48 Merge pull request #838 from jonathandturner/master
Update cargo.lock
2019-10-16 15:18:09 +13:00
d91b735442 Update cargo.lock 2019-10-16 15:09:47 +13:00
7d3025176f Merge pull request #835 from t-hart/from-ssv/variable-separator
`from-ssv`: user-defined number of spaces to split on
2019-10-16 11:04:28 +13:00
74111dddb7 Merge pull request #836 from sdfnz/master
Added documentation for the sum command
2019-10-15 16:54:21 -05:00
74b0e4e541 Adds more info to the usage string. 2019-10-15 23:20:06 +02:00
587bb13be5 Updates readme with new name of flag. 2019-10-15 23:19:16 +02:00
79d3237bf5 Merge pull request #837 from nushell/bump-dep
Bump dep for language-reporting
2019-10-16 09:13:48 +13:00
f8d44e732b Updates default minimum spaces to allow single spaces by default. 2019-10-15 22:05:47 +02:00
0d2044e72e Changes flag to minimum-spaces. 2019-10-15 22:05:32 +02:00
1bb301aafa Bump dep for language-reporting 2019-10-16 08:54:46 +13:00
5635b8378d Added documentation for the sum command 2019-10-15 14:23:32 -05:00
294c2c600d Update the usage string to match the readme. 2019-10-15 21:10:15 +02:00
b4c639a5d9 Updates description of command in readme. 2019-10-15 21:01:14 +02:00
e7b37bee08 Adds filter test for named param. 2019-10-15 20:58:46 +02:00
d32e97b812 Implements variable space separator length, version 1. 2019-10-15 20:48:06 +02:00
81affaa584 Adds tests for allowed-spaces option. 2019-10-15 19:10:38 +02:00
f2d54f201d Merge pull request #833 from andrasio/recon
Recon
2019-10-16 05:00:57 +13:00
0373006710 Formatting. 2019-10-15 05:42:24 -05:00
ec2e35ad81 'last' gets last row if no amount desired given. 2019-10-15 05:41:34 -05:00
821ee5e726 count command introduced. 2019-10-15 05:19:06 -05:00
5ed1ed54a6 Move off 'sum' to internal command 'count' for tests. 2019-10-15 05:16:47 -05:00
96ef478fbc Better error messages. 2019-10-15 04:18:35 -05:00
3f60c9d416 'first' gets first row if no amount desired given. 2019-10-15 04:17:55 -05:00
ed39377840 Merge pull request #832 from nushell/bump_version
Bump the version ahead of release
2019-10-15 18:59:31 +13:00
e250a3f213 Update README.md 2019-10-15 18:52:15 +13:00
3a99456371 Bump the version ahead of release 2019-10-15 18:41:05 +13:00
bd6d8189f8 Merge pull request #830 from t-hart/pull-req/from-master
[DRAFT] Adds `from-ssv` command.
2019-10-15 18:28:43 +13:00
452b5c58e8 Update README.md 2019-10-15 15:38:22 +13:00
d1ebc55ed7 Merge pull request #831 from nushell/coloring_in_tokens
Start moving coloring into the token stream
2019-10-14 18:31:21 -07:00
f20f3f56c7 Start moving coloring into the token stream
The benefit of this is that coloring can be made atomic alongside token
stream forwarding.

I put the feature behind a flag so I can continue to iterate on it
without possibly regressing existing functionality. It's a lot of places
where the flags have to go, but I expect it to be a short-lived flag,
and the flags are fully contained in the parser.
2019-10-14 16:11:00 -07:00
65008bb912 Deletes nix-specific configuration. 2019-10-15 00:25:55 +02:00
d21389d549 Removes unwrap.
A rogue unwrap had been left in the code, but has now been replaced by an option.
2019-10-15 00:24:32 +02:00
f858a127ad Merge pull request #829 from thegedge/fix-multiple-values-for-external-command
Fix bug with multiple input objects to an external command.
2019-10-15 11:22:46 +13:00
de12393eaf Updates shell.nix. 2019-10-14 23:25:52 +02:00
b2c53a0967 Updates commands to work after tag is no longer copy. 2019-10-14 23:14:45 +02:00
65546646a7 Pull in upstream changes. 2019-10-14 23:05:52 +02:00
ee8cd671cb Fix bug with multiple input objects to an external command.
Previously, we would build a command that looked something like this:

  <ex_cmd> "$it" "&&" "<ex_cmd>" "$it"

So that the "&&" and "<ex_cmd>" would also be arguments to the command,
instead of a chained command. This commit builds up a command string
that can be passed to an external shell.
2019-10-14 16:47:12 -04:00
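A hedged sketch of the fix described in the commit above: rather than passing `&&` and the repeated command name as literal arguments, build a single command string and hand it to an external shell so the chaining is interpreted correctly. Delegating to `sh -c` here is an assumption for illustration; the actual nushell code differs.

```rust
use std::process::Command;

// Sketch: run an external command once per input value by joining the
// invocations into one string and letting a shell interpret `&&`,
// instead of passing "&&" as a literal argument.
fn run_for_each(ex_cmd: &str, inputs: &[&str]) -> std::io::Result<()> {
    let joined = inputs
        .iter()
        .map(|it| format!("{} {}", ex_cmd, it))
        .collect::<Vec<_>>()
        .join(" && ");

    // Assumption for illustration: delegate chaining to /bin/sh.
    Command::new("sh").arg("-c").arg(&joined).status()?;
    Ok(())
}

fn main() -> std::io::Result<()> {
    run_for_each("echo", &["first", "second"])
}
```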
d4df70c53f Merge branch 'refactor/add-tests' 2019-10-14 22:03:47 +02:00
43ead45db6 Removes rust_src_path and ssl_cert_file vars. 2019-10-14 22:03:17 +02:00
22d2360c4b Adds conversion test for leading whitespace.
Refactors string parsing into a separate function.
2019-10-14 22:00:25 +02:00
d38b8cf851 Merge pull request #827 from andrasio/external-color
Color escaped externals.
2019-10-15 08:28:34 +13:00
43cf52275b Color escaped externals. 2019-10-14 14:09:44 -05:00
104b7824f5 Updates return types. 2019-10-14 16:34:06 +02:00
a9293f62a8 Adds some initial ideas for refactoring. 2019-10-14 09:43:54 +02:00
0b210ce5bf Filters out empty lines before table creation. 2019-10-14 07:48:19 +02:00
38225d0dba Removes extra newline 2019-10-14 07:48:10 +02:00
473b6f727c Merge pull request #822 from jonathandturner/fix_707
Fix confusing unnamed column and crash
2019-10-14 18:46:37 +13:00
63039666b0 Changes from_ssv_to_string_value to return an Option. 2019-10-14 07:37:34 +02:00
a4a1588fbc Fix confusing unnamed column and crash 2019-10-14 18:28:54 +13:00
4eafb22d5b Merge pull request #821 from jonathandturner/fix_809
Don't panic if no suggestions are found
2019-10-14 18:17:16 +13:00
aa09967173 Merge pull request #820 from jonathandturner/fix_815
Fixes crash if external is not found
2019-10-14 18:11:30 +13:00
7c40aed738 Don't panic if no suggestions are found 2019-10-14 18:00:10 +13:00
6c0bf6e0ab Fix panic if external is not found 2019-10-14 17:48:27 +13:00
20e891db6e Move variable assignment to clarify use. 2019-10-13 23:10:54 +02:00
38b5979881 Make usage string clearer. 2019-10-13 23:09:24 +02:00
8422d40e2c Add from-ssv to readme. 2019-10-13 23:09:10 +02:00
de1c4e6c88 Implements from-ssv 2019-10-13 22:50:45 +02:00
648d4865b1 Adds unimplemented module, tests. 2019-10-13 21:15:30 +02:00
7d4fec4db3 Merge pull request #817 from thegedge/bump-heim
Bump heim in Cargo.toml to match Cargo.lock
2019-10-14 07:41:38 +13:00
0f7e73646f Bump heim in Cargo.toml to match Cargo.lock 2019-10-13 14:21:44 -04:00
bd6ca75032 Merge pull request #814 from thegedge/fix-ls-bug-with-broken-symlinks
Ignore errors in `ls`.
2019-10-14 05:52:02 +13:00
341cc1ea63 Ignore errors in ls.
`std::fs::metadata` will attempt to follow symlinks, which results in a
"No such file or directory" error if the path pointed to by the symlink
does not exist. This shouldn't prevent `ls` from succeeding, so we
ignore errors.

Also, switching to use of `symlink_metadata` means we get stat info on
the symlink itself, not what it points to. This means `ls` will now
include broken symlinks in its listing.
2019-10-13 12:26:31 -04:00
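A minimal sketch of the behaviour described in the commit above: use `std::fs::symlink_metadata` so broken symlinks still appear in the listing, and skip entries whose metadata can't be read rather than failing the whole listing. This is an illustration, not the actual `ls` implementation.

```rust
use std::fs;
use std::path::Path;

// Sketch: list a directory, keeping broken symlinks and ignoring
// per-entry metadata errors instead of aborting the listing.
fn list_dir(dir: &Path) -> std::io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let entry = match entry {
            Ok(e) => e,
            Err(_) => continue, // ignore unreadable entries
        };
        // symlink_metadata stats the link itself, so dangling links still show up
        match fs::symlink_metadata(entry.path()) {
            Ok(meta) => println!("{:?}\t{} bytes", entry.path(), meta.len()),
            Err(_) => continue, // still don't fail the listing
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    list_dir(Path::new("."))
}
```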
2716bb020f Fix #811 (#813) 2019-10-13 17:53:58 +13:00
193b00764b Stream support (#812)
* Moves off of draining between filters. Instead, the sink will pull on the stream, and will drain element-wise. This moves the whole stream to being lazy.
* Adds ctrl-c support and connects it into some of the key points where we pull on the stream. If a ctrl-c is detected, we immediately halt pulling on the stream and return to the prompt.
* Moves away from having a SourceMap where anchor locations are stored. Now AnchorLocation is kept directly in the Tag.
* To make this possible, split tag and span. Span is largely used in the parser and is copyable. Tag is now no longer copyable.
2019-10-13 17:12:43 +13:00
8ca678440a Merge pull request #810 from nushell/feature-flags
Feature flagging infrastructure
2019-10-11 17:47:25 -07:00
439889dcef Feature flagging infrastructure
This commit adds the ability to work on features behind a feature flag
that won't be included in normal builds of nu.

These features are not exposed as Cargo features, as they reflect
incomplete features that are not yet stable.

To create a feature, add it to `features.toml`:

```toml
[hintsv1]

description = "Adding hints based on error states in the highlighter"
enabled = false
```

Each feature in `features.toml` becomes a feature flag accessible to `cfg`:

```rs
println!("hintsv1 is enabled");
```

By default, features are enabled based on the value of the `enabled` field.

You can also enable a feature from the command line via the
`NUSHELL_ENABLE_FLAGS` environment variable:

```sh
$ NUSHELL_ENABLE_FLAGS=hintsv1 cargo run
```

You can enable all flags via `NUSHELL_ENABLE_ALL_FLAGS`.

This commit also updates the CI setup to run the build with all flags off and
with all flags on. It also extracts the linting test into its own
parallelizable test, which means it doesn't need to run together with every
other test anymore.

When working on a feature, you should also add tests behind the same flag. A
commit is mergable if all tests pass with and without the flag, allowing
incomplete commits to land on master as long as the incomplete code builds and
passes tests.
2019-10-11 17:19:44 -07:00
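A hedged sketch of how such a flag might be consumed in code, assuming the build turns each `features.toml` entry into a `--cfg` flag of the same name; the attribute and function names below are illustrative, not the commit's actual code.

```rust
// Illustrative only: code gated behind a cfg flag named after the feature.
#[cfg(hintsv1)]
fn show_hints() {
    println!("hintsv1 is enabled");
}

#[cfg(not(hintsv1))]
fn show_hints() {
    // feature disabled: do nothing
}

fn main() {
    show_hints();
}
```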
5ec6bac7d9 Removes redundant parens. 2019-10-11 21:39:11 +02:00
af2ec60980 Shell.nix cleanup. 2019-10-11 21:13:00 +02:00
f0ca0312f3 Adds racer, formats shell.nix 2019-10-11 19:06:24 +02:00
3317b137e5 Merge pull request #728 from nushell/better-pseudo-blocks
[DON'T MERGE] Overhaul the expansion system
2019-10-11 17:28:33 +13:00
c2c10e2bc0 Overhaul the coloring system
This commit replaces the previous naive coloring system with a coloring
system that is more aligned with the parser.

The main benefit of this change is that it allows us to use parsing
rules to decide how to color tokens.

For example, consider the following syntax:

```
$ ps | where cpu > 10
```

Ideally, we could color `cpu` like a column name and not a string,
because `cpu > 10` is a shorthand block syntax that expands to
`{ $it.cpu > 10 }`.

The way that we know that it's a shorthand block is that the `where`
command declares that its first parameter is a `SyntaxShape::Block`,
which allows the shorthand block form.

In order to accomplish this, we need to color the tokens in a way that
corresponds to their expanded semantics, which means that high-fidelity
coloring requires expansion.

This commit adds a `ColorSyntax` trait that corresponds to the
`ExpandExpression` trait. The semantics are fairly similar, with a few
differences.

First `ExpandExpression` consumes N tokens and returns a single
`hir::Expression`. `ColorSyntax` consumes N tokens and writes M
`FlatShape` tokens to the output.

Concretely, for syntax like `[1 2 3]`

- `ExpandExpression` takes a single token node and produces a single
  `hir::Expression`
- `ColorSyntax` takes the same token node and emits 7 `FlatShape`s
  (open delimiter, int, whitespace, int, whitespace, int, close
  delimiter)

Second, `ColorSyntax` is more willing to plow through failures than
`ExpandExpression`.

In particular, consider syntax like

```
$ ps | where cpu >
```

In this case

- `ExpandExpression` will see that the `where` command is expecting a
  block, see that it's not a literal block and try to parse it as a
  shorthand block. It will successfully find a member followed by an
  infix operator, but not a following expression. That means that the
  entire pipeline part fails to parse and is a syntax error.
- `ColorSyntax` will also try to parse it as a shorthand block and
  ultimately fail, but it will fall back to "backoff coloring mode",
  which parses any unidentified tokens in an infallible, simple way. In
  this case, `cpu` will color as a string and `>` will color as an
  operator.

Finally, it's very important that coloring a pipeline infallibly colors
the entire string, doesn't fail, and doesn't get stuck in an infinite
loop.

In order to accomplish this, this PR separates `ColorSyntax`, which is
infallible, from `FallibleColorSyntax`, which might fail. This allows the
type system to let us know if our coloring rules bottom out at an
infallible rule.

It's not perfect: it's still possible for the coloring process to get
stuck or consume tokens non-atomically. I intend to reduce the
opportunity for those problems in a future commit. In the meantime, the
current system catches a number of mistakes (like trying to use a
fallible coloring rule in a loop without thinking about the possibility
that it will never terminate).
2019-10-10 19:30:04 -07:00
d2eb6f6646 Adds .envrc and shell.nix 2019-10-10 21:23:12 +02:00
1ad9d6f199 Overhaul the expansion system
The main thrust of this (very large) commit is an overhaul of the
expansion system.

The parsing pipeline is:

- Lightly parse the source file for atoms, basic delimiters and pipeline
  structure into a token tree
- Expand the token tree into a HIR (high-level intermediate
  representation) based upon the baseline syntax rules for expressions
  and the syntactic shape of commands.

Somewhat non-traditionally, nu doesn't have an AST at all. It goes
directly from the token tree, which doesn't represent many important
distinctions (like the difference between `hello` and `5KB`) directly
into a high-level representation that doesn't have a direct
correspondence to the source code.

At a high level, nu commands work like macros, in the sense that the
syntactic shape of the invocation of a command depends on the
definition of a command.

However, commands do not have the ability to perform unrestricted
expansions of the token tree. Instead, they describe their arguments in
terms of syntactic shapes, and the expander expands the token tree into
HIR based upon that definition.

For example, the `where` command says that it takes a block as its first
required argument, and the description of the block syntactic shape
expands the syntax `cpu > 10` into HIR that represents
`{ $it.cpu > 10 }`.

This commit overhauls that system so that the syntactic shapes are
described in terms of a few new traits (`ExpandSyntax` and
`ExpandExpression` are the primary ones) that are more composable than
the previous system.

The first big win of this new system is the addition of the `ColumnPath`
shape, which looks like `cpu."max ghz"` or `package.version`.
Previously, while a variable path could look like `$it.cpu."max ghz"`,
the tail of a variable path could not be easily reused in other
contexts. Now, that tail is its own syntactic shape, and it can be used
as part of a command's signature.

This cleans up commands like `inc`, `add` and `edit` as well as
shorthand blocks, which can now look like `| where cpu."max ghz" > 10`
2019-10-10 08:27:51 -07:00
f8d337ad29 chore: omit the entire git.rs file when starship is used 2019-10-09 08:42:46 +01:00
47150efc14 chore: switch starship dependency back to the main one 2019-10-09 08:36:55 +01:00
3e14de158b fix(ci): can't push to quay.io (#1)
* ci(github): lowercase ${{ github.actor }}

* ci(github): fix robot username

* fix(ci): fix tag name on suffixed version
2019-10-09 09:58:49 +07:00
e18892000a Merge pull request #802 from twe4ked/improve-cd-docs
Improve cd docs
2019-10-08 20:20:28 -05:00
c8671c719f fix: addressed unused imports and dead code 2019-10-08 21:50:28 +01:00
0412c3a2f8 fix: remove the additional characters from highlighter
This resolves a small integration issue that would make custom prompts problematic (if they are implemented). The approach was to use the highlighter implementation in Helper to insert colour codes into the prompt; however, it heavily relies on the prompt being in a specific format, ending with a `> ` sequence. However, this should really be the job of the prompt itself, not the presentation layer.

For now, I've simply stripped off the additional `> ` characters and passed in just the prompt itself without slicing off the last two characters. I moved the `\x1b[m` control sequence to the prompt creation in `cli.rs` as this feels like the more logical home for controlling what the prompt looks like. I can think of better ways to do this in future but this should be a fine solution for now.

In future it would probably make sense to completely separate prompts (be it, internal or external) from this code so it can be configured as an isolated piece of code.
2019-10-08 21:39:58 +01:00
ef3e8eb778 fix: update Cargo.lock with correct hash for starship fork 2019-10-08 21:16:52 +01:00
fb8cfeb70d feat: starship prompt
Kind of touches on #356 by integrating the Starship prompt directly into the shell.

Not finished yet and has surfaced a potential bug in rustyline anyway. It depends on https://github.com/starship/starship/pull/509 being merged so the Starship prompt can be used as a library.

I could have tackled #356 completely and implemented a full custom prompt feature but I felt this was a simpler approach given that Starship is both written in Rust so shelling out isn't necessary and it already has a bunch of useful features built in.

However, I would understand if it would be preferable to just scrap integrating Starship directly and instead implement a custom prompt system which would facilitate simply shelling out to Starship.
2019-10-08 16:25:12 +01:00
4d70255696 Add documentation for cd - 2019-10-08 18:32:42 +11:00
77c34acb03 Whitespace 2019-10-08 18:32:42 +11:00
e72bc8ea8b Remove unneeded - 2019-10-08 18:32:39 +11:00
a882e640e4 Merge pull request #793 from chhetripradeep/pchhetri/enter
Add documentation for the enter command
2019-10-08 06:02:49 +13:00
c09d866a77 Add documentation for the enter command 2019-10-07 23:21:58 +08:00
4467e59122 Merge pull request #792 from chhetripradeep/pchhetri/open
Add documentation for the open command
2019-10-07 11:17:28 +11:00
9c096d320a Merge pull request #797 from chhetripradeep/pchhetri/fetch
Add documentation for the fetch command
2019-10-07 11:16:36 +11:00
93ae5043cc ci(github): change REGISTRY to quay.io 2019-10-07 03:53:07 +07:00
b134394319 ci(github): refactor docker related run scripts 2019-10-07 03:14:51 +07:00
b163775112 ci(github): install cross from release page
Instead of compiling `cross` via `cargo install`, downloading the binary executable from the release page will speed up the CI.
2019-10-07 02:53:23 +07:00
8bd035f51d ci(github): renew trigger definition
There is an update in workflow syntax docs
https://help.github.com/en/articles/workflow-syntax-for-github-actions#filter-pattern-cheat-sheet
2019-10-07 02:51:35 +07:00
9f15017032 Add documentation for the fetch command 2019-10-07 02:17:57 +08:00
81fec11f88 Add documentation for the open command 2019-10-07 02:08:20 +08:00
8a6a688131 Merge pull request #795 from chhetripradeep/pchhetri/inc
Add documentation for the inc command
2019-10-07 04:35:08 +11:00
77a4de31fa Merge pull request #794 from chhetripradeep/pchhetri/sys
Add documentation for the sys command
2019-10-07 04:33:51 +11:00
09e88d127e Merge pull request #791 from chhetripradeep/pchhetri/trim
Add documentation for the trim command
2019-10-07 04:30:52 +11:00
7ff5734d5d Add documentation for the inc command 2019-10-06 23:30:52 +08:00
1d19595996 Add documentation for the sys command 2019-10-06 23:20:48 +08:00
7d115da782 Add documentation for the trim command 2019-10-06 22:35:38 +08:00
b066775630 Merge pull request #789 from cristicismas/patch-1
Update cd.md to look better
2019-10-04 16:24:42 -05:00
8bb6bcb6eb Merge pull request #790 from mfarberbrodsky/add-nth-docs
Add documentation for nth command
2019-10-04 16:24:06 -05:00
20031861b9 Add documentation for nth command 2019-10-04 17:37:11 +03:00
eb297d3b8f Update cd.md to look better 2019-10-04 15:10:46 +03:00
8faa0126eb Merge pull request #784 from coolshaurya/to-dash-sth-docs
Added docs for most of the to-sth commands
2019-10-03 21:47:00 -05:00
6aec03708f Fix minor typo 2019-10-04 06:44:45 +05:30
2f7b1e4282 Added improvements suggested by @andrasio
Added `open file.sth | to-sth` type examples
Also did a format conversion example with `open jonathon.xml | to-json` in to-json.md
2019-10-04 06:40:16 +05:30
7492131142 Merge pull request #770 from rnxpyke/master
add regex match plugin
2019-10-03 14:20:41 -05:00
3c6ee63e59 Merge pull request #777 from JonnyWalker81/fix-get-panic
Attempt at fixing `get` command panic.
2019-10-03 14:02:51 -05:00
45ad18f654 Merge pull request #785 from Charles-Schleich/master
Created Docs for env command
2019-10-03 14:00:51 -05:00
01829f04d5 Merge pull request #783 from notryanb/document-last
add documentation for the last command
2019-10-03 13:59:41 -05:00
cc1c471877 Merge pull request #779 from pema99/lines-doc
Add documentation for lines
2019-10-03 13:58:30 -05:00
de14f9fce8 Merge pull request #781 from coolshaurya/add-command-docs
Create docs for add command
2019-10-03 13:38:11 -05:00
6c3ed1dbc2 Merge pull request #782 from coolshaurya/docs-edit-command
Create docs for edit command
2019-10-03 13:37:49 -05:00
cf0fa3141a Created Docs for env command 2019-10-03 20:13:22 +02:00
539e232f3c Added docs for most of the to-sth commands
Partial fix of issue #711
Docs for the following commands were added -
to-csv
to-json
to-toml
to-tsv
to-url
to-yaml

Docs for to-db, to-bson, to-sqlite have not been added as I don't recognize or understand those formats.
2019-10-03 19:07:48 +05:30
9ed889ccbb fix grammar 2019-10-03 08:18:51 -04:00
872e26b524 add documentation for the last command 2019-10-03 08:14:59 -04:00
5bfff0c39b Create docs for edit command
Partial fix of issue #711
2019-10-03 16:54:28 +05:30
0505a9d6f7 Create docs for add command
Partial fix of issue #711
2019-10-03 16:27:04 +05:30
9181a046ec use correct argument for error message 2019-10-03 08:21:24 +02:00
1b0eaac470 Add documentation for lines 2019-10-03 06:09:01 +02:00
e54cd98a9c Put code into None case of last match. 2019-10-02 20:41:53 -07:00
f3eb4fb24e Attempt at fixing get command panic.
If possible matches are not found, then check whether the passed-in `obj`
parameter is a `string` or a `path`; if so, return it. I am not
sure this is the right fix, but I figured I would make an attempt and
get a conversation started about it.
2019-10-02 20:16:27 -07:00
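A hypothetical Rust sketch of the fallback described above, using a stand-in value type rather than nu's actual internal representation:

```rust
// Stand-in value type; nu's real value types are richer than this.
#[derive(Debug, Clone)]
enum Value {
    String(String),
    Path(std::path::PathBuf),
    Table(Vec<Value>),
}

// If no column matched, fall back to returning strings and paths as-is
// instead of panicking.
fn get_member(obj: &Value, matched: Option<Value>) -> Option<Value> {
    match matched {
        Some(v) => Some(v),
        None => match obj {
            Value::String(_) | Value::Path(_) => Some(obj.clone()),
            _ => None,
        },
    }
}

fn main() {
    let obj = Value::String("hello".into());
    println!("{:?}", get_member(&obj, None)); // Some(String("hello"))
}
```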
04854d5d99 Merge pull request #776 from gilesv/where-command
Create where.md
2019-10-03 15:38:59 +13:00
124a814f4d Merge pull request #775 from JonnyWalker81/vi-textview-scroll
Added Vi support for scrolling in the textview command.
2019-10-03 15:19:11 +13:00
2e1670fcb8 Add documentation for where command 2019-10-02 22:49:05 -03:00
7d2747ea9a Added Vi support for scrolling in the textview command. 2019-10-02 18:45:23 -07:00
36f2b09cad run rustfmt on match plugin 2019-10-02 22:41:52 +02:00
be51aad9ad remove unused imports on match plugin 2019-10-02 22:24:37 +02:00
97695b74dd Merge pull request #771 from notryanb/document-first
add documentation file for first command
2019-10-03 09:09:33 +13:00
9d84e47214 add documentation file for first command 2019-10-02 15:49:44 -04:00
9fb9adb6b4 add regex match plugin 2019-10-02 20:56:43 +02:00
91e6d31dc6 Merge pull request #753 from JesterOrNot/master
Style README
2019-10-03 06:28:52 +13:00
9a1c537854 Merge pull request #764 from coolshaurya/command-version-docs
Create docs for version command
2019-10-03 06:28:04 +13:00
2476c8d579 Merge pull request #762 from coolshaurya/reverse-command-docs
Create docs for reverse command
2019-10-03 06:26:06 +13:00
27e59ea49c Merge pull request #760 from coolshaurya/shells-command-docs
Created docs for shells command
2019-10-03 06:24:35 +13:00
27882efd6b Merge pull request #756 from coolshaurya/exit-command-docs
Create exit command documentation
2019-10-03 06:21:26 +13:00
5e98751c66 Merge pull request #767 from jerodsanto/patch-1
Add Changelog episode badge to README
2019-10-03 06:20:26 +13:00
8dec2da564 Merge pull request #768 from nushell/try_fix_pipelines
Trying to fix Azure Pipelines
2019-10-03 05:43:47 +13:00
27272d3754 Update azure-pipelines.yml 2019-10-03 05:27:03 +13:00
f689434bbc Update azure-pipelines.yml 2019-10-03 05:06:28 +13:00
03728c1868 Update azure-pipelines.yml 2019-10-03 04:55:29 +13:00
ce771903e5 Trying to fix Azure Pipelines 2019-10-03 04:46:49 +13:00
c78bce2af4 Add Changelog episode badge to README 2019-10-02 09:34:08 -05:00
0b3c9b760e Create docs for version command
Partial fix of #711
2019-10-02 15:47:56 +05:30
7e7eba8f4d Create docs for reverse command
Partial fix of issue #711
this command could be described better but I don't know how
2019-10-02 15:03:28 +05:30
a77c222db0 Created docs for shells command
Partial fix of issue #711
The second example is taken from the book, specifically the section https://book.nushell.sh/en/shells_in_shells#going-beyond-directories
2019-10-02 13:37:43 +05:30
149961e8f1 Merge pull request #755 from coolshaurya/cd-command-docs
Make docs for the cd command
2019-10-01 21:27:08 -05:00
caf3015e66 Improved exit command docs 2019-10-02 06:55:30 +05:30
459bfdd783 Merge pull request #757 from yahsinhuangtw/add-help-doc
Add documentation for help
2019-10-02 05:51:42 +13:00
a2f1cca85c Merge pull request #752 from nalshihabi/add-echo-doc
Add echo command documentation
2019-10-02 04:32:15 +13:00
c09b4b045f Merge pull request #746 from marcelocg/master
Document date command
2019-10-02 04:23:54 +13:00
94d81445eb Add document for help 2019-10-01 23:20:58 +08:00
94744c626c Fix typo in date.cmd 2019-10-01 11:21:56 -03:00
e62a2509ae Create exit command documentation
Partial fix of issue #711
Some parts have been copied from the fish documentation
2019-10-01 19:40:16 +05:30
417ac4b69e Improve cd docs
Used format in PR#746
Added another example
Removed unnecessary text
2019-10-01 19:23:10 +05:30
b7bf31df99 Make docs for the cd command ; partially solves #711 2019-10-01 18:45:38 +05:30
a7a0f48286 Update README.md 2019-10-01 06:46:04 -05:00
1bf0f7110a Update README.md 2019-10-01 06:44:06 -05:00
ad53eb4e76 Update README.md 2019-10-01 06:39:18 -05:00
c81d20a069 Update README.md 2019-10-01 06:37:27 -05:00
08df76486d Update README.md 2019-10-01 06:00:00 -05:00
fe3753ea68 more style changes 2019-10-01 05:58:56 -05:00
abf671da1b misc style changes 2019-10-01 05:54:59 -05:00
91b4d27931 add capitalization in readme
discord and twitter should be uppercase
2019-10-01 05:32:21 -05:00
310897897e Changed the location of the open in gitpod button
In my opinion, this looks a lot better
2019-10-01 05:28:54 -05:00
8ba917b704 Add echo command documentation 2019-10-01 06:14:56 -04:00
219da892b2 Document date command 2019-09-30 22:30:17 -03:00
bbb4cc7d5f Merge pull request #745 from iggy14750/master
Adds a word to README for readability
2019-10-01 12:52:26 +13:00
9d04a7cc40 Adds a word to README for readability 2019-09-30 18:51:35 -04:00
70d0ae7b42 Merge pull request #744 from DrSensor/repology
Add repology.org badge to Packaging status
2019-10-01 10:19:35 +13:00
ce9e4a61e7 Add repology.org badge to Packaging status 2019-10-01 03:39:16 +07:00
af8e2f6961 Merge pull request #737 from JonnyWalker81/fix-last-command-crash
Fixed last command crash
2019-09-30 18:13:34 +13:00
093b9c1c5b Fixed last command crash
When the last command has an input value larger than the data it's
operating on, it would crash. Added a check to ensure there are enough
elements to take.
2019-09-29 20:20:18 -07:00
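A minimal sketch of the kind of bounds check described above (clamping the requested count to the number of available rows); the names are illustrative, not nu's actual internals.

```rust
// Take at most `requested` trailing rows without panicking when the input
// has fewer rows than requested.
fn last_rows<T: Clone>(rows: &[T], requested: usize) -> Vec<T> {
    let n = requested.min(rows.len());
    rows[rows.len() - n..].to_vec()
}

fn main() {
    let rows = vec![1, 2, 3];
    assert_eq!(last_rows(&rows, 10), vec![1, 2, 3]); // over-request no longer crashes
    assert_eq!(last_rows(&rows, 2), vec![2, 3]);
}
```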
348d75112f Merge pull request #736 from pizzafox/fix/https-links
Use HTTPS where possible
2019-09-30 14:43:15 +13:00
3c7b1ba854 Merge pull request #735 from rnxpyke/master
remove trailing newline after external command
2019-09-30 12:16:36 +13:00
b7a8758845 Merge pull request #729 from JonnyWalker81/post-headers
Added support for more `post` headers.
2019-09-30 12:00:22 +13:00
3812037e2a remove trailing newline after external command 2019-09-30 00:43:23 +02:00
c5fdbdb8a1 Update README.md 2019-09-30 11:29:59 +13:00
c15b5df674 Update README.md 2019-09-30 11:10:19 +13:00
00f0fd2873 Update README.md 2019-09-30 11:09:52 +13:00
7269cf7427 Merge pull request #733 from vsoch/fix/docker-nightly-build
Don't run nu at end of release build
2019-09-30 10:53:24 +13:00
83d82a09b2 Better handling of unexpected error case. 2019-09-29 14:43:39 -07:00
64345b2985 don't run nu at end of release build
Signed-off-by: Vanessa Sochat <vsochat@stanford.edu>
2019-09-29 17:18:35 -04:00
9c23d78513 docs: use HTTPS where possible
Signed-off-by: Jonah Snider <me@jonahsnider.ninja>
2019-09-29 09:03:51 -10:00
ff92123d93 Merge remote-tracking branch 'upstream/master' into post-headers 2019-09-29 01:33:21 -07:00
e1357a9541 Handle unexpected input and some cleanup. 2019-09-29 01:29:43 -07:00
3b7aa5124c Merge pull request #731 from jonathandturner/anchor
Clarify names of metadata
2019-09-29 18:45:43 +13:00
ce947d70b0 Rename SpanSource to AnchorLocation 2019-09-29 18:18:59 +13:00
caed87c125 Rename origin to anchor 2019-09-29 18:13:56 +13:00
e12ba5be8f Added support for more post headers. 2019-09-28 19:03:10 -07:00
d52e087453 Merge pull request #695 from JonnyWalker81/initial-docker-command-impl
Initial docker command impl
2019-09-29 05:09:19 +13:00
982ebacddd Merge pull request #725 from JesterOrNot/master
Gitpod
2019-09-29 05:07:21 +13:00
ee2f54fbb0 Merge pull request #726 from mlbright/master
Fix 'Shell commands' table markdown formatting
2019-09-28 18:10:16 +12:00
4f5c0314cf Fix 'Shell commands' table markdown formatting 2019-09-28 00:36:54 -04:00
542a3995ea Merge remote-tracking branch 'upstream/master' into initial-docker-command-impl 2019-09-27 20:22:30 -07:00
4af0dbe441 Removed commented code and added feature to Cargo.toml 2019-09-27 20:21:30 -07:00
78ccd4181c Update .gitpod.yml 2019-09-27 21:46:04 -05:00
680aeb12c2 modified: README.md 2019-09-28 01:46:57 +00:00
ddcf0b4f5f Merge pull request #724 from est31/stable_async
Remove uses of nightly Rust, switch compiler to beta
2019-09-28 13:39:02 +12:00
74e60fbef8 Newbra (#1)
Newbra
2019-09-27 20:31:54 -05:00
b4c783f23d modified: .gitpod.yml 2019-09-28 01:18:55 +00:00
6f6d2abdac modified: .gitpod.yml 2019-09-28 01:18:00 +00:00
b123f35d4b Switch pinned compiler to Rust beta 2019-09-28 03:11:01 +02:00
02d6614ae2 Use language-reporting from git as it supports Rust stable 2019-09-28 03:11:01 +02:00
20de0ea01f Update .gitpod.yml 2019-09-27 20:09:21 -05:00
9f352ace23 Update .gitpod.Dockerfile 2019-09-27 20:08:05 -05:00
48cbc5b23c Update .gitpod.Dockerfile 2019-09-27 20:06:15 -05:00
aa495f4d74 Update .gitpod.Dockerfile 2019-09-27 20:04:24 -05:00
0d8768b827 Update .gitpod.Dockerfile 2019-09-27 20:01:50 -05:00
0d076d97be Update .gitpod.Dockerfile 2019-09-27 19:59:30 -05:00
5b5c33a86f Update .gitpod.Dockerfile 2019-09-27 19:58:36 -05:00
12f34cc698 Update .gitpod.yml 2019-09-27 19:54:42 -05:00
ac116f4f7c Update .gitpod.yml 2019-09-27 19:54:05 -05:00
6617731d5b Update .gitpod.Dockerfile 2019-09-27 19:51:42 -05:00
f7d5ddbc07 Update Dockerfile 2019-09-27 19:47:28 -05:00
ba778eaff9 modified: docker/Dockerfile 2019-09-28 00:31:16 +00:00
1801c006ec Remove futures-async-stream dependency 2019-09-28 02:07:28 +02:00
7a124518c3 Remove use of nightly features 2019-09-28 02:07:09 +02:00
1183d28b15 Remove uses of async_stream_block 2019-09-28 02:05:18 +02:00
1da6ac8de7 i
new file:   .gitpod.Dockerfile
	new file:   .gitpod.yml
2019-09-27 23:01:34 +00:00
2b89ddfb9e Merge pull request #713 from est31/stable_async
Use async-stream crate to replace most async_stream_block invocations
2019-09-28 06:12:38 +12:00
29734a1dce Merge pull request #720 from BradyBromley/master
Changed wording in README.md
2019-09-28 05:55:21 +12:00
def33206d9 Changed wording in README.md 2019-09-27 09:48:26 -07:00
6aad0b8443 Remove async_stream_block from the prelude
... to indicate deprecation of its use
2019-09-26 02:39:59 +02:00
9891e5ab81 Use async-stream crate to replace most async_stream_block invocations 2019-09-26 02:39:20 +02:00
7113c702ff Merge pull request #706 from landaire/ctrlc_config
feat(cli): add `ctrlc_exit` config option
2019-09-26 09:22:11 +12:00
54edf571af Merge pull request #712 from nushell/andrasio-doc
More command documentation instructions.
2019-09-25 13:31:40 -05:00
f85968aba4 More command documentation instructions. 2019-09-25 11:35:58 -05:00
440f553aa8 Merge pull request #710 from andrasio/docs-command
Commands documenting instructions.
2019-09-25 11:32:03 -05:00
a492b019fe Commands documenting instructions. 2019-09-25 11:15:00 -05:00
2941740df6 Merge remote-tracking branch 'upstream/master' into initial-docker-command-impl 2019-09-24 20:43:03 -07:00
f0b638063d Transfered Docker to a plugin instead of a Command. 2019-09-24 20:42:18 -07:00
0377efdc16 feat(cli): add ctrlc_exit config option
This feature allows a user to set `ctrlc_exit` to `true` or `false` in their config to override how multiple CTRL-C invocations are handled. Without this change, pressing CTRL-C multiple times will exit nu. With this change applied, the user can configure nu to behave like other shells, where multiple invocations essentially clear the line.

This fixes #457.
2019-09-24 18:04:53 -07:00
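A rough sketch of how such a flag might gate the exit behavior; the function, loop shape, and defaults here are illustrative assumptions, not nu's actual CLI code.

```rust
// Illustrative only: count consecutive Ctrl-C presses and signal an exit
// only when the (assumed) `ctrlc_exit` config flag is enabled.
fn handle_ctrl_c(ctrlc_exit: bool, consecutive_presses: &mut u32) -> bool {
    *consecutive_presses += 1;
    if ctrlc_exit && *consecutive_presses >= 2 {
        true // caller should exit the shell
    } else {
        false // just clear the current line, like other shells
    }
}

fn main() {
    let mut presses = 0;
    assert!(!handle_ctrl_c(false, &mut presses)); // ctrlc_exit = false: never exit
    assert!(!handle_ctrl_c(false, &mut presses));

    let mut presses = 0;
    assert!(!handle_ctrl_c(true, &mut presses));
    assert!(handle_ctrl_c(true, &mut presses)); // second press exits
}
```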
3d89d2961c Merge pull request #705 from piotrek-szczygiel/master
Fix typo in echo usage message
2019-09-25 12:46:35 +12:00
8c240ca3fd Merge pull request #704 from pka/fix-build-without-crossterm
Fix build without crossterm
2019-09-25 12:46:06 +12:00
85cd03f899 Fix typo in echo usage message 2019-09-25 00:15:53 +02:00
3480cdb3b4 Fix build without crossterm 2019-09-24 23:33:30 +02:00
fec83e5164 Merge pull request #703 from andrasio/prevent-fsh-cdfile
Filesystem shell can't cd into files. Ever.
2019-09-24 16:00:53 -05:00
837d12decd Filesystem shell can't cd into files. Ever. 2019-09-24 15:34:30 -05:00
ffa536bea3 Add Cargo.lock 2019-09-25 07:02:35 +12:00
a1f26d947d Merge remote-tracking branch 'upstream/master' into initial-docker-command-impl 2019-09-23 17:57:56 -07:00
e6bdef696d Some cleanup. 2019-09-22 20:19:43 -07:00
707af3f3ca Merge branch 'master' of github.com:jonnywalker81/nushell into initial-docker-command-impl 2019-09-22 18:53:31 -07:00
480467447e Initial Docker command implementation. 2019-09-22 18:49:11 -07:00
fa859f1461 Merge branch 'master' of github.com:nushell/nushell 2019-09-02 21:53:26 -07:00
556f4b2f12 Merge branch 'master' of github.com:nushell/nushell 2019-09-01 23:14:59 -07:00
705 changed files with 61077 additions and 24390 deletions

View File

@ -3,12 +3,27 @@ trigger:
strategy:
matrix:
linux-nightly:
image: ubuntu-16.04
macos-nightly:
linux-stable:
image: ubuntu-18.04
style: 'unflagged'
macos-stable:
image: macos-10.14
windows-nightly:
image: vs2017-win2016
style: 'unflagged'
windows-stable:
image: windows-2019
style: 'unflagged'
linux-nightly-canary:
image: ubuntu-18.04
style: 'canary'
macos-nightly-canary:
image: macos-10.14
style: 'canary'
windows-nightly-canary:
image: windows-2019
style: 'canary'
fmt:
image: ubuntu-18.04
style: 'fmt'
pool:
vmImage: $(image)
@ -16,13 +31,33 @@ pool:
steps:
- bash: |
set -e
curl https://sh.rustup.rs -sSf | sh -s -- -y --no-modify-path --default-toolchain `cat rust-toolchain`
export PATH=$HOME/.cargo/bin:$PATH
if [ -e /etc/debian_version ]
then
sudo apt-get -y install libxcb-composite0-dev libx11-dev
fi
if [ "$(uname)" == "Darwin" ]; then
curl https://sh.rustup.rs -sSf | sh -s -- -y --no-modify-path --default-toolchain "stable"
echo "Installing clippy"
rustup component add clippy --toolchain stable-x86_64-apple-darwin
export PATH=$HOME/.cargo/bin:$PATH
fi
rustup update
rustc -Vv
echo "##vso[task.prependpath]$HOME/.cargo/bin"
rustup component add rustfmt --toolchain `cat rust-toolchain`
rustup component add rustfmt
displayName: Install Rust
- bash: RUSTFLAGS="-D warnings" cargo test --all-features
- bash: RUSTFLAGS="-D warnings" cargo test --all --features stable
condition: eq(variables['style'], 'unflagged')
displayName: Run tests
- bash: RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
condition: eq(variables['style'], 'unflagged')
displayName: Check clippy lints
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo test --all --features stable
condition: eq(variables['style'], 'canary')
displayName: Run tests
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
condition: eq(variables['style'], 'canary')
displayName: Check clippy lints
- bash: cargo fmt --all -- --check
condition: eq(variables['style'], 'fmt')
displayName: Lint

View File

@ -0,0 +1,3 @@
[build]
#rustflags = ["--cfg", "data_processing_primitives"]

1
.dockerignore Normal file
View File

@ -0,0 +1 @@
target

30
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
View File

@ -0,0 +1,30 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1.
2.
3.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Configuration (please complete the following information):**
- OS (e.g. Windows):
- Nu version (you can use the `version` command to find out):
- Optional features (if any):
Add any other context about the problem here.

View File

@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

View File

@ -2,7 +2,7 @@ name: Publish consumable Docker images
on:
push:
tags: ['*.*.*']
tags: ['v?[0-9]+.[0-9]+.[0-9]+*']
jobs:
compile:
@ -13,14 +13,18 @@ jobs:
- x86_64-unknown-linux-musl
- x86_64-unknown-linux-gnu
steps:
- uses: actions/checkout@v1
- run: cargo install cross
- uses: actions/checkout@v2
- name: Install rust-embedded/cross
env: { VERSION: v0.1.16 }
run: >-
wget -nv https://github.com/rust-embedded/cross/releases/download/${VERSION}/cross-${VERSION}-x86_64-unknown-linux-gnu.tar.gz
-O- | sudo tar xz -C /usr/local/bin/
- name: compile for specific target
env: { arch: '${{ matrix.arch }}' }
run: |
cross build --target ${{ matrix.arch }} --release
# leave only the executable file
rm -rd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
rm -frd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
find . -empty -delete
- uses: actions/upload-artifact@master
with:
@ -31,6 +35,10 @@ jobs:
name: Build and publish docker images
needs: compile
runs-on: ubuntu-latest
env:
DOCKER_REGISTRY: quay.io/nushell
DOCKER_PASSWORD: ${{ secrets.DOCKER_REGISTRY }}
DOCKER_USER: ${{ secrets.DOCKER_USER }}
strategy:
matrix:
tag:
@ -44,55 +52,67 @@ jobs:
- glibc
- musl
include:
- { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true }
- { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true }
- { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true }
- { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, use-patch: true }
- { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, }
- { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, }
- { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, use-patch: true }
- { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, }
- { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, }
- { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true, use-patch: false}
- { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: false}
- { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
steps:
- uses: actions/checkout@v1
- uses: actions/checkout@v2
- uses: actions/download-artifact@master
with: { name: '${{ matrix.arch }}', path: target/release }
- name: Build and publish exact version
run: |
REGISTRY=${REGISTRY,,}; export TAG=${GITHUB_REF##*/}-${{ matrix.tag }};
run: |-
export DOCKER_TAG=${GITHUB_REF##*/}-${{ matrix.tag }}
export NU_BINS=target/release/$( [ ${{ matrix.plugin }} = true ] && echo nu* || echo nu )
export PATCH=$([ ${{ matrix.use-patch }} = true ] && echo .${{ matrix.tag }} || echo '')
chmod +x $NU_BINS
echo ${{ secrets.DOCKER_REGISTRY }} | docker login docker.pkg.github.com -u ${{ github.actor }} --password-stdin
echo ${DOCKER_PASSWORD} | docker login ${DOCKER_REGISTRY} -u ${DOCKER_USER} --password-stdin
docker-compose --file docker/docker-compose.package.yml build
docker-compose --file docker/docker-compose.package.yml push # exact version
env:
BASE_IMAGE: ${{ matrix.base-image }}
REGISTRY: docker.pkg.github.com/${{ github.repository }}
#region semantics tagging
- name: Retag and push without suffixing version
run: |
- name: Retag and push with suffixed version
run: |-
VERSION=${GITHUB_REF##*/}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${{ matrix.tag }}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%%.*}-${{ matrix.tag }}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%.*}-${{ matrix.tag }}
docker push ${REGISTRY,,}/nu:${VERSION%.*}-${{ matrix.tag }} # latest patch
docker push ${REGISTRY,,}/nu:${VERSION%%.*}-${{ matrix.tag }} # latest features
docker push ${REGISTRY,,}/nu:${{ matrix.tag }} # latest version
env: { REGISTRY: 'docker.pkg.github.com/${{ github.repository }}' }
latest_version=${VERSION%%%.*}-${{ matrix.tag }}
latest_feature=${VERSION%%.*}-${{ matrix.tag }}
latest_patch=${VERSION%.*}-${{ matrix.tag }}
exact_version=${VERSION}-${{ matrix.tag }}
tags=( ${latest_version} ${latest_feature} ${latest_patch} ${exact_version} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
docker push ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
- name: Retag and push debian as latest
if: matrix.tag == 'debian'
run: |
run: |-
VERSION=${GITHUB_REF##*/}
docker tag ${REGISTRY,,}/nu:${{ matrix.tag }} ${REGISTRY,,}/nu:latest
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%.*}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%%.*}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION}
docker push ${REGISTRY,,}/nu:${VERSION} # exact version
docker push ${REGISTRY,,}/nu:${VERSION%%.*} # latest features
docker push ${REGISTRY,,}/nu:${VERSION%.*} # latest patch
docker push ${REGISTRY,,}/nu:latest # latest version
env: { REGISTRY: 'docker.pkg.github.com/${{ github.repository }}' }
# ${latest features} ${latest patch} ${exact version}
tags=( ${VERSION%%.*} ${VERSION%.*} ${VERSION} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:latest
docker push ${DOCKER_REGISTRY}/nu:latest
#endregion semantics tagging

240
.github/workflows/release.yml vendored Normal file
View File

@ -0,0 +1,240 @@
name: Create Release Draft
on:
push:
tags: ['[0-9]+.[0-9]+.[0-9]+*']
jobs:
linux:
name: Build Linux
runs-on: ubuntu-latest
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Install libxcb
run: sudo apt-get install libxcb-composite0-dev
- name: Set up cargo
uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Build
uses: actions-rs/cargo@v1
with:
command: build
args: --release --all --features=stable
- name: Create output directory
run: mkdir output
- name: Copy files to output
run: |
cp target/release/nu target/release/nu_plugin_* output/
cp README.build.txt output/README.txt
rm output/*.d
rm output/nu_plugin_core_*
rm output/nu_plugin_stable_*
# Note: If OpenSSL changes, this path will need to be updated
- name: Copy OpenSSL to output
run: cp /usr/lib/x86_64-linux-gnu/libssl.so.1.1 output/
- name: Upload artifact
uses: actions/upload-artifact@v2
with:
name: linux
path: output/*
macos:
name: Build macOS
runs-on: macos-latest
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Set up cargo
uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Build
uses: actions-rs/cargo@v1
with:
command: build
args: --release --all --features=stable
- name: Create output directory
run: mkdir output
- name: Copy files to output
run: |
cp target/release/nu target/release/nu_plugin_* output/
cp README.build.txt output/README.txt
rm output/*.d
rm output/nu_plugin_core_*
rm output/nu_plugin_stable_*
- name: Upload artifact
uses: actions/upload-artifact@v2
with:
name: macos
path: output/*
windows:
name: Build Windows
runs-on: windows-latest
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Set up cargo
uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Add cargo-wix subcommand
uses: actions-rs/cargo@v1
with:
command: install
args: cargo-wix
- name: Build
uses: actions-rs/cargo@v1
with:
command: build
args: --release --all --features=stable
- name: Create output directory
run: mkdir output
- name: Download Less Binary
run: Invoke-WebRequest -Uri "https://github.com/jftuga/less-Windows/releases/download/less-v562.1/less.exe" -OutFile "target\release\less.exe"
- name: Copy files to output
run: |
cp target\release\nu.exe output\
rm target\release\nu_plugin_core_*.exe
rm target\release\nu_plugin_stable_*.exe
cp target\release\nu_plugin_*.exe output\
cp README.build.txt output\README.txt
cp target\release\less.exe output\
# Note: If the version of `less.exe` needs to be changed, update this URL
# Similarly, if `less.exe` is checked into the repo, copy from the local path here
# moved this stuff down to create wix after we download less
- name: Create msi with wix
uses: actions-rs/cargo@v1
with:
command: wix
args: --no-build --nocapture --output target\wix\nushell-windows.msi
- name: Upload installer
uses: actions/upload-artifact@v2
with:
name: windows-installer
path: target\wix\nushell-windows.msi
- name: Upload zip
uses: actions/upload-artifact@v2
with:
name: windows-zip
path: output\*
release:
name: Publish Release
runs-on: ubuntu-latest
needs:
- linux
- macos
- windows
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Determine Release Info
id: info
env:
GITHUB_REF: ${{ github.ref }}
run: |
VERSION=${GITHUB_REF##*/}
MAJOR=${VERSION%%.*}
MINOR=${VERSION%.*}
MINOR=${MINOR#*.}
PATCH=${VERSION##*.}
echo "::set-output name=version::${VERSION}"
echo "::set-output name=linuxdir::nu_${MAJOR}_${MINOR}_${PATCH}_linux"
echo "::set-output name=macosdir::nu_${MAJOR}_${MINOR}_${PATCH}_macOS"
echo "::set-output name=windowsdir::nu_${MAJOR}_${MINOR}_${PATCH}_windows"
echo "::set-output name=innerdir::nushell-${VERSION}"
- name: Create Release Draft
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ github.ref }}
release_name: ${{ steps.info.outputs.version }} Release
draft: true
- name: Create Linux Directory
run: mkdir -p ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}
- name: Download Linux Artifacts
uses: actions/download-artifact@v2
with:
name: linux
path: ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}
- name: Restore Linux File Modes
run: |
chmod 755 ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}/nu*
chmod 755 ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}/libssl*
- name: Create Linux tarball
run: tar -zcvf ${{ steps.info.outputs.linuxdir }}.tar.gz ${{ steps.info.outputs.linuxdir }}
- name: Upload Linux Artifact
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./${{ steps.info.outputs.linuxdir }}.tar.gz
asset_name: ${{ steps.info.outputs.linuxdir }}.tar.gz
asset_content_type: application/gzip
- name: Create macOS Directory
run: mkdir -p ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}
- name: Download macOS Artifacts
uses: actions/download-artifact@v2
with:
name: macos
path: ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}
- name: Restore macOS File Modes
run: chmod 755 ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}/nu*
- name: Create macOS Archive
run: zip -r ${{ steps.info.outputs.macosdir }}.zip ${{ steps.info.outputs.macosdir }}
- name: Upload macOS Artifact
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./${{ steps.info.outputs.macosdir }}.zip
asset_name: ${{ steps.info.outputs.macosdir }}.zip
asset_content_type: application/zip
- name: Create Windows Directory
run: mkdir -p ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
- name: Download Windows zip
uses: actions/download-artifact@v2
with:
name: windows-zip
path: ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
# TODO: Remove Show
- name: Show Windows Artifacts
run: ls -la ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
- name: Create macOS Archive
run: zip -r ${{ steps.info.outputs.windowsdir }}.zip ${{ steps.info.outputs.windowsdir }}
- name: Upload Windows zip
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./${{ steps.info.outputs.windowsdir }}.zip
asset_name: ${{ steps.info.outputs.windowsdir }}.zip
asset_content_type: application/zip
- name: Download Windows installer
uses: actions/download-artifact@v2
with:
name: windows-installer
path: ./
- name: Upload Windows installer
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./nushell-windows.msi
asset_name: ${{ steps.info.outputs.windowsdir }}.msi
asset_content_type: application/x-msi

10
.gitignore vendored
View File

@ -3,6 +3,7 @@
**/*.rs.bk
history.txt
tests/fixtures/nuplayground
crates/*/target
# Debian/Ubuntu
debian/.debhelper/
@ -10,3 +11,12 @@ debian/debhelper-build-stamp
debian/files
debian/nu.substvars
debian/nu/
# macOS junk
.DS_Store
# JetBrains' IDE items
.idea/*
# VSCode's IDE items
.vscode/*

14
.gitpod.Dockerfile vendored Normal file
View File

@ -0,0 +1,14 @@
FROM gitpod/workspace-full
USER gitpod
RUN sudo apt-get update && \
sudo apt-get install -y \
libssl-dev \
libxcb-composite0-dev \
pkg-config \
libpython3.6 \
rust-lldb \
&& sudo rm -rf /var/lib/apt/lists/*
ENV RUST_LLDB=/usr/bin/lldb-8

25
.gitpod.yml Normal file
View File

@ -0,0 +1,25 @@
image:
file: .gitpod.Dockerfile
tasks:
- name: Clippy
init: cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
- name: Testing
init: cargo test --all --features=stable,test-bins
- name: Build
init: cargo build --features=stable
- name: Nu
init: cargo install --path . --features=stable
command: nu
github:
prebuilds:
branches: true
pullRequestsFromForks: true
addLabel: prebuilt-in-gitpod
vscode:
extensions:
- hbenl.vscode-test-explorer@2.15.0:koqDUMWDPJzELp/hdS/lWw==
- Swellaby.vscode-rust-test-adapter@0.11.0:Xg+YeZZQiVpVUsIkH+uiiw==
- serayuzgur.crates@0.4.7:HMkoguLcXp9M3ud7ac3eIw==
- belfz.search-crates-io@1.2.1:kSLnyrOhXtYPjQpKnMr4eQ==
- bungcip.better-toml@0.3.2:3QfgGxxYtGHfJKQU7H0nEw==
- webfreak.debug@0.24.0:1zVcRsAhewYEX3/A9xjMNw==

14
.theia/launch.json Normal file
View File

@ -0,0 +1,14 @@
{
"version": "0.2.0",
"configurations": [
{
"type": "gdb",
"request": "launch",
"name": "Debug Rust Code",
"preLaunchTask": "cargo",
"target": "${workspaceFolder}/target/debug/nu",
"cwd": "${workspaceFolder}",
"valuesFormatting": "parseText"
}
]
}

12
.theia/tasks.json Normal file
View File

@ -0,0 +1,12 @@
{
"tasks": [
{
"command": "cargo",
"args": [
"build"
],
"type": "process",
"label": "cargo",
}
],
}

View File

@ -55,7 +55,7 @@ a project may be further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at wycats@gmail.com. All
reported by contacting the project team at wycats@gmail.com via email or by reaching out to @jturner, @gedge, or @andras_io on discord. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
@ -68,9 +68,9 @@ members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
available at <https://www.contributor-covenant.org/version/1/4/code-of-conduct.html>
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq
<https://www.contributor-covenant.org/faq>

62
CONTRIBUTING.md Normal file
View File

@ -0,0 +1,62 @@
# Contributing
Welcome to nushell!
*Note: for a more complete guide see [The nu contributor book](https://github.com/nushell/contributor-book)*
For speedy contributions open it in Gitpod, nu will be pre-installed with the latest build in a VSCode like editor all from your browser.
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/nushell/nushell)
To get live support from the community see our [Discord](https://discordapp.com/invite/NtAbbGn), [Twitter](https://twitter.com/nu_shell) or file an issue or feature request here on [GitHub](https://github.com/nushell/nushell/issues/new/choose)!
<!--WIP-->
## Developing
### Set up
This is no different than other Rust projects.
```bash
git clone https://github.com/nushell/nushell
cd nushell
cargo build
```
### Useful Commands
Build and run Nushell:
```shell
cargo build --release && cargo run --release
```
Run Clippy on Nushell:
```shell
cargo clippy --all --features=stable
```
Run all tests:
```shell
cargo test --all --features=stable
```
Run all tests for a specific command
```shell
cargo test --package nu-cli --test main -- commands::<command_name_here>
```
Check to see if there are code formatting issues
```shell
cargo fmt --all -- --check
```
Format the code in the project
```shell
cargo fmt --all
```

4856
Cargo.lock generated

File diff suppressed because it is too large

View File

@ -1,163 +1,140 @@
[package]
name = "nu"
version = "0.3.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "A shell for the GitHub era"
version = "0.16.0"
authors = ["The Nu Project Contributors"]
description = "A new type of shell"
license = "MIT"
edition = "2018"
readme = "README.md"
default-run = "nu"
repository = "https://github.com/nushell/nushell"
homepage = "http://nushell.sh"
documentation = "https://book.nushell.sh"
homepage = "https://www.nushell.sh"
documentation = "https://www.nushell.sh/book/"
exclude = ["images"]
[workspace]
members = ["crates/*/"]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
rustyline = "5.0.3"
chrono = { version = "0.4.9", features = ["serde"] }
derive-new = "0.5.8"
prettytable-rs = "0.8.0"
itertools = "0.8.0"
ansi_term = "0.12.1"
nom = "5.0.0"
dunce = "1.0.0"
indexmap = { version = "1.2.0", features = ["serde-1"] }
chrono-humanize = "0.0.11"
byte-unit = "3.0.1"
base64 = "0.10.1"
futures-preview = { version = "=0.3.0-alpha.18", features = ["compat", "io-compat"] }
futures-async-stream = "=0.1.0-alpha.5"
futures_codec = "0.2.5"
num-traits = "0.2.8"
term = "0.5.2"
bytes = "0.4.12"
nu-cli = { version = "0.16.0", path = "./crates/nu-cli" }
nu-source = { version = "0.16.0", path = "./crates/nu-source" }
nu-plugin = { version = "0.16.0", path = "./crates/nu-plugin" }
nu-protocol = { version = "0.16.0", path = "./crates/nu-protocol" }
nu-errors = { version = "0.16.0", path = "./crates/nu-errors" }
nu-parser = { version = "0.16.0", path = "./crates/nu-parser" }
nu-value-ext = { version = "0.16.0", path = "./crates/nu-value-ext" }
nu_plugin_binaryview = { version = "0.16.0", path = "./crates/nu_plugin_binaryview", optional=true }
nu_plugin_fetch = { version = "0.16.0", path = "./crates/nu_plugin_fetch", optional=true }
nu_plugin_inc = { version = "0.16.0", path = "./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.16.0", path = "./crates/nu_plugin_match", optional=true }
nu_plugin_post = { version = "0.16.0", path = "./crates/nu_plugin_post", optional=true }
nu_plugin_ps = { version = "0.16.0", path = "./crates/nu_plugin_ps", optional=true }
nu_plugin_start = { version = "0.16.0", path = "./crates/nu_plugin_start", optional=true }
nu_plugin_sys = { version = "0.16.0", path = "./crates/nu_plugin_sys", optional=true }
nu_plugin_textview = { version = "0.16.0", path = "./crates/nu_plugin_textview", optional=true }
nu_plugin_tree = { version = "0.16.0", path = "./crates/nu_plugin_tree", optional=true }
crossterm = { version = "0.17.5", optional = true }
semver = { version = "0.10.0", optional = true }
syntect = { version = "4.2", default-features = false, features = ["default-fancy"], optional = true}
url = { version = "2.1.1", optional = true }
clap = "2.33.1"
ctrlc = "3.1.4"
dunce = "1.0.1"
futures = { version = "0.3", features = ["compat", "io-compat"] }
log = "0.4.8"
pretty_env_logger = "0.3.1"
serde = { version = "1.0.100", features = ["derive"] }
bson = { version = "0.14.0", features = ["decimal128"] }
serde_json = "1.0.40"
serde-hjson = "0.9.1"
serde_yaml = "0.8"
serde_bytes = "0.11.2"
getset = "0.0.8"
language-reporting = "0.3.1"
app_dirs = "1.2.1"
csv = "1.1"
toml = "0.5.3"
clap = "2.33.0"
git2 = { version = "0.10.1", default_features = false }
dirs = "2.0.2"
glob = "0.3.0"
ctrlc = "3.1.3"
surf = "1.0.2"
url = "2.1.0"
roxmltree = "0.7.0"
nom_locate = "1.0.0"
enum-utils = "0.1.1"
unicode-xid = "0.2.0"
serde_ini = "0.2.0"
subprocess = "0.1.18"
mime = "0.3.14"
pretty-hex = "0.1.0"
hex = "0.3.2"
tempfile = "3.1.0"
semver = "0.9.0"
which = "2.0.1"
uuid = {version = "0.7.4", features = [ "v4", "serde" ]}
textwrap = {version = "0.11.0", features = ["term_size"]}
shellexpand = "1.0.0"
futures-timer = "0.4.0"
pin-utils = "0.1.0-alpha.4"
num-bigint = { version = "0.2.3", features = ["serde"] }
bigdecimal = { version = "0.1.0", features = ["serde"] }
natural = "0.3.0"
serde_urlencoded = "0.6.1"
sublime_fuzzy = "0.5"
neso = { version = "0.5.0", optional = true }
crossterm = { version = "0.10.2", optional = true }
syntect = {version = "3.2.0", optional = true }
onig_sys = {version = "=69.1.0", optional = true }
heim = {version = "0.0.8-alpha.1", optional = true }
battery = {version = "0.7.4", optional = true }
rawkey = {version = "0.1.2", optional = true }
clipboard = {version = "0.5", optional = true }
ptree = {version = "0.2", optional = true }
image = { version = "0.22.2", default_features = false, features = ["png_codec", "jpeg"], optional = true }
[features]
default = ["textview", "sys", "ps"]
raw-key = ["rawkey", "neso"]
textview = ["syntect", "onig_sys", "crossterm"]
binaryview = ["image", "crossterm"]
sys = ["heim", "battery"]
ps = ["heim"]
[dependencies.rusqlite]
version = "0.20.0"
features = ["bundled", "blob"]
pretty_env_logger = "0.4.0"
starship = "0.43.0"
[dev-dependencies]
pretty_assertions = "0.6.1"
nu-test-support = { version = "0.16.0", path = "./crates/nu-test-support" }
[lib]
name = "nu"
path = "src/lib.rs"
[build-dependencies]
toml = "0.5.6"
serde = { version = "1.0.110", features = ["derive"] }
nu-build = { version = "0.16.0", path = "./crates/nu-build" }
[features]
default = ["sys", "ps", "textview", "inc"]
stable = ["default", "binaryview", "match", "tree", "post", "fetch", "clipboard-cli", "trash-support", "start"]
# Default
textview = ["crossterm", "syntect", "url", "nu_plugin_textview"]
sys = ["nu_plugin_sys"]
ps = ["nu_plugin_ps"]
inc = ["semver", "nu_plugin_inc"]
# Stable
binaryview = ["nu_plugin_binaryview"]
fetch = ["nu_plugin_fetch"]
match = ["nu_plugin_match"]
post = ["nu_plugin_post"]
trace = ["nu-parser/trace"]
tree = ["nu_plugin_tree"]
start = ["nu_plugin_start"]
clipboard-cli = ["nu-cli/clipboard-cli"]
# starship-prompt = ["nu-cli/starship-prompt"]
trash-support = ["nu-cli/trash-support"]
# Core plugins that ship with `cargo install nu` by default
# Currently, Cargo limits us to installing only one binary
# unless we use [[bin]], so we use this as a workaround
[[bin]]
name = "nu_plugin_inc"
path = "src/plugins/inc.rs"
[[bin]]
name = "nu_plugin_sum"
path = "src/plugins/sum.rs"
[[bin]]
name = "nu_plugin_embed"
path = "src/plugins/embed.rs"
[[bin]]
name = "nu_plugin_add"
path = "src/plugins/add.rs"
[[bin]]
name = "nu_plugin_edit"
path = "src/plugins/edit.rs"
[[bin]]
name = "nu_plugin_str"
path = "src/plugins/str.rs"
[[bin]]
name = "nu_plugin_skip"
path = "src/plugins/skip.rs"
[[bin]]
name = "nu_plugin_sys"
path = "src/plugins/sys.rs"
required-features = ["sys"]
[[bin]]
name = "nu_plugin_ps"
path = "src/plugins/ps.rs"
required-features = ["ps"]
[[bin]]
name = "nu_plugin_tree"
path = "src/plugins/tree.rs"
required-features = ["tree"]
[[bin]]
name = "nu_plugin_binaryview"
path = "src/plugins/binaryview.rs"
required-features = ["binaryview"]
[[bin]]
name = "nu_plugin_textview"
path = "src/plugins/textview.rs"
name = "nu_plugin_core_textview"
path = "src/plugins/nu_plugin_core_textview.rs"
required-features = ["textview"]
[[bin]]
name = "nu_plugin_core_inc"
path = "src/plugins/nu_plugin_core_inc.rs"
required-features = ["inc"]
[[bin]]
name = "nu_plugin_core_ps"
path = "src/plugins/nu_plugin_core_ps.rs"
required-features = ["ps"]
[[bin]]
name = "nu_plugin_core_sys"
path = "src/plugins/nu_plugin_core_sys.rs"
required-features = ["sys"]
# Stable plugins
[[bin]]
name = "nu_plugin_stable_fetch"
path = "src/plugins/nu_plugin_stable_fetch.rs"
required-features = ["fetch"]
[[bin]]
name = "nu_plugin_stable_binaryview"
path = "src/plugins/nu_plugin_stable_binaryview.rs"
required-features = ["binaryview"]
[[bin]]
name = "nu_plugin_stable_match"
path = "src/plugins/nu_plugin_stable_match.rs"
required-features = ["match"]
[[bin]]
name = "nu_plugin_stable_post"
path = "src/plugins/nu_plugin_stable_post.rs"
required-features = ["post"]
[[bin]]
name = "nu_plugin_stable_tree"
path = "src/plugins/nu_plugin_stable_tree.rs"
required-features = ["tree"]
[[bin]]
name = "nu_plugin_stable_start"
path = "src/plugins/nu_plugin_stable_start.rs"
required-features = ["start"]
# Main nu binary
[[bin]]
name = "nu"
path = "src/main.rs"

View File

@ -1,6 +1,6 @@
MIT License
Copyright (c) 2019 Yehuda Katz, Jonathan Turner
Copyright (c) 2019 - 2020 Yehuda Katz, Jonathan Turner
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@ -4,21 +4,12 @@ command = "lalrpop"
args = ["src/parser/parser.lalrpop"]
[tasks.baseline]
dependencies = ["lalrpop"]
[tasks.build]
command = "cargo"
args = ["build"]
dependencies = ["lalrpop"]
args = ["build", "--bins"]
[tasks.run]
command = "cargo"
args = ["run", "--release"]
dependencies = ["baseline"]
[tasks.release]
command = "cargo"
args = ["build", "--release"]
args = ["run"]
dependencies = ["baseline"]
[tasks.test]

1
README.build.txt Normal file
View File

@ -0,0 +1 @@
Nu will look for the plugins in your PATH on startup. While nu will have some functionality without them, for full functionality you'll need to copy them into your path so they can be loaded.

367
README.md
View File

@ -1,35 +1,51 @@
# README
[![Gitpod Ready-to-Code](https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/nushell/nushell)
[![Crates.io](https://img.shields.io/crates/v/nu.svg)](https://crates.io/crates/nu)
[![Build Status](https://dev.azure.com/nushell/nushell/_apis/build/status/nushell.nushell?branchName=master)](https://dev.azure.com/nushell/nushell/_build/latest?definitionId=2&branchName=master)
[![Discord](https://img.shields.io/discord/601130461678272522.svg?logo=discord)](https://discord.gg/NtAbbGn)
[![The Changelog #363](https://img.shields.io/badge/The%20Changelog-%23363-61c192.svg)](https://changelog.com/podcast/363)
## Nu Shell
# Nu Shell
A modern shell for the GitHub era
A new type of shell.
![Example of nushell](images/nushell-autocomplete.gif "Example of nushell")
# Status
## Status
This project has reached a minimum-viable product level of quality. While contributors dogfood it as their daily driver, it may be unstable for some commands. Future releases will work fill out missing features and improve stability. Its design is also subject to change as it matures.
This project has reached a minimum-viable product level of quality.
While contributors dogfood it as their daily driver, it may be unstable for some commands.
Future releases will work to fill out missing features and improve stability.
Its design is also subject to change as it matures.
Nu comes with a set of built-in commands (listed below). If a command is unknown, the command will shell-out and execute it (using cmd on Windows or bash on Linux and MacOS), correctly passing through stdin, stdout and stderr, so things like your daily git workflows and even `vim` will work just fine.
Nu comes with a set of built-in commands (listed below).
If a command is unknown, the command will shell-out and execute it (using cmd on Windows or bash on Linux and macOS), correctly passing through stdin, stdout, and stderr, so things like your daily git workflows and even `vim` will work just fine.
# Learning more
## Learning more
There are a few good resources to learn about Nu. First, there is a [book](https://book.nushell.sh) about Nu, currently in progress. The book focuses on using Nu and its core concepts.
There are a few good resources to learn about Nu.
There is a [book](https://www.nushell.sh/book/) about Nu that is currently in progress.
The book focuses on using Nu and its core concepts.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://github.com/nushell/contributor-book/tree/master/en) to help get started. There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://www.nushell.sh/contributor-book/) to help you get started.
There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
We also have an active [discord](https://discord.gg/NtAbbGn) and [twitter](https://twitter.com/nu_shell) if you'd like to come chat with us.
We also have an active [Discord](https://discord.gg/NtAbbGn) and [Twitter](https://twitter.com/nu_shell) if you'd like to come and chat with us.
# Installation
You can also find more learning resources in our [documentation](https://www.nushell.sh/documentation.html) site.
## Local
Try it in Gitpod.
Up-to-date installation instructions can be found in the [installation chapter of the book](https://book.nushell.sh/en/installation).
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/nushell/nushell)
To build Nu, you will need to use the **nightly** version of the compiler.
## Installation
### Local
Up-to-date installation instructions can be found in the [installation chapter of the book](https://www.nushell.sh/book/en/installation.html). **Windows users**: please note that Nu works on Windows 10 and does not currently have Windows 7/8.1 support.
To build Nu, you will need to use the **latest stable (1.41 or later)** version of the compiler.
Required dependencies:
@ -41,26 +57,36 @@ Optional dependencies:
* To use Nu with all possible optional features enabled, you'll also need the following:
* on Linux (on Debian/Ubuntu): `apt install libxcb-composite0-dev libx11-dev`
To install Nu via cargo:
To install Nu via cargo (make sure you have installed [rustup](https://rustup.rs/) and the latest stable compiler via `rustup install stable`):
```
cargo +nightly install nu
```bash
cargo install nu
```
You can also install Nu with all the bells and whistles:
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/en/installation.html#dependencies) for your platform), once you have checked out this repo with git:
```
cargo +nightly install nu --all-features
```bash
cargo build --workspace --features=stable
```
## Docker
### Docker
#### Quickstart
Want to try Nu right away? Execute the following to get started.
```bash
docker run -it quay.io/nushell/nu:latest
```
#### Guide
If you want to pull a pre-built container, you can browse tags for the [nushell organization](https://quay.io/organization/nushell)
on Quay.io. Pulling a container would come down to:
```bash
$ docker pull quay.io/nushell/nu
$ docker pull quay.io/nushell/nu-base
docker pull quay.io/nushell/nu
docker pull quay.io/nushell/nu-base
```
Both "nu-base" and "nu" provide the nu binary, however nu-base also includes the source code at `/code`
@ -70,132 +96,176 @@ Optionally, you can also build the containers locally using the [dockerfiles pro
To build the base image:
```bash
$ docker build -f docker/Dockerfile.nu-base -t nushell/nu-base .
docker build -f docker/Dockerfile.nu-base -t nushell/nu-base .
```
And then to build the smaller container (using a Multistage build):
```bash
$ docker build -f docker/Dockerfile -t nushell/nu .
docker build -f docker/Dockerfile -t nushell/nu .
```
Either way, you can run either container as follows:
```bash
$ docker run -it nushell/nu-base
$ docker run -it nushell/nu
docker run -it nushell/nu-base
docker run -it nushell/nu
/> exit
```
The second container is a bit smaller, if size is important to you.
The second container is a bit smaller if the size is important to you.
## Packaging status
### Packaging status
### Fedora
[![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions)
#### Fedora
[COPR repo](https://copr.fedorainfracloud.org/coprs/atim/nushell/): `sudo dnf copr enable atim/nushell -y && sudo dnf install nushell -y`
# Philosophy
## Philosophy
Nu draws inspiration from projects like PowerShell, functional programming languages, and modern cli tools. Rather than thinking of files and services as raw streams of text, Nu looks at each input as something with structure. For example, when you list the contents of a directory, what you get back is a table of rows, where each row represents an item in that directory. These values can be piped through a series of steps, in a series of commands called a 'pipeline'.
Nu draws inspiration from projects like PowerShell, functional programming languages, and modern CLI tools.
Rather than thinking of files and services as raw streams of text, Nu looks at each input as something with structure.
For example, when you list the contents of a directory, what you get back is a table of rows, where each row represents an item in that directory.
These values can be piped through a series of steps, in a series of commands called a 'pipeline'.
## Pipelines
### Pipelines
In Unix, it's common to pipe between commands to split up a sophisticated command over multiple steps. Nu takes this a step further and builds heavily on the idea of _pipelines_. Just as the Unix philosophy, Nu allows commands to output from stdout and read from stdin. Additionally, commands can output structured data (you can think of this as a third kind of stream). Commands that work in the pipeline fit into one of three categories
In Unix, it's common to pipe between commands to split up a sophisticated command over multiple steps.
Nu takes this a step further and builds heavily on the idea of _pipelines_.
Just as the Unix philosophy, Nu allows commands to output from stdout and read from stdin.
Additionally, commands can output structured data (you can think of this as a third kind of stream).
Commands that work in the pipeline fit into one of three categories:
* Commands that produce a stream (eg, `ls`)
* Commands that filter a stream (eg, `where type == "Directory"`)
* Commands that consumes the output of the pipeline (eg, `autoview`)
* Commands that filter a stream (eg, `where type == "Dir"`)
* Commands that consume the output of the pipeline (eg, `autoview`)
Commands are separated by the pipe symbol (`|`) to denote a pipeline flowing left to right.
```shell
> ls | where type == "Dir" | autoview
───┬────────┬──────┬───────┬──────────────
 # │ name   │ type │ size  │ modified
───┼────────┼──────┼───────┼──────────────
 0 │ assets │ Dir  │ 128 B │ 5 months ago
 1 │ crates │ Dir  │ 704 B │ 50 mins ago
 2 │ debian │ Dir  │ 352 B │ 5 months ago
 3 │ docker │ Dir  │ 288 B │ 3 months ago
 4 │ docs   │ Dir  │ 192 B │ 50 mins ago
 5 │ images │ Dir  │ 160 B │ 5 months ago
 6 │ src    │ Dir  │ 128 B │ 1 day ago
 7 │ target │ Dir  │ 160 B │ 5 days ago
 8 │ tests  │ Dir  │ 192 B │ 3 months ago
───┴────────┴──────┴───────┴──────────────
```
Because most of the time you'll want to see the output of a pipeline, `autoview` is assumed.
We could have also written the above:
```shell
> ls | where type == Dir
```
Being able to use the same commands and compose them differently is an important philosophy in Nu.
For example, we could use the built-in `ps` command as well to get a list of the running processes, using the same `where` as above.
```shell
> ps | where cpu > 0
───┬────────┬───────────────────┬──────────┬─────────┬──────────┬──────────
 # │ pid    │ name              │ status   │ cpu     │ mem      │ virtual
───┼────────┼───────────────────┼──────────┼─────────┼──────────┼──────────
 0 │    435 │ irq/142-SYNA327   │ Sleeping │  7.5699 │ 0 B      │ 0 B
 1 │   1609 │ pulseaudio        │ Sleeping │  6.5605 │ 10.6 MB  │ 2.3 GB
 2 │   1625 │ gnome-shell       │ Sleeping │  6.5684 │ 639.6 MB │ 7.3 GB
 3 │   2202 │ Web Content       │ Sleeping │  6.8157 │ 320.8 MB │ 3.0 GB
 4 │ 328788 │ nu_plugin_core_ps │ Sleeping │ 92.5750 │ 5.9 MB   │ 633.2 MB
───┴────────┴───────────────────┴──────────┴─────────┴──────────┴──────────
```
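Because each step just hands a table to the next, these filters can be chained freely. As a small sketch (the `sort-by`, `reverse`, and `first` commands are assumed here; see the command references linked below), this keeps only the three busiest processes:
```shell
> ps | where cpu > 0 | sort-by cpu | reverse | first 3
```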
### Opening files
Nu can load file and URL contents as raw text or as structured data (if it recognizes the format).
For example, you can load a .toml file as structured data and explore it:
```shell
> open Cargo.toml
────────────────────┬───────────────────────────
 bin                │ [table 18 rows]
 build-dependencies │ [row nu-build serde toml]
 dependencies       │ [row 29 columns]
 dev-dependencies   │ [row nu-test-support]
 features           │ [row 19 columns]
 package            │ [row 12 columns]
 workspace          │ [row members]
────────────────────┴───────────────────────────
```
We can pipeline this into a command that gets the contents of one of the columns:
```shell
> open Cargo.toml | get package
───────────────┬────────────────────────────────────
 authors       │ [table 1 rows]
 default-run   │ nu
 description   │ A new type of shell
 documentation │ https://www.nushell.sh/book/
 edition       │ 2018
 exclude       │ [table 1 rows]
 homepage      │ https://www.nushell.sh
 license       │ MIT
 name          │ nu
 readme        │ README.md
 repository    │ https://github.com/nushell/nushell
 version       │ 0.15.1
───────────────┴────────────────────────────────────
```
Finally, we can use commands outside of Nu once we have the data we want:
```shell
> open Cargo.toml | get package.version | echo $it
0.15.1
```
Here we use the variable `$it` to refer to the value being piped to the external command.
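Since structured data can also be converted back out, you can round-trip it into another format. A minimal sketch, assuming the `to-json` converter and the `save` command:
```shell
> open Cargo.toml | get package | to-json | save package.json
```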
### Configuration
Nu has early support for configuring the shell. You can refer to the book for a list of [all supported variables](https://www.nushell.sh/book/en/configuration.html).
To set one of these variables, you can use `config --set`. For example:
```shell
> config --set [edit_mode "vi"]
> config --set [path $nu.path]
```
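The same command can be used to inspect or undo a setting. This is a sketch based on the configuration chapter of the book; the `--get` and `--remove` flags are assumptions, so check `help config` for the flags your build supports:
```shell
> config
> config --get edit_mode
> config --remove edit_mode
```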
### Shells
Nu will work inside of a single directory and allow you to navigate around your filesystem by default.
Nu also offers a way of adding additional working directories that you can jump between, allowing you to work in multiple directories at the same time.
To do so, use the `enter` command, which will allow you to create a new "shell" and enter it at the specified path.
You can toggle between this new shell and the original shell with the `p` (for previous) and `n` (for next) commands, allowing you to navigate around a ring buffer of shells.
Once you're done with a shell, you can `exit` it and remove it from the ring buffer.
Finally, to get a list of all the current shells, you can use the `shells` command.
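A short illustrative session (the `docs` directory is just an example path):
```shell
> enter docs
> shells
> n
> p
> exit
```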
### Plugins
Nu supports plugins that offer additional functionality to the shell and follow the same structured data model that built-in commands use.
This allows you to extend nu for your needs.
There are a few examples in the `plugins` directory.
Plugins are binaries that are available in your path and follow a `nu_plugin_*` naming convention.
These binaries interact with nu via a simple JSON-RPC protocol where the command identifies itself and passes along its configuration, which then makes it available for use.
If the plugin is a filter, data streams to it one element at a time, and it can stream data back in return via stdin/stdout.
If the plugin is a sink, it is given the full vector of final data and is given free rein over stdin/stdout to use as it pleases.
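For example, the `inc` filter ships as a plugin binary named `nu_plugin_inc`; once that binary is on your path, it can be used like any other command. A minimal sketch (the `--patch` flag, which bumps the patch component of a semantic version, is an assumption here):
```shell
> open Cargo.toml | get package.version | inc --patch
```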
## Goals
Nu adheres closely to a set of goals that make up its design philosophy. As features are added, they are checked against these goals.
* Finally, Nu views data functionally. Rather than using mutation, pipelines act as a means to load, change, and save data without mutable state.
## Commands
You can find a list of Nu commands, complete with documentation, in [quick command references](https://www.nushell.sh/documentation.html#quick-command-references).
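The built-in `help` command offers the same information from inside the shell; for example (the `commands` argument to `help` is an assumption of this sketch):
```shell
> help commands
> help open
```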
## Progress
Nu is in heavy development, and will naturally change as it matures and people use it. The chart below isn't meant to be exhaustive, but rather helps give an idea of some of the areas of development and their relative completion:
| Features | Not started | Prototype | MVP | Preview | Mature | Notes
| -------- |:-----------:|:---------:|:---:|:-------:|:------:| -----
| Aliases | | X | | | | Initial implementation but lacks necessary features
| Notebook | | X | | | | Initial jupyter support, but it loses state and lacks features
| File ops | | | X | | | cp, mv, rm, mkdir have some support, but lacking others
| Environment | | X | | | | Temporary environment, but no session-wide env variables
| Shells | | X | | | | Basic value and file shells, but no opt-in/opt-out for commands
| Protocol | | | X | | | Streaming protocol is serviceable
| Plugins | | X | | | | Plugins work on one row at a time, lack batching and expression eval
| Errors | | | X | | | Error reporting works, but could use usability polish
| Documentation | | X | | | | Book and related are barebones and lack task-based lessons
| Paging | | X | | | | Textview has paging, but we'd like paging for tables
| Functions| X | | | | | No functions, yet, only aliases
| Variables| X | | | | | Nu doesn't yet support variables
| Completions | | X | | | | Completions are currently barebones, at best
| Type-checking | | X | | | | Commands check basic types, but input/output isn't checked
## Current Roadmap
We've added a `Roadmap Board` to help collaboratively capture the direction we're going for the current release, as well as some important issues we'd like to see in Nushell. You can find the Roadmap [here](https://github.com/nushell/nushell/projects/2).
## Contributing
See [Contributing](CONTRIBUTING.md) for details.
## License
The project is made available under the MIT license. See the `LICENSE` file for more information.

build.rs

@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}


@ -0,0 +1,16 @@
[package]
name = "nu-build"
version = "0.16.0"
authors = ["The Nu Project Contributors"]
edition = "2018"
description = "Core build system for nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
serde = { version = "1.0.114", features = ["derive"] }
lazy_static = "1.4.0"
serde_json = "1.0.55"
toml = "0.5.6"


@ -0,0 +1,80 @@
use lazy_static::lazy_static;
use serde::Deserialize;
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::collections::HashSet;
use std::env;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
lazy_static! {
static ref WORKSPACES: Mutex<BTreeMap<String, &'static Path>> = Mutex::new(BTreeMap::new());
}
// got from https://github.com/mitsuhiko/insta/blob/b113499249584cb650150d2d01ed96ee66db6b30/src/runtime.rs#L67-L88
fn get_cargo_workspace(manifest_dir: &str) -> Result<Option<&Path>, Box<dyn std::error::Error>> {
let mut workspaces = WORKSPACES.lock()?;
if let Some(rv) = workspaces.get(manifest_dir) {
Ok(Some(rv))
} else {
#[derive(Deserialize)]
struct Manifest {
workspace_root: String,
}
let output = std::process::Command::new(env!("CARGO"))
.arg("metadata")
.arg("--format-version=1")
.current_dir(manifest_dir)
.output()?;
let manifest: Manifest = serde_json::from_slice(&output.stdout)?;
let path = Box::leak(Box::new(PathBuf::from(manifest.workspace_root)));
workspaces.insert(manifest_dir.to_string(), path.as_path());
Ok(workspaces.get(manifest_dir).cloned())
}
}
#[derive(Deserialize)]
struct Feature {
#[allow(unused)]
description: String,
enabled: bool,
}
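// The `features.toml` file read below (from the workspace root) maps optional
// cfg flag names to entries of this shape. A sketch with a hypothetical flag:
//
//     [some_optional_flag]
//     description = "What this flag turns on"
//     enabled = false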
pub fn build() -> Result<(), Box<dyn std::error::Error>> {
let input = env::var("CARGO_MANIFEST_DIR")?;
let all_on = env::var("NUSHELL_ENABLE_ALL_FLAGS").is_ok();
let flags: HashSet<String> = env::var("NUSHELL_ENABLE_FLAGS")
.map(|s| s.split(',').map(|s| s.to_string()).collect())
.unwrap_or_else(|_| HashSet::new());
if all_on && !flags.is_empty() {
println!(
"cargo:warning=Both NUSHELL_ENABLE_ALL_FLAGS and NUSHELL_ENABLE_FLAGS were set. You don't need both."
);
}
let workspace = match get_cargo_workspace(&input)? {
// If the crate is being downloaded from crates.io, it won't have a workspace root, and that's ok
None => return Ok(()),
Some(workspace) => workspace,
};
let path = Path::new(&workspace).join("features.toml");
// If the crate is being downloaded from crates.io, it won't have a features.toml, and that's ok
if !path.exists() {
return Ok(());
}
let toml: HashMap<String, Feature> = toml::from_str(&std::fs::read_to_string(path)?)?;
for (key, value) in toml.iter() {
if value.enabled || all_on || flags.contains(key) {
println!("cargo:rustc-cfg={}", key);
}
}
Ok(())
}

crates/nu-cli/Cargo.toml

@ -0,0 +1,115 @@
[package]
name = "nu-cli"
version = "0.16.0"
authors = ["The Nu Project Contributors"]
description = "CLI for nushell"
edition = "2018"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-source = { version = "0.16.0", path = "../nu-source" }
nu-plugin = { version = "0.16.0", path = "../nu-plugin" }
nu-protocol = { version = "0.16.0", path = "../nu-protocol" }
nu-errors = { version = "0.16.0", path = "../nu-errors" }
nu-parser = { version = "0.16.0", path = "../nu-parser" }
nu-value-ext = { version = "0.16.0", path = "../nu-value-ext" }
nu-test-support = { version = "0.16.0", path = "../nu-test-support" }
nu-table = {version = "0.16.0", path = "../nu-table"}
ansi_term = "0.12.1"
app_dirs = "1.2.1"
async-recursion = "0.3.1"
async-trait = "0.1.36"
directories = "2.0.2"
base64 = "0.12.3"
bigdecimal = { version = "0.1.2", features = ["serde"] }
bson = { version = "0.14.1", features = ["decimal128"] }
byte-unit = "3.1.3"
bytes = "0.5.5"
calamine = "0.16"
cfg-if = "0.1"
chrono = { version = "0.4.11", features = ["serde"] }
clap = "2.33.1"
csv = "1.1"
ctrlc = "3.1.4"
derive-new = "0.5.8"
dirs = "2.0.2"
dunce = "1.0.1"
eml-parser = "0.1.0"
filesize = "0.2.0"
futures = { version = "0.3", features = ["compat", "io-compat"] }
futures-util = "0.3.5"
futures_codec = "0.4"
getset = "0.1.1"
git2 = { version = "0.13.6", default_features = false }
glob = "0.3.0"
hex = "0.4"
htmlescape = "0.3.1"
ical = "0.6.*"
ichwh = "0.3.4"
indexmap = { version = "1.4.0", features = ["serde-1"] }
itertools = "0.9.0"
codespan-reporting = "0.9.5"
log = "0.4.8"
meval = "0.2"
natural = "0.5.0"
num-bigint = { version = "0.2.6", features = ["serde"] }
num-traits = "0.2.11"
parking_lot = "0.11.0"
pin-utils = "0.1.0"
pretty-hex = "0.1.1"
pretty_env_logger = "0.4.0"
ptree = {version = "0.2" }
query_interface = "0.3.5"
rand = "0.7"
regex = "1"
roxmltree = "0.13.0"
rustyline = "6.2.0"
serde = { version = "1.0.114", features = ["derive"] }
serde-hjson = "0.9.1"
serde_bytes = "0.11.5"
serde_ini = "0.2.0"
serde_json = "1.0.55"
serde_urlencoded = "0.6.1"
serde_yaml = "0.8"
shellexpand = "2.0.0"
strip-ansi-escapes = "0.1.0"
tempfile = "3.1.0"
term = "0.5.2"
termcolor = "1.1.0"
term_size = "0.3.2"
toml = "0.5.6"
typetag = "0.1.5"
umask = "1.0.0"
unicode-xid = "0.2.1"
uuid_crate = { package = "uuid", version = "0.8.1", features = ["v4"] }
which = "4.0.1"
trash = { version = "1.0.1", optional = true }
clipboard = { version = "0.5", optional = true }
starship = "0.43.0"
rayon = "1.3.1"
encoding_rs = "0.8.23"
[target.'cfg(unix)'.dependencies]
users = "0.10.0"
[dependencies.rusqlite]
version = "0.23.1"
features = ["bundled", "blob"]
[build-dependencies]
nu-build = { version = "0.16.0", path = "../nu-build" }
[dev-dependencies]
quickcheck = "0.9"
quickcheck_macros = "0.9"
[features]
stable = []
# starship-prompt = ["starship"]
clipboard-cli = ["clipboard"]
trash-support = ["trash"]

crates/nu-cli/src/cli.rs
File diff suppressed because it is too large.


@ -0,0 +1,272 @@
#[macro_use]
pub(crate) mod macros;
mod from_delimited_data;
mod to_delimited_data;
pub(crate) mod alias;
pub(crate) mod ansi;
pub(crate) mod append;
pub(crate) mod args;
pub(crate) mod autoview;
pub(crate) mod build_string;
pub(crate) mod cal;
pub(crate) mod calc;
pub(crate) mod cd;
pub(crate) mod char_;
pub(crate) mod classified;
#[cfg(feature = "clipboard")]
pub(crate) mod clip;
pub(crate) mod command;
pub(crate) mod compact;
pub(crate) mod config;
pub(crate) mod count;
pub(crate) mod cp;
pub(crate) mod date;
pub(crate) mod debug;
pub(crate) mod default;
pub(crate) mod do_;
pub(crate) mod drop;
pub(crate) mod du;
pub(crate) mod each;
pub(crate) mod echo;
pub(crate) mod enter;
#[allow(unused)]
pub(crate) mod evaluate_by;
pub(crate) mod every;
pub(crate) mod exit;
pub(crate) mod first;
pub(crate) mod format;
pub(crate) mod from;
pub(crate) mod from_bson;
pub(crate) mod from_csv;
pub(crate) mod from_eml;
pub(crate) mod from_ics;
pub(crate) mod from_ini;
pub(crate) mod from_json;
pub(crate) mod from_ods;
pub(crate) mod from_sqlite;
pub(crate) mod from_ssv;
pub(crate) mod from_toml;
pub(crate) mod from_tsv;
pub(crate) mod from_url;
pub(crate) mod from_vcf;
pub(crate) mod from_xlsx;
pub(crate) mod from_xml;
pub(crate) mod from_yaml;
pub(crate) mod get;
pub(crate) mod group_by;
pub(crate) mod group_by_date;
pub(crate) mod headers;
pub(crate) mod help;
pub(crate) mod histogram;
pub(crate) mod history;
pub(crate) mod insert;
pub(crate) mod is_empty;
pub(crate) mod keep;
pub(crate) mod keep_until;
pub(crate) mod keep_while;
pub(crate) mod last;
pub(crate) mod lines;
pub(crate) mod ls;
#[allow(unused)]
pub(crate) mod map_max_by;
pub(crate) mod math;
pub(crate) mod merge;
pub(crate) mod mkdir;
pub(crate) mod mv;
pub(crate) mod next;
pub(crate) mod nth;
pub(crate) mod open;
pub(crate) mod parse;
pub(crate) mod pivot;
pub(crate) mod plugin;
pub(crate) mod prepend;
pub(crate) mod prev;
pub(crate) mod pwd;
pub(crate) mod random;
pub(crate) mod range;
#[allow(unused)]
pub(crate) mod reduce_by;
pub(crate) mod reject;
pub(crate) mod rename;
pub(crate) mod reverse;
pub(crate) mod rm;
pub(crate) mod run_alias;
pub(crate) mod run_external;
pub(crate) mod save;
pub(crate) mod select;
pub(crate) mod shells;
pub(crate) mod shuffle;
pub(crate) mod size;
pub(crate) mod skip;
pub(crate) mod skip_until;
pub(crate) mod skip_while;
pub(crate) mod sort_by;
pub(crate) mod split;
pub(crate) mod split_by;
pub(crate) mod str_;
#[allow(unused)]
pub(crate) mod t_sort_by;
pub(crate) mod table;
pub(crate) mod tags;
pub(crate) mod to;
pub(crate) mod to_bson;
pub(crate) mod to_csv;
pub(crate) mod to_html;
pub(crate) mod to_json;
pub(crate) mod to_md;
pub(crate) mod to_sqlite;
pub(crate) mod to_toml;
pub(crate) mod to_tsv;
pub(crate) mod to_url;
pub(crate) mod to_yaml;
pub(crate) mod trim;
pub(crate) mod uniq;
pub(crate) mod update;
pub(crate) mod version;
pub(crate) mod what;
pub(crate) mod where_;
pub(crate) mod which_;
pub(crate) mod with_env;
pub(crate) mod wrap;
pub(crate) use autoview::Autoview;
pub(crate) use cd::Cd;
pub(crate) use command::{
whole_stream_command, Command, Example, UnevaluatedCallInfo, WholeStreamCommand,
};
pub(crate) use alias::Alias;
pub(crate) use ansi::Ansi;
pub(crate) use append::Append;
pub(crate) use build_string::BuildString;
pub(crate) use cal::Cal;
pub(crate) use calc::Calc;
pub(crate) use char_::Char;
pub(crate) use compact::Compact;
pub(crate) use config::Config;
pub(crate) use count::Count;
pub(crate) use cp::Cpy;
pub(crate) use date::Date;
pub(crate) use debug::Debug;
pub(crate) use default::Default;
pub(crate) use do_::Do;
pub(crate) use drop::Drop;
pub(crate) use du::Du;
pub(crate) use each::Each;
pub(crate) use echo::Echo;
pub(crate) use is_empty::IsEmpty;
pub(crate) use update::Update;
pub(crate) mod kill;
pub(crate) use kill::Kill;
pub(crate) mod clear;
pub(crate) use clear::Clear;
pub(crate) mod touch;
pub(crate) use enter::Enter;
#[allow(unused_imports)]
pub(crate) use evaluate_by::EvaluateBy;
pub(crate) use every::Every;
pub(crate) use exit::Exit;
pub(crate) use first::First;
pub(crate) use format::Format;
pub(crate) use from::From;
pub(crate) use from_bson::FromBSON;
pub(crate) use from_csv::FromCSV;
pub(crate) use from_eml::FromEML;
pub(crate) use from_ics::FromIcs;
pub(crate) use from_ini::FromINI;
pub(crate) use from_json::FromJSON;
pub(crate) use from_ods::FromODS;
pub(crate) use from_sqlite::FromDB;
pub(crate) use from_sqlite::FromSQLite;
pub(crate) use from_ssv::FromSSV;
pub(crate) use from_toml::FromTOML;
pub(crate) use from_tsv::FromTSV;
pub(crate) use from_url::FromURL;
pub(crate) use from_vcf::FromVcf;
pub(crate) use from_xlsx::FromXLSX;
pub(crate) use from_xml::FromXML;
pub(crate) use from_yaml::FromYAML;
pub(crate) use from_yaml::FromYML;
pub(crate) use get::Get;
pub(crate) use group_by::GroupBy;
pub(crate) use group_by_date::GroupByDate;
pub(crate) use headers::Headers;
pub(crate) use help::Help;
pub(crate) use histogram::Histogram;
pub(crate) use history::History;
pub(crate) use insert::Insert;
pub(crate) use keep::Keep;
pub(crate) use keep_until::KeepUntil;
pub(crate) use keep_while::KeepWhile;
pub(crate) use last::Last;
pub(crate) use lines::Lines;
pub(crate) use ls::Ls;
#[allow(unused_imports)]
pub(crate) use map_max_by::MapMaxBy;
pub(crate) use math::{
Math, MathAverage, MathMaximum, MathMedian, MathMinimum, MathMode, MathSummation,
};
pub(crate) use merge::Merge;
pub(crate) use mkdir::Mkdir;
pub(crate) use mv::Move;
pub(crate) use next::Next;
pub(crate) use nth::Nth;
pub(crate) use open::Open;
pub(crate) use parse::Parse;
pub(crate) use pivot::Pivot;
pub(crate) use prepend::Prepend;
pub(crate) use prev::Previous;
pub(crate) use pwd::Pwd;
pub(crate) use random::{Random, RandomBool, RandomDice, RandomUUID};
pub(crate) use range::Range;
#[allow(unused_imports)]
pub(crate) use reduce_by::ReduceBy;
pub(crate) use reject::Reject;
pub(crate) use rename::Rename;
pub(crate) use reverse::Reverse;
pub(crate) use rm::Remove;
pub(crate) use run_external::RunExternalCommand;
pub(crate) use save::Save;
pub(crate) use select::Select;
pub(crate) use shells::Shells;
pub(crate) use shuffle::Shuffle;
pub(crate) use size::Size;
pub(crate) use skip::Skip;
pub(crate) use skip_until::SkipUntil;
pub(crate) use skip_while::SkipWhile;
pub(crate) use sort_by::SortBy;
pub(crate) use split::Split;
pub(crate) use split::SplitColumn;
pub(crate) use split::SplitRow;
pub(crate) use split_by::SplitBy;
pub(crate) use str_::{
Str, StrCapitalize, StrCollect, StrDowncase, StrFindReplace, StrSet, StrSubstring,
StrToDatetime, StrToDecimal, StrToInteger, StrTrim, StrUpcase,
};
#[allow(unused_imports)]
pub(crate) use t_sort_by::TSortBy;
pub(crate) use table::Table;
pub(crate) use tags::Tags;
pub(crate) use to::To;
pub(crate) use to_bson::ToBSON;
pub(crate) use to_csv::ToCSV;
pub(crate) use to_html::ToHTML;
pub(crate) use to_json::ToJSON;
pub(crate) use to_md::ToMarkdown;
pub(crate) use to_sqlite::ToDB;
pub(crate) use to_sqlite::ToSQLite;
pub(crate) use to_toml::ToTOML;
pub(crate) use to_tsv::ToTSV;
pub(crate) use to_url::ToURL;
pub(crate) use to_yaml::ToYAML;
pub(crate) use touch::Touch;
pub(crate) use trim::Trim;
pub(crate) use uniq::Uniq;
pub(crate) use version::Version;
pub(crate) use what::What;
pub(crate) use where_::Where;
pub(crate) use which_::Which;
pub(crate) use with_env::WithEnv;
pub(crate) use wrap::Wrap;


@ -0,0 +1,151 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::data::config;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
hir::Block, CommandAction, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
use nu_source::Tagged;
pub struct Alias;
#[derive(Deserialize)]
pub struct AliasArgs {
pub name: Tagged<String>,
pub args: Vec<Value>,
pub block: Block,
pub save: Option<bool>,
}
#[async_trait]
impl WholeStreamCommand for Alias {
fn name(&self) -> &str {
"alias"
}
fn signature(&self) -> Signature {
Signature::build("alias")
.required("name", SyntaxShape::String, "the name of the alias")
.required("args", SyntaxShape::Table, "the arguments to the alias")
.required(
"block",
SyntaxShape::Block,
"the block to run as the body of the alias",
)
.switch("save", "save the alias to your config", Some('s'))
}
fn usage(&self) -> &str {
"Define a shortcut for another command."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
alias(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "An alias without parameters",
example: "alias say-hi [] { echo 'Hello!' }",
result: None,
},
Example {
description: "An alias with a single parameter",
example: "alias l [x] { ls $x }",
result: None,
},
]
}
}
pub async fn alias(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let mut raw_input = args.raw_input.clone();
let (
AliasArgs {
name,
args: list,
block,
save,
},
_ctx,
) = args.process(&registry).await?;
let mut processed_args: Vec<String> = vec![];
if let Some(true) = save {
let mut result = crate::data::config::read(name.clone().tag, &None)?;
// process the alias to remove the --save flag
let left_brace = raw_input.find('{').unwrap_or(0);
let right_brace = raw_input.rfind('}').unwrap_or_else(|| raw_input.len());
let left = raw_input[..left_brace]
.replace("--save", "")
.replace("-s", "");
let right = raw_input[right_brace..]
.replace("--save", "")
.replace("-s", "");
raw_input = format!("{}{}{}", left, &raw_input[left_brace..right_brace], right);
// create a value from raw_input alias
let alias: Value = raw_input.trim().to_string().into();
let alias_start = raw_input.find('[').unwrap_or(0); // used to check if the same alias already exists
// add to startup if alias doesn't exist and replace if it does
match result.get_mut("startup") {
Some(startup) => {
if let UntaggedValue::Table(ref mut commands) = startup.value {
if let Some(command) = commands.iter_mut().find(|command| {
let cmd_str = command.as_string().unwrap_or_default();
cmd_str.starts_with(&raw_input[..alias_start])
}) {
*command = alias;
} else {
commands.push(alias);
}
}
}
None => {
let table = UntaggedValue::table(&[alias]);
result.insert("startup".to_string(), table.into_value(Tag::default()));
}
}
config::write(&result, &None)?;
}
for item in list.iter() {
if let Ok(string) = item.as_string() {
processed_args.push(format!("${}", string));
} else {
return Err(ShellError::labeled_error(
"Expected a string",
"expected a string",
item.tag(),
));
}
}
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::AddAlias(name.to_string(), processed_args, block),
)))
}
#[cfg(test)]
mod tests {
use super::Alias;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Alias {})
}
}


@ -0,0 +1,140 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
pub struct Ansi;
#[derive(Deserialize)]
struct AnsiArgs {
color: Value,
}
#[async_trait]
impl WholeStreamCommand for Ansi {
fn name(&self) -> &str {
"ansi"
}
fn signature(&self) -> Signature {
Signature::build("ansi").required(
"color",
SyntaxShape::Any,
"the name of the color to use or 'reset' to reset the color",
)
}
fn usage(&self) -> &str {
"Output ANSI codes to change color"
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Change color to green",
example: r#"ansi green"#,
result: Some(vec![Value::from("\u{1b}[32m")]),
},
Example {
description: "Reset the color",
example: r#"ansi reset"#,
result: Some(vec![Value::from("\u{1b}[0m")]),
},
]
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let (AnsiArgs { color }, _) = args.process(&registry).await?;
let color_string = color.as_string()?;
let ansi_code = str_to_ansi_color(color_string);
if let Some(output) = ansi_code {
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(output).into_value(color.tag()),
)))
} else {
Err(ShellError::labeled_error(
"Unknown color",
"unknown color",
color.tag(),
))
}
}
}
fn str_to_ansi_color(s: String) -> Option<String> {
match s.as_str() {
"g" | "green" => Some(ansi_term::Color::Green.prefix().to_string()),
"gb" | "green_bold" => Some(ansi_term::Color::Green.bold().prefix().to_string()),
"gu" | "green_underline" => Some(ansi_term::Color::Green.underline().prefix().to_string()),
"gi" | "green_italic" => Some(ansi_term::Color::Green.italic().prefix().to_string()),
"gd" | "green_dimmed" => Some(ansi_term::Color::Green.dimmed().prefix().to_string()),
"gr" | "green_reverse" => Some(ansi_term::Color::Green.reverse().prefix().to_string()),
"r" | "red" => Some(ansi_term::Color::Red.prefix().to_string()),
"rb" | "red_bold" => Some(ansi_term::Color::Red.bold().prefix().to_string()),
"ru" | "red_underline" => Some(ansi_term::Color::Red.underline().prefix().to_string()),
"ri" | "red_italic" => Some(ansi_term::Color::Red.italic().prefix().to_string()),
"rd" | "red_dimmed" => Some(ansi_term::Color::Red.dimmed().prefix().to_string()),
"rr" | "red_reverse" => Some(ansi_term::Color::Red.reverse().prefix().to_string()),
"u" | "blue" => Some(ansi_term::Color::Blue.prefix().to_string()),
"ub" | "blue_bold" => Some(ansi_term::Color::Blue.bold().prefix().to_string()),
"uu" | "blue_underline" => Some(ansi_term::Color::Blue.underline().prefix().to_string()),
"ui" | "blue_italic" => Some(ansi_term::Color::Blue.italic().prefix().to_string()),
"ud" | "blue_dimmed" => Some(ansi_term::Color::Blue.dimmed().prefix().to_string()),
"ur" | "blue_reverse" => Some(ansi_term::Color::Blue.reverse().prefix().to_string()),
"b" | "black" => Some(ansi_term::Color::Black.prefix().to_string()),
"bb" | "black_bold" => Some(ansi_term::Color::Black.bold().prefix().to_string()),
"bu" | "black_underline" => Some(ansi_term::Color::Black.underline().prefix().to_string()),
"bi" | "black_italic" => Some(ansi_term::Color::Black.italic().prefix().to_string()),
"bd" | "black_dimmed" => Some(ansi_term::Color::Black.dimmed().prefix().to_string()),
"br" | "black_reverse" => Some(ansi_term::Color::Black.reverse().prefix().to_string()),
"y" | "yellow" => Some(ansi_term::Color::Yellow.prefix().to_string()),
"yb" | "yellow_bold" => Some(ansi_term::Color::Yellow.bold().prefix().to_string()),
"yu" | "yellow_underline" => {
Some(ansi_term::Color::Yellow.underline().prefix().to_string())
}
"yi" | "yellow_italic" => Some(ansi_term::Color::Yellow.italic().prefix().to_string()),
"yd" | "yellow_dimmed" => Some(ansi_term::Color::Yellow.dimmed().prefix().to_string()),
"yr" | "yellow_reverse" => Some(ansi_term::Color::Yellow.reverse().prefix().to_string()),
"p" | "purple" => Some(ansi_term::Color::Purple.prefix().to_string()),
"pb" | "purple_bold" => Some(ansi_term::Color::Purple.bold().prefix().to_string()),
"pu" | "purple_underline" => {
Some(ansi_term::Color::Purple.underline().prefix().to_string())
}
"pi" | "purple_italic" => Some(ansi_term::Color::Purple.italic().prefix().to_string()),
"pd" | "purple_dimmed" => Some(ansi_term::Color::Purple.dimmed().prefix().to_string()),
"pr" | "purple_reverse" => Some(ansi_term::Color::Purple.reverse().prefix().to_string()),
"c" | "cyan" => Some(ansi_term::Color::Cyan.prefix().to_string()),
"cb" | "cyan_bold" => Some(ansi_term::Color::Cyan.bold().prefix().to_string()),
"cu" | "cyan_underline" => Some(ansi_term::Color::Cyan.underline().prefix().to_string()),
"ci" | "cyan_italic" => Some(ansi_term::Color::Cyan.italic().prefix().to_string()),
"cd" | "cyan_dimmed" => Some(ansi_term::Color::Cyan.dimmed().prefix().to_string()),
"cr" | "cyan_reverse" => Some(ansi_term::Color::Cyan.reverse().prefix().to_string()),
"w" | "white" => Some(ansi_term::Color::White.prefix().to_string()),
"wb" | "white_bold" => Some(ansi_term::Color::White.bold().prefix().to_string()),
"wu" | "white_underline" => Some(ansi_term::Color::White.underline().prefix().to_string()),
"wi" | "white_italic" => Some(ansi_term::Color::White.italic().prefix().to_string()),
"wd" | "white_dimmed" => Some(ansi_term::Color::White.dimmed().prefix().to_string()),
"wr" | "white_reverse" => Some(ansi_term::Color::White.reverse().prefix().to_string()),
"reset" => Some("\x1b[0m".to_owned()),
_ => None,
}
}
#[cfg(test)]
mod tests {
use super::Ansi;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Ansi {})
}
}


@ -0,0 +1,68 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
#[derive(Deserialize)]
struct AppendArgs {
row: Value,
}
pub struct Append;
#[async_trait]
impl WholeStreamCommand for Append {
fn name(&self) -> &str {
"append"
}
fn signature(&self) -> Signature {
Signature::build("append").required(
"row value",
SyntaxShape::Any,
"the value of the row to append to the table",
)
}
fn usage(&self) -> &str {
"Append the given row to the table"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let (AppendArgs { row }, input) = args.process(registry).await?;
let eos = futures::stream::iter(vec![row]);
Ok(input.chain(eos).to_output_stream())
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Add something to the end of a list or table",
example: "echo [1 2 3] | append 4",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
UntaggedValue::int(3).into(),
UntaggedValue::int(4).into(),
]),
}]
}
}
#[cfg(test)]
mod tests {
use super::Append;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Append {})
}
}


@ -1,4 +1,4 @@
use nu_protocol::Value;
#[derive(Debug)]
pub enum LogLevel {}


@ -0,0 +1,349 @@
use crate::commands::UnevaluatedCallInfo;
use crate::commands::WholeStreamCommand;
use crate::data::value::format_leaf;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{hir, hir::Expression, hir::Literal, hir::SpannedExpression};
use nu_protocol::{Primitive, Scope, Signature, UntaggedValue, Value};
use parking_lot::Mutex;
use std::sync::atomic::AtomicBool;
pub struct Autoview;
#[async_trait]
impl WholeStreamCommand for Autoview {
fn name(&self) -> &str {
"autoview"
}
fn signature(&self) -> Signature {
Signature::build("autoview")
}
fn usage(&self) -> &str {
"View the contents of the pipeline as a table or list."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
autoview(RunnableContext {
input: args.input,
registry: registry.clone(),
shell_manager: args.shell_manager,
host: args.host,
ctrl_c: args.ctrl_c,
current_errors: args.current_errors,
name: args.call_info.name_tag,
raw_input: args.raw_input,
})
.await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Automatically view the results",
example: "ls | autoview",
result: None,
},
Example {
description: "Autoview is also implied. The above can be written as",
example: "ls",
result: None,
},
]
}
}
pub struct RunnableContextWithoutInput {
pub shell_manager: ShellManager,
pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub ctrl_c: Arc<AtomicBool>,
pub registry: CommandRegistry,
pub name: Tag,
}
impl RunnableContextWithoutInput {
pub fn convert(context: RunnableContext) -> (InputStream, RunnableContextWithoutInput) {
let new_context = RunnableContextWithoutInput {
shell_manager: context.shell_manager,
host: context.host,
ctrl_c: context.ctrl_c,
current_errors: context.current_errors,
registry: context.registry,
name: context.name,
};
(context.input, new_context)
}
}
pub async fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
let binary = context.get_command("binaryview");
let text = context.get_command("textview");
let table = context.get_command("table");
#[derive(PartialEq)]
enum AutoPivotMode {
Auto,
Always,
Never,
}
let pivot_mode = crate::data::config::config(Tag::unknown());
let pivot_mode = if let Some(v) = pivot_mode?.get("pivot_mode") {
match v.as_string() {
Ok(m) if m.to_lowercase() == "auto" => AutoPivotMode::Auto,
Ok(m) if m.to_lowercase() == "always" => AutoPivotMode::Always,
Ok(m) if m.to_lowercase() == "never" => AutoPivotMode::Never,
_ => AutoPivotMode::Always,
}
} else {
AutoPivotMode::Always
};
let (mut input_stream, context) = RunnableContextWithoutInput::convert(context);
let term_width = context.host.lock().width();
if let Some(x) = input_stream.next().await {
match input_stream.next().await {
Some(y) => {
let ctrl_c = context.ctrl_c.clone();
let xy = vec![x, y];
let xy_stream = futures::stream::iter(xy)
.chain(input_stream)
.interruptible(ctrl_c);
let stream = InputStream::from_stream(xy_stream);
if let Some(table) = table {
let command_args = create_default_command_args(&context).with_input(stream);
let result = table.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
}
}
_ => {
match x {
Value {
value: UntaggedValue::Primitive(Primitive::String(ref s)),
tag: Tag { anchor, span },
} if anchor.is_some() => {
if let Some(text) = text {
let mut stream = VecDeque::new();
stream.push_back(
UntaggedValue::string(s).into_value(Tag { anchor, span }),
);
let command_args =
create_default_command_args(&context).with_input(stream);
let result = text.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
out!("{}", s);
}
}
Value {
value: UntaggedValue::Primitive(Primitive::String(s)),
..
} => {
out!("{}", s);
}
Value {
value: UntaggedValue::Primitive(Primitive::Line(ref s)),
tag: Tag { anchor, span },
} if anchor.is_some() => {
if let Some(text) = text {
let mut stream = VecDeque::new();
stream.push_back(
UntaggedValue::string(s).into_value(Tag { anchor, span }),
);
let command_args =
create_default_command_args(&context).with_input(stream);
let result = text.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
out!("{}\n", s);
}
}
Value {
value: UntaggedValue::Primitive(Primitive::Line(s)),
..
} => {
out!("{}\n", s);
}
Value {
value: UntaggedValue::Primitive(Primitive::Path(s)),
..
} => {
out!("{}", s.display());
}
Value {
value: UntaggedValue::Primitive(Primitive::Int(n)),
..
} => {
out!("{}", n);
}
Value {
value: UntaggedValue::Primitive(Primitive::Decimal(n)),
..
} => {
// TODO: normalize decimal to remove trailing zeros.
// normalization will be available in next release of bigdecimal crate
let mut output = n.to_string();
if output.contains('.') {
output = output.trim_end_matches('0').to_owned();
}
if output.ends_with('.') {
output.push('0');
}
out!("{}", output);
}
Value {
value: UntaggedValue::Primitive(Primitive::Boolean(b)),
..
} => {
out!("{}", b);
}
Value {
value: UntaggedValue::Primitive(Primitive::Duration(_)),
..
} => {
let output = format_leaf(&x).plain_string(100_000);
out!("{}", output);
}
Value {
value: UntaggedValue::Primitive(Primitive::Date(d)),
..
} => {
out!("{}", d);
}
Value {
value: UntaggedValue::Primitive(Primitive::Range(_)),
..
} => {
let output = format_leaf(&x).plain_string(100_000);
out!("{}", output);
}
Value {
value: UntaggedValue::Primitive(Primitive::Binary(ref b)),
..
} => {
if let Some(binary) = binary {
let mut stream = VecDeque::new();
stream.push_back(x);
let command_args =
create_default_command_args(&context).with_input(stream);
let result = binary.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
use pretty_hex::*;
out!("{:?}", b.hex_dump());
}
}
Value {
value: UntaggedValue::Error(e),
..
} => {
return Err(e);
}
Value {
value: UntaggedValue::Row(row),
..
} if pivot_mode == AutoPivotMode::Always
|| (pivot_mode == AutoPivotMode::Auto
&& (row
.entries
.iter()
.map(|(_, v)| v.convert_to_string())
.collect::<Vec<_>>()
.iter()
.fold(0usize, |acc, len| acc + len.len())
+ row.entries.iter().count() * 2)
> term_width) =>
{
let mut entries = vec![];
for (key, value) in row.entries.iter() {
entries.push(vec![
nu_table::StyledString::new(
key.to_string(),
nu_table::TextStyle {
alignment: nu_table::Alignment::Left,
color: Some(ansi_term::Color::Green),
is_bold: true,
},
),
nu_table::StyledString::new(
format_leaf(value).plain_string(100_000),
nu_table::TextStyle::basic(),
),
]);
}
let table =
nu_table::Table::new(vec![], entries, nu_table::Theme::compact());
nu_table::draw_table(&table, term_width);
}
Value {
value: ref item, ..
} => {
if let Some(table) = table {
let mut stream = VecDeque::new();
stream.push_back(x);
let command_args =
create_default_command_args(&context).with_input(stream);
let result = table.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
out!("{:?}", item);
}
}
}
}
}
}
Ok(OutputStream::empty())
}
fn create_default_command_args(context: &RunnableContextWithoutInput) -> RawCommandArgs {
let span = context.name.span;
RawCommandArgs {
host: context.host.clone(),
ctrl_c: context.ctrl_c.clone(),
current_errors: context.current_errors.clone(),
shell_manager: context.shell_manager.clone(),
call_info: UnevaluatedCallInfo {
args: hir::Call {
head: Box::new(SpannedExpression::new(
Expression::Literal(Literal::String(String::new())),
span,
)),
positional: None,
named: None,
span,
is_last: true,
},
name_tag: context.name.clone(),
scope: Scope::new(),
},
}
}
#[cfg(test)]
mod tests {
use super::Autoview;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Autoview {})
}
}


@ -0,0 +1,56 @@
use crate::prelude::*;
use nu_errors::ShellError;
use crate::commands::WholeStreamCommand;
use crate::data::value::format_leaf;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
#[derive(Deserialize)]
pub struct BuildStringArgs {
rest: Vec<Value>,
}
pub struct BuildString;
#[async_trait]
impl WholeStreamCommand for BuildString {
fn name(&self) -> &str {
"build-string"
}
fn signature(&self) -> Signature {
Signature::build("build-string")
.rest(SyntaxShape::Any, "all values to form into the string")
}
fn usage(&self) -> &str {
"Builds a string from the arguments"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let (BuildStringArgs { rest }, _) = args.process(&registry).await?;
let mut output_string = String::new();
for r in rest {
output_string.push_str(&format_leaf(&r).plain_string(100_000))
}
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(output_string).into_value(tag),
)))
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Builds a string from a string and a number, without spaces between them",
example: "build-string 'foo' 3",
result: None,
}]
}
}


@ -0,0 +1,352 @@
use crate::commands::{command::EvaluatedWholeStreamCommandArgs, WholeStreamCommand};
use crate::prelude::*;
use chrono::{Datelike, Local, NaiveDate};
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{Dictionary, Signature, SyntaxShape, UntaggedValue, Value};
pub struct Cal;
#[async_trait]
impl WholeStreamCommand for Cal {
fn name(&self) -> &str {
"cal"
}
fn signature(&self) -> Signature {
Signature::build("cal")
.switch("year", "Display the year column", Some('y'))
.switch("quarter", "Display the quarter column", Some('q'))
.switch("month", "Display the month column", Some('m'))
.named(
"full-year",
SyntaxShape::Int,
"Display a year-long calendar for the specified year",
None,
)
.named(
"week-start",
SyntaxShape::String,
"Display the calendar with the specified day as the first day of the week",
None,
)
.switch(
"month-names",
"Display the month names instead of integers",
None,
)
}
fn usage(&self) -> &str {
"Display a calendar."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
cal(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "This month's calendar",
example: "cal",
result: None,
},
Example {
description: "The calendar for all of 2012",
example: "cal --full-year 2012",
result: None,
},
Example {
description: "This month's calendar with the week starting on monday",
example: "cal --week-start monday",
result: None,
},
]
}
}
pub async fn cal(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let mut calendar_vec_deque = VecDeque::new();
let tag = args.call_info.name_tag.clone();
let (current_year, current_month, current_day) = get_current_date();
let mut selected_year: i32 = current_year;
let mut current_day_option: Option<u32> = Some(current_day);
let month_range = if let Some(full_year_value) = args.get("full-year") {
if let Ok(year_u64) = full_year_value.as_u64() {
selected_year = year_u64 as i32;
if selected_year != current_year {
current_day_option = None
}
} else {
return Err(get_invalid_year_shell_error(&full_year_value.tag()));
}
(1, 12)
} else {
(current_month, current_month)
};
let add_months_of_year_to_table_result = add_months_of_year_to_table(
&args,
&mut calendar_vec_deque,
&tag,
selected_year,
month_range,
current_month,
current_day_option,
);
match add_months_of_year_to_table_result {
Ok(()) => Ok(futures::stream::iter(calendar_vec_deque).to_output_stream()),
Err(error) => Err(error),
}
}
fn get_invalid_year_shell_error(year_tag: &Tag) -> ShellError {
ShellError::labeled_error("The year is invalid", "invalid year", year_tag)
}
struct MonthHelper {
selected_year: i32,
selected_month: u32,
day_number_of_week_month_starts_on: u32,
number_of_days_in_month: u32,
quarter_number: u32,
month_name: String,
}
impl MonthHelper {
pub fn new(selected_year: i32, selected_month: u32) -> Result<MonthHelper, ()> {
let naive_date = NaiveDate::from_ymd_opt(selected_year, selected_month, 1).ok_or(())?;
let number_of_days_in_month =
MonthHelper::calculate_number_of_days_in_month(selected_year, selected_month)?;
Ok(MonthHelper {
selected_year,
selected_month,
day_number_of_week_month_starts_on: naive_date.weekday().num_days_from_sunday(),
number_of_days_in_month,
quarter_number: ((selected_month - 1) / 3) + 1,
month_name: naive_date.format("%B").to_string().to_ascii_lowercase(),
})
}
fn calculate_number_of_days_in_month(
mut selected_year: i32,
mut selected_month: u32,
) -> Result<u32, ()> {
// Chrono does not provide a method to output the amount of days in a month
// This is a workaround taken from the example code from the Chrono docs here:
// https://docs.rs/chrono/0.3.0/chrono/naive/date/struct.NaiveDate.html#example-30
if selected_month == 12 {
selected_year += 1;
selected_month = 1;
} else {
selected_month += 1;
};
let next_month_naive_date =
NaiveDate::from_ymd_opt(selected_year, selected_month, 1).ok_or(())?;
Ok(next_month_naive_date.pred().day())
}
}
fn get_current_date() -> (i32, u32, u32) {
let local_now_date = Local::now().date();
let current_year: i32 = local_now_date.year();
let current_month: u32 = local_now_date.month();
let current_day: u32 = local_now_date.day();
(current_year, current_month, current_day)
}
fn add_months_of_year_to_table(
args: &EvaluatedWholeStreamCommandArgs,
mut calendar_vec_deque: &mut VecDeque<Value>,
tag: &Tag,
selected_year: i32,
(start_month, end_month): (u32, u32),
current_month: u32,
current_day_option: Option<u32>,
) -> Result<(), ShellError> {
for month_number in start_month..=end_month {
let mut new_current_day_option: Option<u32> = None;
if let Some(current_day) = current_day_option {
if month_number == current_month {
new_current_day_option = Some(current_day)
}
}
let add_month_to_table_result = add_month_to_table(
&args,
&mut calendar_vec_deque,
&tag,
selected_year,
month_number,
new_current_day_option,
);
add_month_to_table_result?
}
Ok(())
}
fn add_month_to_table(
args: &EvaluatedWholeStreamCommandArgs,
calendar_vec_deque: &mut VecDeque<Value>,
tag: &Tag,
selected_year: i32,
current_month: u32,
current_day_option: Option<u32>,
) -> Result<(), ShellError> {
let month_helper_result = MonthHelper::new(selected_year, current_month);
let month_helper = match month_helper_result {
Ok(month_helper) => month_helper,
Err(()) => match args.get("full-year") {
Some(full_year_value) => {
return Err(get_invalid_year_shell_error(&full_year_value.tag()))
}
None => {
return Err(ShellError::labeled_error(
"Issue parsing command",
"invalid command",
tag,
))
}
},
};
let mut days_of_the_week = [
"sunday",
"monday",
"tuesday",
"wednesday",
"thursday",
"friday",
"saturday",
];
let mut week_start_day = days_of_the_week[0].to_string();
if let Some(week_start_value) = args.get("week-start") {
if let Ok(day) = week_start_value.as_string() {
if days_of_the_week.contains(&day.as_str()) {
week_start_day = day;
} else {
return Err(ShellError::labeled_error(
"The specified week start day is invalid",
"invalid week start day",
week_start_value.tag(),
));
}
}
}
let week_start_day_offset = days_of_the_week.len()
- days_of_the_week
.iter()
.position(|day| *day == week_start_day)
.unwrap_or(0);
days_of_the_week.rotate_right(week_start_day_offset);
let mut total_start_offset: u32 =
month_helper.day_number_of_week_month_starts_on + week_start_day_offset as u32;
total_start_offset %= days_of_the_week.len() as u32;
let mut day_number: u32 = 1;
let day_limit: u32 = total_start_offset + month_helper.number_of_days_in_month;
let should_show_year_column = args.has("year");
let should_show_quarter_column = args.has("quarter");
let should_show_month_column = args.has("month");
let should_show_month_names = args.has("month-names");
while day_number <= day_limit {
let mut indexmap = IndexMap::new();
if should_show_year_column {
indexmap.insert(
"year".to_string(),
UntaggedValue::int(month_helper.selected_year).into_value(tag),
);
}
if should_show_quarter_column {
indexmap.insert(
"quarter".to_string(),
UntaggedValue::int(month_helper.quarter_number).into_value(tag),
);
}
if should_show_month_column || should_show_month_names {
let month_value = if should_show_month_names {
UntaggedValue::string(month_helper.month_name.clone()).into_value(tag)
} else {
UntaggedValue::int(month_helper.selected_month).into_value(tag)
};
indexmap.insert("month".to_string(), month_value);
}
for day in &days_of_the_week {
let should_add_day_number_to_table =
(day_number > total_start_offset) && (day_number <= day_limit);
let mut value = UntaggedValue::nothing().into_value(tag);
if should_add_day_number_to_table {
let adjusted_day_number = day_number - total_start_offset;
value = UntaggedValue::int(adjusted_day_number).into_value(tag);
if let Some(current_day) = current_day_option {
if current_day == adjusted_day_number {
// TODO: Update the value here with a color when color support is added
// This colors the current day
}
}
}
indexmap.insert((*day).to_string(), value);
day_number += 1;
}
calendar_vec_deque
.push_back(UntaggedValue::Row(Dictionary::from(indexmap)).into_value(tag));
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::Cal;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Cal {})
}
}


@ -0,0 +1,88 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, UntaggedValue, Value};
pub struct Calc;
#[async_trait]
impl WholeStreamCommand for Calc {
fn name(&self) -> &str {
"calc"
}
fn usage(&self) -> &str {
"Parse a math expression into a number"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
calc(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Calculate math in the pipeline",
example: "echo '10 / 4' | calc",
result: Some(vec![UntaggedValue::decimal(2.5).into()]),
}]
}
}
pub async fn calc(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let input = args.input;
let name = args.call_info.name_tag.span;
Ok(input
.map(move |input| {
if let Ok(string) = input.as_string() {
match parse(&string, &input.tag) {
Ok(value) => ReturnSuccess::value(value),
Err(err) => Err(ShellError::labeled_error(
"Calculation error",
err,
&input.tag.span,
)),
}
} else {
Err(ShellError::labeled_error(
"Expected a string from pipeline",
"requires string input",
name,
))
}
})
.to_output_stream())
}
pub fn parse(math_expression: &str, tag: impl Into<Tag>) -> Result<Value, String> {
use std::f64;
let num = meval::eval_str(math_expression);
match num {
Ok(num) => {
if num == f64::INFINITY || num == f64::NEG_INFINITY {
return Err(String::from("cannot represent result"));
}
Ok(UntaggedValue::from(Primitive::from(num)).into_value(tag))
}
Err(error) => Err(error.to_string()),
}
}
#[cfg(test)]
mod tests {
use super::Calc;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Calc {})
}
}
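
A standalone sketch (not part of the diff) of the evaluation path used by calc above, assuming the meval crate is available as a dependency; the eval helper name is invented for illustration:

// Evaluate an expression with meval and reject infinite results, as parse() does above.
fn eval(expr: &str) -> Result<f64, String> {
    match meval::eval_str(expr) {
        Ok(n) if n.is_infinite() => Err(String::from("cannot represent result")),
        Ok(n) => Ok(n),
        Err(e) => Err(e.to_string()),
    }
}

fn main() {
    assert_eq!(eval("10 / 4"), Ok(2.5)); // matches the command's example
    assert!(eval("10 / 0").is_err());    // division by zero evaluates to infinity
}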


@ -0,0 +1,82 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use std::path::PathBuf;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape};
use nu_source::Tagged;
#[derive(Deserialize)]
pub struct CdArgs {
pub(crate) path: Option<Tagged<PathBuf>>,
}
pub struct Cd;
#[async_trait]
impl WholeStreamCommand for Cd {
fn name(&self) -> &str {
"cd"
}
fn signature(&self) -> Signature {
Signature::build("cd").optional(
"directory",
SyntaxShape::Path,
"the directory to change to",
)
}
fn usage(&self) -> &str {
"Change to a new path."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let shell_manager = args.shell_manager.clone();
let (args, _): (CdArgs, _) = args.process(&registry).await?;
shell_manager.cd(args, name)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Change to a new directory called 'dirname'",
example: "cd dirname",
result: None,
},
Example {
description: "Change to your home directory",
example: "cd",
result: None,
},
Example {
description: "Change to your home directory (alternate version)",
example: "cd ~",
result: None,
},
Example {
description: "Change to the previous directory",
example: "cd -",
result: None,
},
]
}
}
#[cfg(test)]
mod tests {
use super::Cd;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Cd {})
}
}


@ -0,0 +1,81 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct Char;
#[derive(Deserialize)]
struct CharArgs {
name: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for Char {
fn name(&self) -> &str {
"char"
}
fn signature(&self) -> Signature {
Signature::build("ansi").required(
"character",
SyntaxShape::Any,
"the name of the character to output",
)
}
fn usage(&self) -> &str {
"Output special characters (eg. 'newline')"
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Output newline",
example: r#"char newline"#,
result: Some(vec![Value::from("\n")]),
}]
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let (CharArgs { name }, _) = args.process(&registry).await?;
let special_character = str_to_character(&name.item);
if let Some(output) = special_character {
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(output).into_value(name.tag()),
)))
} else {
Err(ShellError::labeled_error(
"Unknown character",
"unknown character",
name.tag(),
))
}
}
}
fn str_to_character(s: &str) -> Option<String> {
match s {
"newline" | "enter" | "nl" => Some("\n".into()),
"tab" => Some("\t".into()),
_ => None,
}
}
#[cfg(test)]
mod tests {
use super::Char;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Char {})
}
}


@ -0,0 +1,98 @@
use crate::commands::classified::expr::run_expression_block;
use crate::commands::classified::internal::run_internal_command;
use crate::context::Context;
use crate::prelude::*;
use crate::stream::InputStream;
use futures::stream::TryStreamExt;
use nu_errors::ShellError;
use nu_protocol::hir::{Block, ClassifiedCommand, Commands};
use nu_protocol::{ReturnSuccess, UntaggedValue, Value};
use std::sync::atomic::Ordering;
pub(crate) async fn run_block(
block: &Block,
ctx: &mut Context,
mut input: InputStream,
it: &Value,
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
let mut output: Result<InputStream, ShellError> = Ok(InputStream::empty());
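// A block can contain several pipelines. Each iteration drains the previous pipeline's
// output so its errors surface before the next pipeline runs; only the final pipeline's
// output is returned to the caller.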
for pipeline in &block.block {
match output {
Ok(inp) if inp.is_empty() => {}
Ok(inp) => {
let mut output_stream = inp.to_output_stream();
loop {
match output_stream.try_next().await {
Ok(Some(ReturnSuccess::Value(Value {
value: UntaggedValue::Error(e),
..
}))) => return Err(e),
Ok(Some(_item)) => {
if let Some(err) = ctx.get_errors().get(0) {
ctx.clear_errors();
return Err(err.clone());
}
if ctx.ctrl_c.load(Ordering::SeqCst) {
break;
}
}
Ok(None) => {
if let Some(err) = ctx.get_errors().get(0) {
ctx.clear_errors();
return Err(err.clone());
}
break;
}
Err(e) => return Err(e),
}
}
}
Err(e) => {
return Err(e);
}
}
output = run_pipeline(pipeline, ctx, input, it, vars, env).await;
input = InputStream::empty();
}
output
}
async fn run_pipeline(
commands: &Commands,
ctx: &mut Context,
mut input: InputStream,
it: &Value,
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
let mut iter = commands.list.clone().into_iter().peekable();
loop {
let item: Option<ClassifiedCommand> = iter.next();
let next: Option<&ClassifiedCommand> = iter.peek();
input = match (item, next) {
(Some(ClassifiedCommand::Dynamic(_)), _) | (_, Some(ClassifiedCommand::Dynamic(_))) => {
return Err(ShellError::unimplemented("Dynamic commands"))
}
(Some(ClassifiedCommand::Expr(expr)), _) => {
run_expression_block(*expr, ctx, it, vars, env).await?
}
(Some(ClassifiedCommand::Error(err)), _) => return Err(err.into()),
(_, Some(ClassifiedCommand::Error(err))) => return Err(err.clone().into()),
(Some(ClassifiedCommand::Internal(left)), _) => {
run_internal_command(left, ctx, input, it, vars, env).await?
}
(None, _) => break,
};
}
Ok(input)
}


@ -0,0 +1,7 @@
use derive_new::new;
use nu_protocol::hir;
#[derive(new, Debug)]
pub(crate) struct Command {
pub(crate) args: hir::Call,
}


@ -0,0 +1,27 @@
use crate::evaluate::evaluate_baseline_expr;
use crate::prelude::*;
use log::{log_enabled, trace};
use futures::stream::once;
use nu_errors::ShellError;
use nu_protocol::hir::SpannedExpression;
use nu_protocol::Value;
pub(crate) async fn run_expression_block(
expr: SpannedExpression,
context: &mut Context,
it: &Value,
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
if log_enabled!(log::Level::Trace) {
trace!(target: "nu::run::expr", "->");
trace!(target: "nu::run::expr", "{:?}", expr);
}
let registry = context.registry().clone();
let output = evaluate_baseline_expr(&expr, &registry, it, vars, env).await?;
Ok(once(async { Ok(output) }).to_input_stream())
}


@ -0,0 +1,702 @@
use crate::evaluate::evaluate_baseline_expr;
use crate::futures::ThreadedReceiver;
use crate::prelude::*;
use std::io::Write;
use std::ops::Deref;
use std::process::{Command, Stdio};
use std::sync::mpsc;
use bytes::{BufMut, Bytes, BytesMut};
use futures::executor::block_on_stream;
// use futures::stream::StreamExt;
use futures_codec::FramedRead;
use log::trace;
use nu_errors::ShellError;
use nu_protocol::hir::ExternalCommand;
use nu_protocol::{Primitive, Scope, ShellTypeName, UntaggedValue, Value};
use nu_source::Tag;
pub enum StringOrBinary {
String(String),
Binary(Vec<u8>),
}
pub struct MaybeTextCodec;
impl futures_codec::Encoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = std::io::Error;
fn encode(&mut self, item: Self::Item, dst: &mut BytesMut) -> Result<(), Self::Error> {
match item {
StringOrBinary::String(s) => {
dst.reserve(s.len());
dst.put(s.as_bytes());
Ok(())
}
StringOrBinary::Binary(b) => {
dst.reserve(b.len());
dst.put(Bytes::from(b));
Ok(())
}
}
}
}
impl futures_codec::Decoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = std::io::Error;
fn decode(&mut self, src: &mut BytesMut) -> Result<Option<Self::Item>, Self::Error> {
let v: Vec<u8> = src.to_vec();
match String::from_utf8(v) {
Ok(s) => {
src.clear();
if s.is_empty() {
Ok(None)
} else {
Ok(Some(StringOrBinary::String(s)))
}
}
Err(err) => {
// Note: a UTF-8 encoded character is at most 4 bytes (the original spec allowed up to 6,
// which is the conservative bound used below). If the failure point is more than 6 bytes
// before the end of the buffer, we are not just looking at a partial character at the
// tail, so fall back to treating the buffer as binary.
if src.is_empty() {
Ok(None)
} else if src.len() > 6 && (src.len() - err.utf8_error().valid_up_to() > 6) {
// Fall back to assuming binary
let buf = src.to_vec();
src.clear();
Ok(Some(StringOrBinary::Binary(buf)))
} else {
// Looks like a utf-8 string, so let's assume that
let buf = src.split_to(err.utf8_error().valid_up_to() + 1);
String::from_utf8(buf.to_vec())
.map(|x| Some(StringOrBinary::String(x)))
.map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidData, e))
}
}
}
}
}
pub(crate) async fn run_external_command(
command: ExternalCommand,
context: &mut Context,
input: InputStream,
scope: &Scope,
is_last: bool,
) -> Result<InputStream, ShellError> {
trace!(target: "nu::run::external", "-> {}", command.name);
if !did_find_command(&command.name).await {
return Err(ShellError::labeled_error(
"Command not found",
"command not found",
&command.name_tag,
));
}
run_with_stdin(command, context, input, scope, is_last).await
}
async fn run_with_stdin(
command: ExternalCommand,
context: &mut Context,
input: InputStream,
scope: &Scope,
is_last: bool,
) -> Result<InputStream, ShellError> {
let path = context.shell_manager.path();
let input = trace_stream!(target: "nu::trace_stream::external::stdin", "input" = input);
let mut command_args = vec![];
for arg in command.args.iter() {
let value =
evaluate_baseline_expr(arg, &context.registry, &scope.it, &scope.vars, &scope.env)
.await?;
// Skip any arguments that don't really exist, treating them as optional
// FIXME: we may want to preserve the gap in the future, though it's hard to say
// what value we would put in its place.
if value.value.is_none() {
continue;
}
// Do the cleanup that we need to do on any argument going out:
let trimmed_value_string = value.as_string()?.trim_end_matches('\n').to_string();
let value_string;
#[cfg(not(windows))]
{
value_string = trimmed_value_string
.replace('$', "\\$")
.replace('"', "\\\"")
.to_string()
}
#[cfg(windows)]
{
value_string = trimmed_value_string
}
command_args.push(value_string);
}
let process_args = command_args
.iter()
.map(|arg| {
let arg = expand_tilde(arg.deref(), dirs::home_dir);
#[cfg(not(windows))]
{
if argument_contains_whitespace(&arg) && !argument_is_quoted(&arg) {
add_quotes(&arg)
} else {
arg.as_ref().to_string()
}
}
#[cfg(windows)]
{
if let Some(unquoted) = remove_quotes(&arg) {
unquoted.to_string()
} else {
arg.as_ref().to_string()
}
}
})
.collect::<Vec<String>>();
spawn(&command, &path, &process_args[..], input, is_last, scope)
}
fn spawn(
command: &ExternalCommand,
path: &str,
args: &[String],
input: InputStream,
is_last: bool,
scope: &Scope,
) -> Result<InputStream, ShellError> {
let command = command.clone();
let mut process = {
#[cfg(windows)]
{
let mut process = Command::new("cmd");
process.arg("/c");
process.arg(&command.name);
for arg in args {
// Clean the args before we use them:
let arg = arg.replace("|", "\\|");
process.arg(&arg);
}
process
}
#[cfg(not(windows))]
{
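// On non-Windows hosts the whole invocation is handed to `sh -c` as a single string,
// which is why whitespace-containing arguments were wrapped in quotes earlier.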
let cmd_with_args = vec![command.name.clone(), args.join(" ")].join(" ");
let mut process = Command::new("sh");
process.arg("-c").arg(cmd_with_args);
process
}
};
process.current_dir(path);
trace!(target: "nu::run::external", "cwd = {:?}", &path);
process.env_clear();
process.envs(scope.env.iter());
// We want stdout regardless of what
// we are doing ($it case or pipe stdin)
if !is_last {
process.stdout(Stdio::piped());
trace!(target: "nu::run::external", "set up stdout pipe");
process.stderr(Stdio::piped());
trace!(target: "nu::run::external", "set up stderr pipe");
}
// open since we have some contents for stdin
if !input.is_empty() {
process.stdin(Stdio::piped());
trace!(target: "nu::run::external", "set up stdin pipe");
}
trace!(target: "nu::run::external", "built command {:?}", process);
// TODO Switch to async_std::process once it's stabilized
if let Ok(mut child) = process.spawn() {
let (tx, rx) = mpsc::sync_channel(0);
let mut stdin = child.stdin.take();
let stdin_write_tx = tx.clone();
let stdout_read_tx = tx;
let stdin_name_tag = command.name_tag.clone();
let stdout_name_tag = command.name_tag;
std::thread::spawn(move || {
if !input.is_empty() {
let mut stdin_write = stdin
.take()
.expect("Internal error: could not get stdin pipe for external command");
for value in block_on_stream(input) {
match &value.value {
UntaggedValue::Primitive(Primitive::Nothing) => continue,
UntaggedValue::Primitive(Primitive::String(s))
| UntaggedValue::Primitive(Primitive::Line(s)) => {
if let Err(e) = stdin_write.write(s.as_bytes()) {
let message = format!("Unable to write to stdin (error = {})", e);
let _ = stdin_write_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
message,
"application may have closed before completing pipeline",
&stdin_name_tag,
)),
tag: stdin_name_tag,
}));
return Err(());
}
}
UntaggedValue::Primitive(Primitive::Binary(b)) => {
if let Err(e) = stdin_write.write(b) {
let message = format!("Unable to write to stdin (error = {})", e);
let _ = stdin_write_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
message,
"application may have closed before completing pipeline",
&stdin_name_tag,
)),
tag: stdin_name_tag,
}));
return Err(());
}
}
unsupported => {
let _ = stdin_write_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
format!(
"Received unexpected type from pipeline ({})",
unsupported.type_name()
),
"expected a string",
stdin_name_tag.clone(),
)),
tag: stdin_name_tag,
}));
return Err(());
}
};
}
}
Ok(())
});
std::thread::spawn(move || {
if !is_last {
let stdout = if let Some(stdout) = child.stdout.take() {
stdout
} else {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
"Can't redirect the stdout for external command",
"can't redirect stdout",
&stdout_name_tag,
)),
tag: stdout_name_tag,
}));
return Err(());
};
let stderr = if let Some(stderr) = child.stderr.take() {
stderr
} else {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
"Can't redirect the stderr for external command",
"can't redirect stderr",
&stdout_name_tag,
)),
tag: stdout_name_tag,
}));
return Err(());
};
let file = futures::io::AllowStdIo::new(stdout);
let stream = FramedRead::new(file, MaybeTextCodec);
for line in block_on_stream(stream) {
match line {
Ok(line) => match line {
StringOrBinary::String(s) => {
let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Primitive(Primitive::String(s.clone())),
tag: stdout_name_tag.clone(),
}));
if result.is_err() {
break;
}
}
StringOrBinary::Binary(b) => {
let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Primitive(Primitive::Binary(
b.into_iter().collect(),
)),
tag: stdout_name_tag.clone(),
}));
if result.is_err() {
break;
}
}
},
Err(e) => {
// If there's an exit status, it makes sense that we may error when
// trying to read from its stdout pipe (likely been closed). In that
// case, don't emit an error.
let should_error = match child.wait() {
Ok(exit_status) => !exit_status.success(),
Err(_) => true,
};
if should_error {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
format!("Unable to read from stdout ({})", e),
"unable to read from stdout",
&stdout_name_tag,
)),
tag: stdout_name_tag.clone(),
}));
}
return Ok(());
}
}
}
let file = futures::io::AllowStdIo::new(stderr);
let err_stream = FramedRead::new(file, MaybeTextCodec);
for err_line in block_on_stream(err_stream) {
match err_line {
Ok(line) => match line {
StringOrBinary::String(s) => {
let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(
ShellError::untagged_runtime_error(s.clone()),
),
tag: stdout_name_tag.clone(),
}));
if result.is_err() {
break;
}
}
StringOrBinary::Binary(_) => {
let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(
ShellError::untagged_runtime_error(
"Binary in stderr output",
),
),
tag: stdout_name_tag.clone(),
}));
if result.is_err() {
break;
}
}
},
Err(e) => {
// If there's an exit status, it makes sense that we may error when
// trying to read from its stdout pipe (likely been closed). In that
// case, don't emit an error.
let should_error = match child.wait() {
Ok(exit_status) => !exit_status.success(),
Err(_) => true,
};
if should_error {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
format!("Unable to read from stderr ({})", e),
"unable to read from stderr",
&stdout_name_tag,
)),
tag: stdout_name_tag.clone(),
}));
}
return Ok(());
}
}
}
}
// We can give an error when we see a non-zero exit code, but this is different
// than what other shells will do.
let external_failed = match child.wait() {
Err(_) => true,
Ok(exit_status) => !exit_status.success(),
};
if external_failed {
let cfg = crate::data::config::config(Tag::unknown());
if let Ok(cfg) = cfg {
if cfg.contains_key("nonzero_exit_errors") {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
"External command failed",
"command failed",
&stdout_name_tag,
)),
tag: stdout_name_tag.clone(),
}));
}
}
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::external_non_zero()),
tag: stdout_name_tag,
}));
}
Ok(())
});
let stream = ThreadedReceiver::new(rx);
Ok(stream.to_input_stream())
} else {
Err(ShellError::labeled_error(
"Failed to spawn process",
"failed to spawn",
&command.name_tag,
))
}
}
async fn did_find_command(name: &str) -> bool {
#[cfg(not(windows))]
{
which::which(name).is_ok()
}
#[cfg(windows)]
{
if which::which(name).is_ok() {
true
} else {
let cmd_builtins = [
"call", "cls", "color", "date", "dir", "echo", "find", "hostname", "pause",
"start", "time", "title", "ver", "copy", "mkdir", "rename", "rd", "rmdir", "type",
"mklink",
];
cmd_builtins.contains(&name)
}
}
}
fn expand_tilde<SI: ?Sized, P, HD>(input: &SI, home_dir: HD) -> std::borrow::Cow<str>
where
SI: AsRef<str>,
P: AsRef<std::path::Path>,
HD: FnOnce() -> Option<P>,
{
shellexpand::tilde_with_context(input, home_dir)
}
#[allow(unused)]
pub fn argument_contains_whitespace(argument: &str) -> bool {
argument.chars().any(|c| c.is_whitespace())
}
fn argument_is_quoted(argument: &str) -> bool {
if argument.len() < 2 {
return false;
}
(argument.starts_with('"') && argument.ends_with('"'))
|| (argument.starts_with('\'') && argument.ends_with('\''))
}
#[allow(unused)]
fn add_quotes(argument: &str) -> String {
format!("\"{}\"", argument)
}
#[allow(unused)]
fn remove_quotes(argument: &str) -> Option<&str> {
if !argument_is_quoted(argument) {
return None;
}
let size = argument.len();
Some(&argument[1..size - 1])
}
#[allow(unused)]
fn shell_os_paths() -> Vec<std::path::PathBuf> {
let mut original_paths = vec![];
if let Some(paths) = std::env::var_os("PATH") {
original_paths = std::env::split_paths(&paths).collect::<Vec<_>>();
}
original_paths
}
#[cfg(test)]
mod tests {
use super::{
add_quotes, argument_contains_whitespace, argument_is_quoted, expand_tilde, remove_quotes,
run_external_command, Context, InputStream,
};
use futures::executor::block_on;
use nu_errors::ShellError;
use nu_protocol::Scope;
use nu_test_support::commands::ExternalBuilder;
// async fn read(mut stream: OutputStream) -> Option<Value> {
// match stream.try_next().await {
// Ok(val) => {
// if let Some(val) = val {
// val.raw_value()
// } else {
// None
// }
// }
// Err(_) => None,
// }
// }
async fn non_existent_run() -> Result<(), ShellError> {
let cmd = ExternalBuilder::for_name("i_dont_exist.exe").build();
let input = InputStream::empty();
let mut ctx = Context::basic().expect("There was a problem creating a basic context.");
assert!(
run_external_command(cmd, &mut ctx, input, &Scope::new(), false)
.await
.is_err()
);
Ok(())
}
// async fn failure_run() -> Result<(), ShellError> {
// let cmd = ExternalBuilder::for_name("fail").build();
// let mut ctx = Context::basic().expect("There was a problem creating a basic context.");
// let stream = run_external_command(cmd, &mut ctx, None, false)
// .await?
// .expect("There was a problem running the external command.");
// match read(stream.into()).await {
// Some(Value {
// value: UntaggedValue::Error(_),
// ..
// }) => {}
// None | _ => panic!("Command didn't fail."),
// }
// Ok(())
// }
// #[test]
// fn identifies_command_failed() -> Result<(), ShellError> {
// block_on(failure_run())
// }
#[test]
fn identifies_command_not_found() -> Result<(), ShellError> {
block_on(non_existent_run())
}
#[test]
fn checks_contains_whitespace_from_argument_to_be_passed_in() {
assert_eq!(argument_contains_whitespace("andrés"), false);
assert_eq!(argument_contains_whitespace("and rés"), true);
assert_eq!(argument_contains_whitespace(r#"and\ rés"#), true);
}
#[test]
fn checks_quotes_from_argument_to_be_passed_in() {
assert_eq!(argument_is_quoted(""), false);
assert_eq!(argument_is_quoted("'"), false);
assert_eq!(argument_is_quoted("'a"), false);
assert_eq!(argument_is_quoted("a"), false);
assert_eq!(argument_is_quoted("a'"), false);
assert_eq!(argument_is_quoted("''"), true);
assert_eq!(argument_is_quoted(r#"""#), false);
assert_eq!(argument_is_quoted(r#""a"#), false);
assert_eq!(argument_is_quoted(r#"a"#), false);
assert_eq!(argument_is_quoted(r#"a""#), false);
assert_eq!(argument_is_quoted(r#""""#), true);
assert_eq!(argument_is_quoted("'andrés"), false);
assert_eq!(argument_is_quoted("andrés'"), false);
assert_eq!(argument_is_quoted(r#""andrés"#), false);
assert_eq!(argument_is_quoted(r#"andrés""#), false);
assert_eq!(argument_is_quoted("'andrés'"), true);
assert_eq!(argument_is_quoted(r#""andrés""#), true);
}
#[test]
fn adds_quotes_to_argument_to_be_passed_in() {
assert_eq!(add_quotes("andrés"), "\"andrés\"");
//assert_eq!(add_quotes("\"andrés\""), "\"andrés\"");
}
#[test]
fn strips_quotes_from_argument_to_be_passed_in() {
assert_eq!(remove_quotes(""), None);
assert_eq!(remove_quotes("'"), None);
assert_eq!(remove_quotes("'a"), None);
assert_eq!(remove_quotes("a"), None);
assert_eq!(remove_quotes("a'"), None);
assert_eq!(remove_quotes("''"), Some(""));
assert_eq!(remove_quotes(r#"""#), None);
assert_eq!(remove_quotes(r#""a"#), None);
assert_eq!(remove_quotes(r#"a"#), None);
assert_eq!(remove_quotes(r#"a""#), None);
assert_eq!(remove_quotes(r#""""#), Some(""));
assert_eq!(remove_quotes("'andrés"), None);
assert_eq!(remove_quotes("andrés'"), None);
assert_eq!(remove_quotes(r#""andrés"#), None);
assert_eq!(remove_quotes(r#"andrés""#), None);
assert_eq!(remove_quotes("'andrés'"), Some("andrés"));
assert_eq!(remove_quotes(r#""andrés""#), Some("andrés"));
}
#[test]
fn expands_tilde_if_starts_with_tilde_character() {
assert_eq!(
expand_tilde("~", || Some(std::path::Path::new("the_path_to_nu_light"))),
"the_path_to_nu_light"
);
}
#[test]
fn does_not_expand_tilde_if_tilde_is_not_first_character() {
assert_eq!(
expand_tilde("1~1", || Some(std::path::Path::new("the_path_to_nu_light"))),
"1~1"
);
}
}
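
The UTF-8 fallback in MaybeTextCodec::decode above hinges on Utf8Error::valid_up_to. A small standard-library-only sketch of that split, separate from the code in this diff:

// Split a buffer into its leading valid-UTF-8 part and the remainder; this is the
// same information the decoder uses to decide between string and binary output.
fn main() {
    let bytes: Vec<u8> = b"caf\xc3\xa9\xff".to_vec(); // "café" followed by a stray 0xFF
    match String::from_utf8(bytes.clone()) {
        Ok(s) => println!("whole buffer is UTF-8: {}", s),
        Err(err) => {
            let valid = err.utf8_error().valid_up_to(); // 5 bytes of valid UTF-8 here
            let (text, rest) = bytes.split_at(valid);
            println!("text part: {}", String::from_utf8_lossy(text)); // café
            println!("binary tail: {:?}", rest); // [255]
        }
    }
}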


@ -0,0 +1,253 @@
use crate::commands::command::whole_stream_command;
use crate::commands::run_alias::AliasCommand;
use crate::commands::UnevaluatedCallInfo;
use crate::prelude::*;
use log::{log_enabled, trace};
use nu_errors::ShellError;
use nu_protocol::hir::InternalCommand;
use nu_protocol::{CommandAction, Primitive, ReturnSuccess, Scope, UntaggedValue, Value};
pub(crate) async fn run_internal_command(
command: InternalCommand,
context: &mut Context,
input: InputStream,
it: &Value,
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
if log_enabled!(log::Level::Trace) {
trace!(target: "nu::run::internal", "->");
trace!(target: "nu::run::internal", "{}", command.name);
}
let scope = Scope {
it: it.clone(),
vars: vars.clone(),
env: env.clone(),
};
let objects: InputStream = trace_stream!(target: "nu::trace_stream::internal", "input" = input);
let internal_command = context.expect_command(&command.name);
let result = {
context
.run_command(
internal_command?,
Tag::unknown_anchor(command.name_span),
command.args.clone(),
&scope,
objects,
)
.await?
};
let head = Arc::new(command.args.head.clone());
//let context = Arc::new(context.clone());
let context = context.clone();
let command = Arc::new(command);
let scope = Arc::new(scope);
// let scope = scope.clone();
Ok(InputStream::from_stream(
result
.then(move |item| {
let head = head.clone();
let command = command.clone();
let mut context = context.clone();
let scope = scope.clone();
async move {
match item {
Ok(ReturnSuccess::Action(action)) => match action {
CommandAction::ChangePath(path) => {
context.shell_manager.set_path(path);
InputStream::from_stream(futures::stream::iter(vec![]))
}
CommandAction::Exit => std::process::exit(0), // TODO: save history.txt
CommandAction::Error(err) => {
context.error(err.clone());
InputStream::one(UntaggedValue::Error(err).into_untagged_value())
}
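// AutoConvert: a command returned raw contents tagged with a file extension (for
// example `open foo.json`). If a matching `from <extension>` command is registered,
// it is run on the contents here; otherwise the raw value is passed through as-is.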
CommandAction::AutoConvert(tagged_contents, extension) => {
let contents_tag = tagged_contents.tag.clone();
let command_name = format!("from {}", extension);
let command = command.clone();
if let Some(converter) = context.registry.get_command(&command_name)
{
let new_args = RawCommandArgs {
host: context.host.clone(),
ctrl_c: context.ctrl_c.clone(),
current_errors: context.current_errors.clone(),
shell_manager: context.shell_manager.clone(),
call_info: UnevaluatedCallInfo {
args: nu_protocol::hir::Call {
head: (&*head).clone(),
positional: None,
named: None,
span: Span::unknown(),
is_last: false,
},
name_tag: Tag::unknown_anchor(command.name_span),
scope: (&*scope).clone(),
},
};
let result = converter
.run(
new_args.with_input(vec![tagged_contents]),
&context.registry,
)
.await;
match result {
Ok(mut result) => {
let result_vec: Vec<Result<ReturnSuccess, ShellError>> =
result.drain_vec().await;
let mut output = vec![];
for res in result_vec {
match res {
Ok(ReturnSuccess::Value(Value {
value: UntaggedValue::Table(list),
..
})) => {
for l in list {
output.push(Ok(l));
}
}
Ok(ReturnSuccess::Value(Value {
value,
..
})) => {
output
.push(Ok(value
.into_value(contents_tag.clone())));
}
Err(e) => output.push(Err(e)),
_ => {}
}
}
futures::stream::iter(output).to_input_stream()
}
Err(e) => {
context.add_error(e);
InputStream::empty()
}
}
} else {
InputStream::one(tagged_contents)
}
}
CommandAction::EnterHelpShell(value) => match value {
Value {
value: UntaggedValue::Primitive(Primitive::String(cmd)),
tag,
} => {
context.shell_manager.insert_at_current(Box::new(
match HelpShell::for_command(
UntaggedValue::string(cmd).into_value(tag),
&context.registry(),
) {
Ok(v) => v,
Err(err) => {
return InputStream::one(
UntaggedValue::Error(err).into_untagged_value(),
)
}
},
));
InputStream::from_stream(futures::stream::iter(vec![]))
}
_ => {
context.shell_manager.insert_at_current(Box::new(
match HelpShell::index(&context.registry()) {
Ok(v) => v,
Err(err) => {
return InputStream::one(
UntaggedValue::Error(err).into_untagged_value(),
)
}
},
));
InputStream::from_stream(futures::stream::iter(vec![]))
}
},
CommandAction::EnterValueShell(value) => {
context
.shell_manager
.insert_at_current(Box::new(ValueShell::new(value)));
InputStream::from_stream(futures::stream::iter(vec![]))
}
CommandAction::EnterShell(location) => {
context.shell_manager.insert_at_current(Box::new(
match FilesystemShell::with_location(
location,
context.registry().clone(),
) {
Ok(v) => v,
Err(err) => {
return InputStream::one(
UntaggedValue::Error(err.into())
.into_untagged_value(),
)
}
},
));
InputStream::from_stream(futures::stream::iter(vec![]))
}
CommandAction::AddAlias(name, args, block) => {
context.add_commands(vec![whole_stream_command(
AliasCommand::new(name, args, block),
)]);
InputStream::from_stream(futures::stream::iter(vec![]))
}
CommandAction::PreviousShell => {
context.shell_manager.prev();
InputStream::from_stream(futures::stream::iter(vec![]))
}
CommandAction::NextShell => {
context.shell_manager.next();
InputStream::from_stream(futures::stream::iter(vec![]))
}
CommandAction::LeaveShell => {
context.shell_manager.remove_at_current();
if context.shell_manager.is_empty() {
std::process::exit(0); // TODO: save history.txt
}
InputStream::from_stream(futures::stream::iter(vec![]))
}
},
Ok(ReturnSuccess::Value(Value {
value: UntaggedValue::Error(err),
tag,
})) => {
context.error(err.clone());
InputStream::one(UntaggedValue::Error(err).into_value(tag))
}
Ok(ReturnSuccess::Value(v)) => InputStream::one(v),
Ok(ReturnSuccess::DebugValue(v)) => {
let doc = PrettyDebug::pretty_doc(&v);
let mut buffer = termcolor::Buffer::ansi();
let _ = doc.render_raw(
context.with_host(|host| host.width() - 5),
&mut nu_source::TermColored::new(&mut buffer),
);
let value = String::from_utf8_lossy(buffer.as_slice());
InputStream::one(UntaggedValue::string(value).into_untagged_value())
}
Err(err) => {
context.error(err.clone());
InputStream::one(UntaggedValue::Error(err).into_untagged_value())
}
}
}
})
.flatten()
.take_while(|x| futures::future::ready(!x.is_error())),
))
}


@ -0,0 +1,8 @@
pub(crate) mod block;
mod dynamic;
pub(crate) mod expr;
pub(crate) mod external;
pub(crate) mod internal;
#[allow(unused_imports)]
pub(crate) use dynamic::Command as DynamicCommand;


@ -0,0 +1,57 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::Signature;
use std::process::Command;
pub struct Clear;
#[async_trait]
impl WholeStreamCommand for Clear {
fn name(&self) -> &str {
"clear"
}
fn signature(&self) -> Signature {
Signature::build("clear")
}
fn usage(&self) -> &str {
"clears the terminal"
}
async fn run(&self, _: CommandArgs, _: &CommandRegistry) -> Result<OutputStream, ShellError> {
if cfg!(windows) {
Command::new("cmd")
.args(&["/C", "cls"])
.status()
.expect("failed to execute process");
} else if cfg!(unix) {
Command::new("/bin/sh")
.args(&["-c", "clear"])
.status()
.expect("failed to execute process");
}
Ok(OutputStream::empty())
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Clear the screen",
example: "clear",
result: None,
}]
}
}
#[cfg(test)]
mod tests {
use super::Clear;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Clear {})
}
}


@ -0,0 +1,109 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::stream::StreamExt;
use nu_errors::ShellError;
use nu_protocol::{Signature, Value};
use clipboard::{ClipboardContext, ClipboardProvider};
pub struct Clip;
#[async_trait]
impl WholeStreamCommand for Clip {
fn name(&self) -> &str {
"clip"
}
fn signature(&self) -> Signature {
Signature::build("clip")
}
fn usage(&self) -> &str {
"Copy the contents of the pipeline to the copy/paste buffer"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
clip(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Save text to the clipboard",
example: "echo 'secret value' | clip",
result: None,
}]
}
}
pub async fn clip(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let input = args.input;
let name = args.call_info.name_tag.clone();
let values: Vec<Value> = input.collect().await;
if let Ok(clip_context) = ClipboardProvider::new() {
let mut clip_context: ClipboardContext = clip_context;
let mut new_copy_data = String::new();
if !values.is_empty() {
let mut first = true;
for i in values.iter() {
if !first {
new_copy_data.push_str("\n");
} else {
first = false;
}
let string: String = match i.as_string() {
Ok(string) => string.to_string(),
Err(_) => {
return Err(ShellError::labeled_error(
"Given non-string data",
"expected strings from pipeline",
name,
))
}
};
new_copy_data.push_str(&string);
}
}
match clip_context.set_contents(new_copy_data) {
Ok(_) => {}
Err(_) => {
return Err(ShellError::labeled_error(
"Could not set contents of clipboard",
"could not set contents of clipboard",
name,
));
}
}
} else {
return Err(ShellError::labeled_error(
"Could not open clipboard",
"could not open clipboard",
name,
));
}
Ok(OutputStream::empty())
}
#[cfg(test)]
mod tests {
use super::Clip;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Clip {})
}
}


@ -0,0 +1,436 @@
use crate::commands::help::get_help;
use crate::context::CommandRegistry;
use crate::deserializer::ConfigDeserializer;
use crate::evaluate::evaluate_args::evaluate_args;
use crate::prelude::*;
use derive_new::new;
use getset::Getters;
use nu_errors::ShellError;
use nu_protocol::hir;
use nu_protocol::{CallInfo, EvaluatedArgs, ReturnSuccess, Scope, Signature, UntaggedValue, Value};
use parking_lot::Mutex;
use serde::{Deserialize, Serialize};
use std::ops::Deref;
use std::sync::atomic::AtomicBool;
#[derive(Deserialize, Serialize, Debug, Clone)]
pub struct UnevaluatedCallInfo {
pub args: hir::Call,
pub name_tag: Tag,
pub scope: Scope,
}
impl UnevaluatedCallInfo {
pub async fn evaluate(self, registry: &CommandRegistry) -> Result<CallInfo, ShellError> {
let args = evaluate_args(&self.args, registry, &self.scope).await?;
Ok(CallInfo {
args,
name_tag: self.name_tag,
})
}
pub async fn evaluate_with_new_it(
self,
registry: &CommandRegistry,
it: &Value,
) -> Result<CallInfo, ShellError> {
let mut scope = self.scope.clone();
scope.it = it.clone();
let args = evaluate_args(&self.args, registry, &scope).await?;
Ok(CallInfo {
args,
name_tag: self.name_tag,
})
}
pub fn switch_present(&self, switch: &str) -> bool {
self.args.switch_preset(switch)
}
}
#[derive(Getters)]
#[get = "pub(crate)"]
pub struct CommandArgs {
pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
pub ctrl_c: Arc<AtomicBool>,
pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub shell_manager: ShellManager,
pub call_info: UnevaluatedCallInfo,
pub input: InputStream,
pub raw_input: String,
}
#[derive(Getters, Clone)]
#[get = "pub(crate)"]
pub struct RawCommandArgs {
pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
pub ctrl_c: Arc<AtomicBool>,
pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub shell_manager: ShellManager,
pub call_info: UnevaluatedCallInfo,
}
impl RawCommandArgs {
pub fn with_input(self, input: impl Into<InputStream>) -> CommandArgs {
CommandArgs {
host: self.host,
ctrl_c: self.ctrl_c,
current_errors: self.current_errors,
shell_manager: self.shell_manager,
call_info: self.call_info,
input: input.into(),
raw_input: String::default(),
}
}
}
impl std::fmt::Debug for CommandArgs {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.call_info.fmt(f)
}
}
impl CommandArgs {
pub async fn evaluate_once(
self,
registry: &CommandRegistry,
) -> Result<EvaluatedWholeStreamCommandArgs, ShellError> {
let host = self.host.clone();
let ctrl_c = self.ctrl_c.clone();
let shell_manager = self.shell_manager.clone();
let input = self.input;
let call_info = self.call_info.evaluate(registry).await?;
Ok(EvaluatedWholeStreamCommandArgs::new(
host,
ctrl_c,
shell_manager,
call_info,
input,
))
}
pub async fn evaluate_once_with_scope(
self,
registry: &CommandRegistry,
scope: &Scope,
) -> Result<EvaluatedWholeStreamCommandArgs, ShellError> {
let host = self.host.clone();
let ctrl_c = self.ctrl_c.clone();
let shell_manager = self.shell_manager.clone();
let input = self.input;
let call_info = UnevaluatedCallInfo {
name_tag: self.call_info.name_tag,
args: self.call_info.args,
scope: scope.clone(),
};
let call_info = call_info.evaluate(registry).await?;
Ok(EvaluatedWholeStreamCommandArgs::new(
host,
ctrl_c,
shell_manager,
call_info,
input,
))
}
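/// Evaluate the call once and deserialize the evaluated arguments into the command's
/// typed argument struct (the `args.process(&registry)` pattern used by the commands
/// in this diff), returning the struct together with the remaining input stream.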
pub async fn process<'de, T: Deserialize<'de>>(
self,
registry: &CommandRegistry,
) -> Result<(T, InputStream), ShellError> {
let args = self.evaluate_once(registry).await?;
let call_info = args.call_info.clone();
let mut deserializer = ConfigDeserializer::from_call_info(call_info);
Ok((T::deserialize(&mut deserializer)?, args.input))
}
}
pub struct RunnableContext {
pub input: InputStream,
pub shell_manager: ShellManager,
pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
pub ctrl_c: Arc<AtomicBool>,
pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub registry: CommandRegistry,
pub name: Tag,
pub raw_input: String,
}
impl RunnableContext {
pub fn get_command(&self, name: &str) -> Option<Command> {
self.registry.get_command(name)
}
}
pub struct EvaluatedWholeStreamCommandArgs {
pub args: EvaluatedCommandArgs,
pub input: InputStream,
}
impl Deref for EvaluatedWholeStreamCommandArgs {
type Target = EvaluatedCommandArgs;
fn deref(&self) -> &Self::Target {
&self.args
}
}
impl EvaluatedWholeStreamCommandArgs {
pub fn new(
host: Arc<parking_lot::Mutex<dyn Host>>,
ctrl_c: Arc<AtomicBool>,
shell_manager: ShellManager,
call_info: CallInfo,
input: impl Into<InputStream>,
) -> EvaluatedWholeStreamCommandArgs {
EvaluatedWholeStreamCommandArgs {
args: EvaluatedCommandArgs {
host,
ctrl_c,
shell_manager,
call_info,
},
input: input.into(),
}
}
pub fn name_tag(&self) -> Tag {
self.args.call_info.name_tag.clone()
}
pub fn parts(self) -> (InputStream, EvaluatedArgs) {
let EvaluatedWholeStreamCommandArgs { args, input } = self;
(input, args.call_info.args)
}
pub fn split(self) -> (InputStream, EvaluatedCommandArgs) {
let EvaluatedWholeStreamCommandArgs { args, input } = self;
(input, args)
}
}
#[derive(Getters)]
#[get = "pub"]
pub struct EvaluatedFilterCommandArgs {
args: EvaluatedCommandArgs,
}
impl Deref for EvaluatedFilterCommandArgs {
type Target = EvaluatedCommandArgs;
fn deref(&self) -> &Self::Target {
&self.args
}
}
impl EvaluatedFilterCommandArgs {
pub fn new(
host: Arc<parking_lot::Mutex<dyn Host>>,
ctrl_c: Arc<AtomicBool>,
shell_manager: ShellManager,
call_info: CallInfo,
) -> EvaluatedFilterCommandArgs {
EvaluatedFilterCommandArgs {
args: EvaluatedCommandArgs {
host,
ctrl_c,
shell_manager,
call_info,
},
}
}
}
#[derive(Getters, new)]
#[get = "pub(crate)"]
pub struct EvaluatedCommandArgs {
pub host: Arc<parking_lot::Mutex<dyn Host>>,
pub ctrl_c: Arc<AtomicBool>,
pub shell_manager: ShellManager,
pub call_info: CallInfo,
}
impl EvaluatedCommandArgs {
pub fn nth(&self, pos: usize) -> Option<&Value> {
self.call_info.args.nth(pos)
}
/// Get the nth positional argument, error if not possible
pub fn expect_nth(&self, pos: usize) -> Result<&Value, ShellError> {
match self.call_info.args.nth(pos) {
None => Err(ShellError::unimplemented("Better error: expect_nth")),
Some(item) => Ok(item),
}
}
pub fn get(&self, name: &str) -> Option<&Value> {
self.call_info.args.get(name)
}
pub fn has(&self, name: &str) -> bool {
self.call_info.args.has(name)
}
}
pub struct Example {
pub example: &'static str,
pub description: &'static str,
pub result: Option<Vec<Value>>,
}
#[async_trait]
pub trait WholeStreamCommand: Send + Sync {
fn name(&self) -> &str;
fn signature(&self) -> Signature {
Signature::new(self.name()).desc(self.usage()).filter()
}
fn usage(&self) -> &str;
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError>;
fn is_binary(&self) -> bool {
false
}
fn examples(&self) -> Vec<Example> {
Vec::new()
}
}
#[derive(Clone)]
pub struct Command(Arc<dyn WholeStreamCommand>);
impl PrettyDebugWithSource for Command {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"whole stream command",
b::description(self.name())
+ b::space()
+ b::equals()
+ b::space()
+ self.signature().pretty_debug(source),
)
}
}
impl std::fmt::Debug for Command {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "Command({})", self.name())
}
}
impl Command {
pub fn name(&self) -> &str {
self.0.name()
}
pub fn signature(&self) -> Signature {
self.0.signature()
}
pub fn usage(&self) -> &str {
self.0.usage()
}
pub async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
if args.call_info.switch_present("help") {
let cl = self.0.clone();
let registry = registry.clone();
Ok(OutputStream::one(Ok(ReturnSuccess::Value(
UntaggedValue::string(get_help(&*cl, &registry)).into_value(Tag::unknown()),
))))
} else {
self.0.run(args, registry).await
}
}
pub fn is_binary(&self) -> bool {
self.0.is_binary()
}
pub fn stream_command(&self) -> &dyn WholeStreamCommand {
&*self.0
}
}
pub struct FnFilterCommand {
name: String,
func: fn(EvaluatedFilterCommandArgs) -> Result<OutputStream, ShellError>,
}
#[async_trait]
impl WholeStreamCommand for FnFilterCommand {
fn name(&self) -> &str {
&self.name
}
fn usage(&self) -> &str {
"usage"
}
async fn run(
&self,
CommandArgs {
host,
ctrl_c,
shell_manager,
call_info,
input,
..
}: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = Arc::new(registry.clone());
let func = self.func;
Ok(input
.then(move |it| {
let host = host.clone();
let registry = registry.clone();
let ctrl_c = ctrl_c.clone();
let shell_manager = shell_manager.clone();
let call_info = call_info.clone();
async move {
let call_info = match call_info.evaluate_with_new_it(&*registry, &it).await {
Err(err) => {
return OutputStream::one(Err(err));
}
Ok(args) => args,
};
let args = EvaluatedFilterCommandArgs::new(
host.clone(),
ctrl_c.clone(),
shell_manager.clone(),
call_info,
);
match func(args) {
Err(err) => return OutputStream::one(Err(err)),
Ok(stream) => stream,
}
}
})
.flatten()
.to_output_stream())
}
}
pub fn whole_stream_command(command: impl WholeStreamCommand + 'static) -> Command {
Command(Arc::new(command))
}
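
For orientation, a rough sketch of a minimal command built on the WholeStreamCommand trait above. The `Shout` command is invented for illustration, and the surrounding crate items (CommandArgs, CommandRegistry, OutputStream, ReturnSuccess, UntaggedValue, async_trait) are assumed to be in scope exactly as in the command files earlier in this diff:

// Hypothetical example, not part of the diff: a no-argument command emitting one value.
pub struct Shout;

#[async_trait]
impl WholeStreamCommand for Shout {
    fn name(&self) -> &str {
        "shout"
    }

    fn usage(&self) -> &str {
        "Emit a fixed string."
    }

    async fn run(
        &self,
        args: CommandArgs,
        _registry: &CommandRegistry,
    ) -> Result<OutputStream, ShellError> {
        let tag = args.call_info.name_tag.clone();
        Ok(OutputStream::one(ReturnSuccess::value(
            UntaggedValue::string("HEY").into_value(tag),
        )))
    }
}

// Registered the same way as the built-ins: whole_stream_command(Shout)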


@ -0,0 +1,105 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::future;
use futures::stream::StreamExt;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct Compact;
#[derive(Deserialize)]
pub struct CompactArgs {
rest: Vec<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for Compact {
fn name(&self) -> &str {
"compact"
}
fn signature(&self) -> Signature {
Signature::build("compact").rest(SyntaxShape::Any, "the columns to compact from the table")
}
fn usage(&self) -> &str {
"Creates a table with non-empty rows"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
compact(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Filter out all null entries in a list",
example: "echo [1 2 $null 3 $null $null] | compact target",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
UntaggedValue::int(3).into(),
]),
},
Example {
description: "Filter out all directory entries having no 'target'",
example: "ls -af | compact target",
result: None,
},
]
}
}
pub async fn compact(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (CompactArgs { rest: columns }, input) = args.process(&registry).await?;
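// With no columns given, keep only values that are not empty; with columns given,
// keep only rows where every named column actually holds data.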
Ok(input
.filter_map(move |item| {
future::ready(if columns.is_empty() {
if !item.is_empty() {
Some(ReturnSuccess::value(item))
} else {
None
}
} else {
match item {
Value {
value: UntaggedValue::Row(ref r),
..
} => {
if columns
.iter()
.all(|field| r.get_data(field).borrow().is_some())
{
Some(ReturnSuccess::value(item))
} else {
None
}
}
_ => None,
}
})
})
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Compact;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Compact {})
}
}


@ -0,0 +1,260 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::data::config;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct Config;
#[derive(Deserialize)]
pub struct ConfigArgs {
load: Option<Tagged<PathBuf>>,
set: Option<(Tagged<String>, Value)>,
set_into: Option<Tagged<String>>,
get: Option<Tagged<String>>,
clear: Tagged<bool>,
remove: Option<Tagged<String>>,
path: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Config {
fn name(&self) -> &str {
"config"
}
fn signature(&self) -> Signature {
Signature::build("config")
.named(
"load",
SyntaxShape::Path,
"load the config from the path given",
Some('l'),
)
.named(
"set",
SyntaxShape::Any,
"set a value in the config, eg) --set [key value]",
Some('s'),
)
.named(
"set_into",
SyntaxShape::String,
"sets a variable from values in the pipeline",
Some('i'),
)
.named(
"get",
SyntaxShape::Any,
"get a value from the config",
Some('g'),
)
.named(
"remove",
SyntaxShape::Any,
"remove a value from the config",
Some('r'),
)
.switch("clear", "clear the config", Some('c'))
.switch("path", "return the path to the config file", Some('p'))
}
fn usage(&self) -> &str {
"Configuration management."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
config(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "See all config values",
example: "config",
result: None,
},
Example {
description: "Set completion_mode to circular",
example: "config --set [completion_mode circular]",
result: None,
},
Example {
description: "Store the contents of the pipeline as a path",
example: "echo ['/usr/bin' '/bin'] | config --set_into path",
result: None,
},
Example {
description: "Get the current startup commands",
example: "config --get startup",
result: None,
},
Example {
description: "Remove the startup commands",
example: "config --remove startup",
result: None,
},
Example {
description: "Clear the config (be careful!)",
example: "config --clear",
result: None,
},
Example {
description: "Get the path to the current config file",
example: "config --path",
result: None,
},
]
}
}
pub async fn config(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let name = args.call_info.name_tag.clone();
let registry = registry.clone();
let (
ConfigArgs {
load,
set,
set_into,
get,
clear,
remove,
path,
},
input,
) = args.process(&registry).await?;
let configuration = if let Some(supplied) = load {
Some(supplied.item().clone())
} else {
None
};
let mut result = crate::data::config::read(name_span, &configuration)?;
Ok(if let Some(v) = get {
let key = v.to_string();
let value = result
.get(&key)
.ok_or_else(|| ShellError::labeled_error("Missing key in config", "key", v.tag()))?;
match value {
Value {
value: UntaggedValue::Table(list),
..
} => {
let list: Vec<_> = list
.iter()
.map(|x| ReturnSuccess::value(x.clone()))
.collect();
futures::stream::iter(list).to_output_stream()
}
x => {
let x = x.clone();
OutputStream::one(ReturnSuccess::value(x))
}
}
} else if let Some((key, value)) = set {
result.insert(key.to_string(), value.clone());
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(&value.tag),
))
} else if let Some(v) = set_into {
let rows: Vec<Value> = input.collect().await;
let key = v.to_string();
if rows.is_empty() {
return Err(ShellError::labeled_error(
"No values given for set_into",
"needs value(s) from pipeline",
v.tag(),
));
} else if rows.len() == 1 {
// A single value
let value = &rows[0];
result.insert(key, value.clone());
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
} else {
// Take in the pipeline as a table
let value = UntaggedValue::Table(rows).into_value(name.clone());
result.insert(key, value);
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
}
} else if let Tagged { item: true, tag } = clear {
result.clear();
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(tag),
))
} else if let Tagged { item: true, tag } = path {
let path = config::default_path_for(&configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Primitive(Primitive::Path(path)).into_value(tag),
))
} else if let Some(v) = remove {
let key = v.to_string();
if result.contains_key(&key) {
result.swap_remove(&key);
config::write(&result, &configuration)?;
futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(v.tag()),
)])
.to_output_stream()
} else {
return Err(ShellError::labeled_error(
"Key does not exist in config",
"key",
v.tag(),
));
}
} else {
futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
)])
.to_output_stream()
})
}
#[cfg(test)]
mod tests {
use super::Config;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Config {})
}
}


@ -0,0 +1,56 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::stream::StreamExt;
use nu_errors::ShellError;
use nu_protocol::{Signature, UntaggedValue, Value};
pub struct Count;
#[async_trait]
impl WholeStreamCommand for Count {
fn name(&self) -> &str {
"count"
}
fn signature(&self) -> Signature {
Signature::build("count")
}
fn usage(&self) -> &str {
"Show the total number of rows or items."
}
async fn run(
&self,
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let rows: Vec<Value> = args.input.collect().await;
Ok(OutputStream::one(
UntaggedValue::int(rows.len()).into_value(name),
))
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Count the number of entries in a list",
example: "echo [1 2 3 4 5] | count",
result: Some(vec![UntaggedValue::int(5).into()]),
}]
}
}
#[cfg(test)]
mod tests {
use super::Count;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Count {})
}
}


@ -0,0 +1,76 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct Cpy;
#[derive(Deserialize)]
pub struct CopyArgs {
pub src: Tagged<PathBuf>,
pub dst: Tagged<PathBuf>,
pub recursive: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Cpy {
fn name(&self) -> &str {
"cp"
}
fn signature(&self) -> Signature {
Signature::build("cp")
.required("src", SyntaxShape::Pattern, "the place to copy from")
.required("dst", SyntaxShape::Path, "the place to copy to")
.switch(
"recursive",
"copy recursively through subdirectories",
Some('r'),
)
}
fn usage(&self) -> &str {
"Copy files."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let shell_manager = args.shell_manager.clone();
let name = args.call_info.name_tag.clone();
let (args, _) = args.process(&registry).await?;
shell_manager.cp(args, name)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Copy myfile to dir_b",
example: "cp myfile dir_b",
result: None,
},
Example {
description: "Recursively copy dir_a to dir_b",
example: "cp -r dir_a dir_b",
result: None,
},
]
}
}
#[cfg(test)]
mod tests {
use super::Cpy;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Cpy {})
}
}


@ -0,0 +1,175 @@
use crate::prelude::*;
use chrono::{DateTime, Local, Utc};
use nu_errors::ShellError;
use nu_protocol::{Dictionary, Value};
use crate::commands::WholeStreamCommand;
use chrono::{Datelike, TimeZone, Timelike};
use core::fmt::Display;
use indexmap::IndexMap;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue};
pub struct Date;
#[async_trait]
impl WholeStreamCommand for Date {
fn name(&self) -> &str {
"date"
}
fn signature(&self) -> Signature {
Signature::build("date")
.switch("utc", "use universal time (UTC)", Some('u'))
.switch("local", "use the local time", Some('l'))
.named(
"format",
SyntaxShape::String,
"report datetime in supplied strftime format",
Some('f'),
)
.switch("raw", "print date without tables", Some('r'))
}
fn usage(&self) -> &str {
"Get the current datetime."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
date(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Get the current local time and date",
example: "date",
result: None,
},
Example {
description: "Get the current UTC time and date",
example: "date --utc",
result: None,
},
Example {
description: "Get the current time and date and report it based on format",
example: "date --format '%Y-%m-%d %H:%M:%S.%f %z'",
result: None,
},
Example {
description: "Get the current time and date and report it without a table",
example: "date --format '%Y-%m-%d %H:%M:%S.%f %z' --raw",
result: None,
},
]
}
}
pub fn date_to_value_raw<T: TimeZone>(dt: DateTime<T>, dt_format: String) -> String
where
T::Offset: Display,
{
let result = dt.format(&dt_format);
format!("{}", result)
}
pub fn date_to_value<T: TimeZone>(dt: DateTime<T>, tag: Tag, dt_format: String) -> Value
where
T::Offset: Display,
{
let mut indexmap = IndexMap::new();
if dt_format.is_empty() {
indexmap.insert(
"year".to_string(),
UntaggedValue::int(dt.year()).into_value(&tag),
);
indexmap.insert(
"month".to_string(),
UntaggedValue::int(dt.month()).into_value(&tag),
);
indexmap.insert(
"day".to_string(),
UntaggedValue::int(dt.day()).into_value(&tag),
);
indexmap.insert(
"hour".to_string(),
UntaggedValue::int(dt.hour()).into_value(&tag),
);
indexmap.insert(
"minute".to_string(),
UntaggedValue::int(dt.minute()).into_value(&tag),
);
indexmap.insert(
"second".to_string(),
UntaggedValue::int(dt.second()).into_value(&tag),
);
let tz = dt.offset();
indexmap.insert(
"timezone".to_string(),
UntaggedValue::string(format!("{}", tz)).into_value(&tag),
);
} else {
let result = dt.format(&dt_format);
indexmap.insert(
"formatted".to_string(),
UntaggedValue::string(format!("{}", result)).into_value(&tag),
);
}
UntaggedValue::Row(Dictionary::from(indexmap)).into_value(&tag)
}
pub async fn date(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.call_info.name_tag.clone();
let raw = args.has("raw");
let dt_fmt = if args.has("format") {
if let Some(dt_fmt) = args.get("format") {
dt_fmt.convert_to_string()
} else {
"".to_string()
}
} else {
"".to_string()
};
let value = if args.has("utc") {
let utc: DateTime<Utc> = Utc::now();
if raw {
UntaggedValue::string(date_to_value_raw(utc, dt_fmt)).into_untagged_value()
} else {
date_to_value(utc, tag, dt_fmt)
}
} else {
let local: DateTime<Local> = Local::now();
if raw {
UntaggedValue::string(date_to_value_raw(local, dt_fmt)).into_untagged_value()
} else {
date_to_value(local, tag, dt_fmt)
}
};
Ok(OutputStream::one(value))
}
#[cfg(test)]
mod tests {
use super::Date;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Date {})
}
}

View File

@@ -0,0 +1,65 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct Debug;
#[derive(Deserialize)]
pub struct DebugArgs {
raw: bool,
}
#[async_trait]
impl WholeStreamCommand for Debug {
fn name(&self) -> &str {
"debug"
}
fn signature(&self) -> Signature {
Signature::build("debug").switch("raw", "Prints the raw value representation.", Some('r'))
}
fn usage(&self) -> &str {
"Print the Rust debug representation of the values"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
debug_value(args, registry).await
}
}
async fn debug_value(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (DebugArgs { raw }, input) = args.process(&registry).await?;
Ok(input
.map(move |v| {
if raw {
ReturnSuccess::value(
UntaggedValue::string(format!("{:#?}", v)).into_untagged_value(),
)
} else {
ReturnSuccess::debug_value(v)
}
})
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Debug;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Debug {})
}
}

View File

@@ -0,0 +1,93 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use nu_value_ext::ValueExt;
#[derive(Deserialize)]
struct DefaultArgs {
column: Tagged<String>,
value: Value,
}
pub struct Default;
#[async_trait]
impl WholeStreamCommand for Default {
fn name(&self) -> &str {
"default"
}
fn signature(&self) -> Signature {
Signature::build("default")
.required("column name", SyntaxShape::String, "the name of the column")
.required(
"column value",
SyntaxShape::Any,
"the value of the column to default",
)
}
fn usage(&self) -> &str {
"Sets a default row's column if missing."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
default(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Give a default 'target' to all file entries",
example: "ls -af | default target 'nothing'",
result: None,
}]
}
}
async fn default(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (DefaultArgs { column, value }, input) = args.process(&registry).await?;
Ok(input
.map(move |item| {
let should_add = match item {
Value {
value: UntaggedValue::Row(ref r),
..
} => r.get_data(&column.item).borrow().is_none(),
_ => false,
};
if should_add {
match item.insert_data_at_path(&column.item, value.clone()) {
Some(new_value) => ReturnSuccess::value(new_value),
None => ReturnSuccess::value(item),
}
} else {
ReturnSuccess::value(item)
}
})
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Default;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Default {})
}
}

View File

@@ -0,0 +1,111 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{hir::Block, ReturnSuccess, Signature, SyntaxShape, Value};
pub struct Do;
#[derive(Deserialize, Debug)]
struct DoArgs {
block: Block,
ignore_errors: bool,
}
#[async_trait]
impl WholeStreamCommand for Do {
fn name(&self) -> &str {
"do"
}
fn signature(&self) -> Signature {
Signature::build("with-env")
.required("block", SyntaxShape::Block, "the block to run ")
.switch(
"ignore_errors",
"ignore errors as the block runs",
Some('i'),
)
}
fn usage(&self) -> &str {
"Runs a block, optionally ignoring errors"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
do_(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Run the block",
example: r#"do { echo hello }"#,
result: Some(vec![Value::from("hello")]),
},
Example {
description: "Run the block and ignore errors",
example: r#"do -i { thisisnotarealcommand }"#,
result: Some(vec![Value::nothing()]),
},
]
}
}
async fn do_(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let is_last = raw_args.call_info.args.is_last;
let mut context = Context::from_raw(&raw_args, &registry);
let scope = raw_args.call_info.scope.clone();
let (
DoArgs {
ignore_errors,
mut block,
},
input,
) = raw_args.process(&registry).await?;
block.set_is_last(is_last);
let result = run_block(
&block,
&mut context,
input,
&scope.it,
&scope.vars,
&scope.env,
)
.await;
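// With --ignore_errors the block's output is drained eagerly so that any
// errors recorded while it ran can be cleared; if the block itself fails,
// a single `nothing` value is returned instead of the error.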
if ignore_errors {
match result {
Ok(mut stream) => {
let output = stream.drain_vec().await;
context.clear_errors();
Ok(futures::stream::iter(output).to_output_stream())
}
Err(_) => Ok(OutputStream::one(ReturnSuccess::value(Value::nothing()))),
}
} else {
result.map(|x| x.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::Do;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Do {})
}
}

View File

@@ -0,0 +1,83 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct Drop;
#[derive(Deserialize)]
pub struct DropArgs {
rows: Option<Tagged<u64>>,
}
#[async_trait]
impl WholeStreamCommand for Drop {
fn name(&self) -> &str {
"drop"
}
fn signature(&self) -> Signature {
Signature::build("drop").optional(
"rows",
SyntaxShape::Number,
"starting from the back, the number of rows to drop",
)
}
fn usage(&self) -> &str {
"Drop the last number of rows."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let (DropArgs { rows }, input) = args.process(&registry).await?;
let mut v: Vec<_> = input.into_vec().await;
let rows_to_drop = if let Some(quantity) = rows {
*quantity as usize
} else {
1
};
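// Buffer the whole stream, pop the requested number of rows (default 1)
// off the back, then re-stream what remains.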
for _ in 0..rows_to_drop {
v.pop();
}
Ok(futures::stream::iter(v).to_output_stream())
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Remove the last item of a list/table",
example: "echo [1 2 3] | drop",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
]),
},
Example {
description: "Remove the last 2 items of a list/table",
example: "echo [1 2 3] | drop 2",
result: Some(vec![UntaggedValue::int(1).into()]),
},
]
}
}
#[cfg(test)]
mod tests {
use super::Drop;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Drop {})
}
}

View File

@@ -0,0 +1,434 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use filesize::file_real_size_fast;
use glob::*;
use indexmap::map::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use std::path::PathBuf;
use std::sync::atomic::Ordering;
const NAME: &str = "du";
const GLOB_PARAMS: MatchOptions = MatchOptions {
case_sensitive: true,
require_literal_separator: true,
require_literal_leading_dot: false,
};
pub struct Du;
#[derive(Deserialize, Clone)]
pub struct DuArgs {
path: Option<Tagged<PathBuf>>,
all: bool,
deref: bool,
exclude: Option<Tagged<String>>,
#[serde(rename = "max-depth")]
max_depth: Option<Tagged<u64>>,
#[serde(rename = "min-size")]
min_size: Option<Tagged<u64>>,
}
#[async_trait]
impl WholeStreamCommand for Du {
fn name(&self) -> &str {
NAME
}
fn signature(&self) -> Signature {
Signature::build(NAME)
.optional("path", SyntaxShape::Pattern, "starting directory")
.switch(
"all",
"Output file sizes as well as directory sizes",
Some('a'),
)
.switch(
"deref",
"Dereference symlinks to their targets for size",
Some('r'),
)
.named(
"exclude",
SyntaxShape::Pattern,
"Exclude these file names",
Some('x'),
)
.named(
"max-depth",
SyntaxShape::Int,
"Directory recursion limit",
Some('d'),
)
.named(
"min-size",
SyntaxShape::Int,
"Exclude files below this size",
Some('m'),
)
}
fn usage(&self) -> &str {
"Find disk usage sizes of specified items"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
du(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Disk usage of the current directory",
example: "du",
result: None,
}]
}
}
async fn du(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let tag = args.call_info.name_tag.clone();
let ctrl_c = args.ctrl_c.clone();
let ctrl_c_copy = ctrl_c.clone();
let (args, _): (DuArgs, _) = args.process(&registry).await?;
let exclude = args.exclude.map_or(Ok(None), move |x| {
Pattern::new(&x.item)
.map(Option::Some)
.map_err(|e| ShellError::labeled_error(e.msg, "glob error", x.tag.clone()))
})?;
let include_files = args.all;
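// Unless --all is set, only directories survive the glob expansion; file
// sizes are still accumulated while walking each directory below.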
let paths = match args.path {
Some(p) => {
let p = p.item.to_str().expect("Why isn't this encoded properly?");
glob::glob_with(p, GLOB_PARAMS)
}
None => glob::glob_with("*", GLOB_PARAMS),
}
.map_err(|e| ShellError::labeled_error(e.msg, "glob error", tag.clone()))?
.filter(move |p| {
if include_files {
true
} else {
match p {
Ok(f) if f.is_dir() => true,
Err(e) if e.path().is_dir() => true,
_ => false,
}
}
})
.map(|v| v.map_err(glob_err_into));
let all = args.all;
let deref = args.deref;
let max_depth = args.max_depth.map(|f| f.item);
let min_size = args.min_size.map(|f| f.item);
let params = DirBuilder {
tag: tag.clone(),
min: min_size,
deref,
exclude,
all,
};
let inp = futures::stream::iter(paths);
Ok(inp
.flat_map(move |path| match path {
Ok(p) => {
let mut output = vec![];
if p.is_dir() {
output.push(Ok(ReturnSuccess::Value(
DirInfo::new(p, &params, max_depth, ctrl_c.clone()).into(),
)));
} else {
for v in FileInfo::new(p, deref, tag.clone()).into_iter() {
output.push(Ok(ReturnSuccess::Value(v.into())));
}
}
futures::stream::iter(output)
}
Err(e) => futures::stream::iter(vec![Err(e)]),
})
.interruptible(ctrl_c_copy)
.to_output_stream())
}
pub struct DirBuilder {
tag: Tag,
min: Option<u64>,
deref: bool,
exclude: Option<Pattern>,
all: bool,
}
impl DirBuilder {
pub fn new(
tag: Tag,
min: Option<u64>,
deref: bool,
exclude: Option<Pattern>,
all: bool,
) -> DirBuilder {
DirBuilder {
tag,
min,
deref,
exclude,
all,
}
}
}
pub struct DirInfo {
dirs: Vec<DirInfo>,
files: Vec<FileInfo>,
errors: Vec<ShellError>,
size: u64,
blocks: u64,
path: PathBuf,
tag: Tag,
}
struct FileInfo {
path: PathBuf,
size: u64,
blocks: Option<u64>,
tag: Tag,
}
impl FileInfo {
fn new(path: impl Into<PathBuf>, deref: bool, tag: Tag) -> Result<Self, ShellError> {
let path = path.into();
let m = if deref {
std::fs::metadata(&path)
} else {
std::fs::symlink_metadata(&path)
};
match m {
Ok(d) => {
let block_size = file_real_size_fast(&path, &d).ok();
Ok(FileInfo {
path,
blocks: block_size,
size: d.len(),
tag,
})
}
Err(e) => Err(e.into()),
}
}
}
impl DirInfo {
pub fn new(
path: impl Into<PathBuf>,
params: &DirBuilder,
depth: Option<u64>,
ctrl_c: Arc<AtomicBool>,
) -> Self {
let path = path.into();
let mut s = Self {
dirs: Vec::new(),
errors: Vec::new(),
files: Vec::new(),
size: 0,
blocks: 0,
tag: params.tag.clone(),
path,
};
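// Walk the directory entries, recursing into subdirectories (bounded by
// --max-depth) and stopping early if Ctrl-C has been pressed.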
match std::fs::read_dir(&s.path) {
Ok(d) => {
for f in d {
if ctrl_c.load(Ordering::SeqCst) {
break;
}
match f {
Ok(i) => match i.file_type() {
Ok(t) if t.is_dir() => {
s = s.add_dir(i.path(), depth, &params, ctrl_c.clone())
}
Ok(_t) => s = s.add_file(i.path(), &params),
Err(e) => s = s.add_error(e.into()),
},
Err(e) => s = s.add_error(e.into()),
}
}
}
Err(e) => s = s.add_error(e.into()),
}
s
}
fn add_dir(
mut self,
path: impl Into<PathBuf>,
mut depth: Option<u64>,
params: &DirBuilder,
ctrl_c: Arc<AtomicBool>,
) -> Self {
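// Spend one unit of the remaining recursion budget; once --max-depth is
// exhausted the subdirectory is skipped entirely.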
if let Some(current) = depth {
if let Some(new) = current.checked_sub(1) {
depth = Some(new);
} else {
return self;
}
}
let d = DirInfo::new(path, &params, depth, ctrl_c);
self.size += d.size;
self.blocks += d.blocks;
self.dirs.push(d);
self
}
fn add_file(mut self, f: impl Into<PathBuf>, params: &DirBuilder) -> Self {
let f = f.into();
let include = params
.exclude
.as_ref()
.map_or(true, |x| !x.matches_path(&f));
if include {
match FileInfo::new(f, params.deref, self.tag.clone()) {
Ok(file) => {
let inc = params.min.map_or(true, |s| file.size >= s);
if inc {
self.size += file.size;
self.blocks += file.blocks.unwrap_or(0);
if params.all {
self.files.push(file);
}
}
}
Err(e) => self = self.add_error(e),
}
}
self
}
fn add_error(mut self, e: ShellError) -> Self {
self.errors.push(e);
self
}
pub fn get_size(&self) -> u64 {
self.size
}
}
fn glob_err_into(e: GlobError) -> ShellError {
let e = e.into_error();
ShellError::from(e)
}
fn value_from_vec<V>(vec: Vec<V>, tag: &Tag) -> Value
where
V: Into<Value>,
{
if vec.is_empty() {
UntaggedValue::nothing()
} else {
let values = vec.into_iter().map(Into::into).collect::<Vec<Value>>();
UntaggedValue::Table(values)
}
.retag(tag)
}
impl From<DirInfo> for Value {
fn from(d: DirInfo) -> Self {
let mut r: IndexMap<String, Value> = IndexMap::new();
r.insert(
"path".to_string(),
UntaggedValue::path(d.path).retag(&d.tag),
);
r.insert(
"apparent".to_string(),
UntaggedValue::bytes(d.size).retag(&d.tag),
);
r.insert(
"physical".to_string(),
UntaggedValue::bytes(d.blocks).retag(&d.tag),
);
r.insert("directories".to_string(), value_from_vec(d.dirs, &d.tag));
r.insert("files".to_string(), value_from_vec(d.files, &d.tag));
if !d.errors.is_empty() {
let v = UntaggedValue::Table(
d.errors
.into_iter()
.map(move |e| UntaggedValue::Error(e).into_untagged_value())
.collect::<Vec<Value>>(),
)
.retag(&d.tag);
r.insert("errors".to_string(), v);
}
Value {
value: UntaggedValue::row(r),
tag: d.tag,
}
}
}
impl From<FileInfo> for Value {
fn from(f: FileInfo) -> Self {
let mut r: IndexMap<String, Value> = IndexMap::new();
r.insert(
"path".to_string(),
UntaggedValue::path(f.path).retag(&f.tag),
);
r.insert(
"apparent".to_string(),
UntaggedValue::bytes(f.size).retag(&f.tag),
);
let b = f
.blocks
.map(UntaggedValue::bytes)
.unwrap_or_else(UntaggedValue::nothing)
.retag(&f.tag);
r.insert("physical".to_string(), b);
r.insert(
"directories".to_string(),
UntaggedValue::nothing().retag(&f.tag),
);
r.insert("files".to_string(), UntaggedValue::nothing().retag(&f.tag));
UntaggedValue::row(r).retag(&f.tag)
}
}
#[cfg(test)]
mod tests {
use super::Du;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Du {})
}
}

View File

@@ -0,0 +1,141 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::stream::once;
use nu_errors::ShellError;
use nu_protocol::{
hir::Block, hir::Expression, hir::SpannedExpression, hir::Synthetic, Scope, Signature,
SyntaxShape, UntaggedValue, Value,
};
pub struct Each;
#[derive(Deserialize)]
pub struct EachArgs {
block: Block,
}
#[async_trait]
impl WholeStreamCommand for Each {
fn name(&self) -> &str {
"each"
}
fn signature(&self) -> Signature {
Signature::build("each").required(
"block",
SyntaxShape::Block,
"the block to run on each row",
)
}
fn usage(&self) -> &str {
"Run a block on each row of the table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
each(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Echo the square of each integer",
example: "echo [1 2 3] | each { echo $(= $it * $it) }",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(4).into(),
UntaggedValue::int(9).into(),
]),
},
Example {
description: "Echo the sum of each row",
example: "echo [[1 2] [3 4]] | each { echo $it | math sum }",
result: Some(vec![
UntaggedValue::int(3).into(),
UntaggedValue::int(7).into(),
]),
},
]
}
}
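// A synthetic "expanded-each" head means the block already references $it
// explicitly, so the current row should not also be piped in as input.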
fn is_expanded_it_usage(head: &SpannedExpression) -> bool {
match &*head {
SpannedExpression {
expr: Expression::Synthetic(Synthetic::String(s)),
..
} if s == "expanded-each" => true,
_ => false,
}
}
async fn process_row(
block: Arc<Block>,
scope: Arc<Scope>,
head: Arc<Box<SpannedExpression>>,
mut context: Arc<Context>,
input: Value,
) -> Result<OutputStream, ShellError> {
let input_clone = input.clone();
let input_stream = if is_expanded_it_usage(&head) {
InputStream::empty()
} else {
once(async { Ok(input_clone) }).to_input_stream()
};
Ok(run_block(
&block,
Arc::make_mut(&mut context),
input_stream,
&input,
&scope.vars,
&scope.env,
)
.await?
.to_output_stream())
}
async fn each(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let head = Arc::new(raw_args.call_info.args.head.clone());
let scope = Arc::new(raw_args.call_info.scope.clone());
let context = Arc::new(Context::from_raw(&raw_args, &registry));
let (each_args, input): (EachArgs, _) = raw_args.process(&registry).await?;
let block = Arc::new(each_args.block);
Ok(input
.then(move |input| {
let block = block.clone();
let scope = scope.clone();
let head = head.clone();
let context = context.clone();
async {
match process_row(block, scope, head, context, input).await {
Ok(s) => s,
Err(e) => OutputStream::one(Err(e)),
}
}
})
.flatten()
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Each;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Each {})
}
}

View File

@@ -0,0 +1,122 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::hir::Operator;
use nu_protocol::{
Primitive, RangeInclusion, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
pub struct Echo;
#[derive(Deserialize)]
pub struct EchoArgs {
pub rest: Vec<Value>,
}
#[async_trait]
impl WholeStreamCommand for Echo {
fn name(&self) -> &str {
"echo"
}
fn signature(&self) -> Signature {
Signature::build("echo").rest(SyntaxShape::Any, "the values to echo")
}
fn usage(&self) -> &str {
"Echo the arguments back to the user."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
echo(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Put a hello message in the pipeline",
example: "echo 'hello'",
result: Some(vec![Value::from("hello")]),
},
Example {
description: "Print the value of the special '$nu' variable",
example: "echo $nu",
result: None,
},
]
}
}
async fn echo(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (args, _): (EchoArgs, _) = args.process(&registry).await?;
let stream = args.rest.into_iter().map(|i| {
match i.as_string() {
Ok(s) => {
OutputStream::one(Ok(ReturnSuccess::Value(
UntaggedValue::string(s).into_value(i.tag.clone()),
)))
}
_ => match i {
Value {
value: UntaggedValue::Table(table),
..
} => {
futures::stream::iter(table.into_iter().map(ReturnSuccess::value)).to_output_stream()
}
Value {
value: UntaggedValue::Primitive(Primitive::Range(range)),
tag
} => {
let mut output_vec = vec![];
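// Expand the range by repeatedly adding 1 to the current primitive until
// the upper bound is reached; an inclusive range also emits the bound.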
let mut current = range.from.0.item;
while current != range.to.0.item {
output_vec.push(Ok(ReturnSuccess::Value(UntaggedValue::Primitive(current.clone()).into_value(&tag))));
current = match crate::data::value::compute_values(Operator::Plus, &UntaggedValue::Primitive(current), &UntaggedValue::int(1)) {
Ok(result) => match result {
UntaggedValue::Primitive(p) => p,
_ => {
return OutputStream::one(Err(ShellError::unimplemented("Internal error: expected a primitive result from increment")));
}
},
Err((left_type, right_type)) => {
return OutputStream::one(Err(ShellError::coerce_error(
left_type.spanned(tag.span),
right_type.spanned(tag.span),
)));
}
}
}
if let RangeInclusion::Inclusive = range.to.1 {
output_vec.push(Ok(ReturnSuccess::Value(UntaggedValue::Primitive(current).into_value(&tag))));
}
futures::stream::iter(output_vec.into_iter()).to_output_stream()
}
_ => {
OutputStream::one(Ok(ReturnSuccess::Value(i.clone())))
}
},
}
});
Ok(futures::stream::iter(stream).flatten().to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Echo;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Echo {})
}
}

View File

@@ -0,0 +1,209 @@
use crate::commands::UnevaluatedCallInfo;
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
CommandAction, Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct Enter;
#[derive(Deserialize)]
pub struct EnterArgs {
location: Tagged<PathBuf>,
encoding: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for Enter {
fn name(&self) -> &str {
"enter"
}
fn signature(&self) -> Signature {
Signature::build("enter")
.required(
"location",
SyntaxShape::Path,
"the location to create a new shell from",
)
.named(
"encoding",
SyntaxShape::String,
"encoding to use to open file",
Some('e'),
)
}
fn usage(&self) -> &str {
r#"Create a new shell and begin at this path.
Multiple encodings are supported for reading text files by using
the '--encoding <encoding>' parameter. Here is an example of a few:
big5, euc-jp, euc-kr, gbk, iso-8859-1, utf-16, cp1252, latin5
For a more complete list of encodings please refer to the encoding_rs
documentation link at https://docs.rs/encoding_rs/0.8.23/encoding_rs/#statics"#
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
enter(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Enter a path as a new shell",
example: "enter ../projectB",
result: None,
},
Example {
description: "Enter a file as a new shell",
example: "enter package.json",
result: None,
},
Example {
description: "Enters file with iso-8859-1 encoding",
example: "enter file.csv --encoding iso-8859-1",
result: None,
},
]
}
}
async fn enter(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let scope = raw_args.call_info.scope.clone();
let shell_manager = raw_args.shell_manager.clone();
let head = raw_args.call_info.args.head.clone();
let ctrl_c = raw_args.ctrl_c.clone();
let current_errors = raw_args.current_errors.clone();
let host = raw_args.host.clone();
let tag = raw_args.call_info.name_tag.clone();
let (EnterArgs { location, encoding }, _) = raw_args.process(&registry).await?;
let location_string = location.display().to_string();
let location_clone = location_string.clone();
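// `enter help` (or `enter help:<command>`) switches to the built-in help
// shell instead of treating the argument as a filesystem path.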
if location_string.starts_with("help") {
let spec = location_string.split(':').collect::<Vec<&str>>();
if spec.len() == 2 {
let (_, command) = (spec[0], spec[1]);
if registry.has(command) {
return Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterHelpShell(
UntaggedValue::string(command).into_value(Tag::unknown()),
),
)));
}
}
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterHelpShell(UntaggedValue::nothing().into_value(Tag::unknown())),
)))
} else if location.is_dir() {
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterShell(location_clone),
)))
} else {
// If it's a file, attempt to open the file as a value and enter it
let cwd = shell_manager.path();
let full_path = std::path::PathBuf::from(cwd);
let (file_extension, contents, contents_tag) = crate::commands::open::fetch(
&full_path,
&PathBuf::from(location_clone),
tag.span,
match encoding {
Some(e) => e.to_string(),
_ => "".to_string(),
},
)
.await?;
match contents {
UntaggedValue::Primitive(Primitive::String(_)) => {
let tagged_contents = contents.into_value(&contents_tag);
if let Some(extension) = file_extension {
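// If a `from <extension>` converter is registered for this file type, run
// it on the contents so the new shell is entered on structured data rather
// than the raw string.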
let command_name = format!("from {}", extension);
if let Some(converter) = registry.get_command(&command_name) {
let new_args = RawCommandArgs {
host,
ctrl_c,
current_errors,
shell_manager,
call_info: UnevaluatedCallInfo {
args: nu_protocol::hir::Call {
head,
positional: None,
named: None,
span: Span::unknown(),
is_last: false,
},
name_tag: tag.clone(),
scope: scope.clone(),
},
};
let mut result = converter
.run(new_args.with_input(vec![tagged_contents]), &registry)
.await?;
let result_vec: Vec<Result<ReturnSuccess, ShellError>> =
result.drain_vec().await;
Ok(futures::stream::iter(result_vec.into_iter().map(
move |res| match res {
Ok(ReturnSuccess::Value(Value { value, .. })) => Ok(
ReturnSuccess::Action(CommandAction::EnterValueShell(Value {
value,
tag: contents_tag.clone(),
})),
),
x => x,
},
))
.to_output_stream())
} else {
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterValueShell(tagged_contents),
)))
}
} else {
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterValueShell(tagged_contents),
)))
}
}
_ => {
let tagged_contents = contents.into_value(contents_tag);
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterValueShell(tagged_contents),
)))
}
}
}
}
#[cfg(test)]
mod tests {
use super::Enter;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Enter {})
}
}

View File

@@ -0,0 +1,83 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::utils::data_processing::{evaluate, fetch};
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::{SpannedItem, Tagged};
use nu_value_ext::ValueExt;
pub struct EvaluateBy;
#[derive(Deserialize)]
pub struct EvaluateByArgs {
evaluate_with: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for EvaluateBy {
fn name(&self) -> &str {
"evaluate-by"
}
fn signature(&self) -> Signature {
Signature::build("evaluate-by").named(
"evaluate_with",
SyntaxShape::String,
"the name of the column to evaluate by",
Some('w'),
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the tables rows evaluated by the column given."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
evaluate_by(args, registry).await
}
}
pub async fn evaluate_by(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (EvaluateByArgs { evaluate_with }, mut input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
))
} else {
let evaluate_with = if let Some(evaluator) = evaluate_with {
Some(evaluator.item().clone())
} else {
None
};
match evaluate(&values[0], evaluate_with, name) {
Ok(evaluated) => Ok(OutputStream::one(ReturnSuccess::value(evaluated))),
Err(err) => Err(err),
}
}
}
#[cfg(test)]
mod tests {
use super::EvaluateBy;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(EvaluateBy {})
}
}

View File

@@ -0,0 +1,105 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct Every;
#[derive(Deserialize)]
pub struct EveryArgs {
stride: Tagged<u64>,
skip: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Every {
fn name(&self) -> &str {
"every"
}
fn signature(&self) -> Signature {
Signature::build(self.name())
.required(
"stride",
SyntaxShape::Int,
"how many rows to skip between (and including) each row returned",
)
.switch(
"skip",
"skip the rows that would be returned, instead of selecting them",
Some('s'),
)
}
fn usage(&self) -> &str {
"Show (or skip) every n-th row, starting from the first one."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
every(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Get every second row",
example: "echo [1 2 3 4 5] | every 2",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(3).into(),
UntaggedValue::int(5).into(),
]),
},
Example {
description: "Skip every second row",
example: "echo [1 2 3 4 5] | every 2 --skip",
result: Some(vec![
UntaggedValue::int(2).into(),
UntaggedValue::int(4).into(),
]),
},
]
}
}
async fn every(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (EveryArgs { stride, skip }, input) = args.process(&registry).await?;
let v: Vec<_> = input.into_vec().await;
let stride_desired = if stride.item < 1 { 1 } else { stride.item } as usize;
let mut values_vec_deque = VecDeque::new();
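// Rows whose zero-based index is a multiple of the stride are selected;
// with --skip the complement is emitted instead.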
for (i, x) in v.iter().enumerate() {
let should_include = if skip.item {
i % stride_desired != 0
} else {
i % stride_desired == 0
};
if should_include {
values_vec_deque.push_back(ReturnSuccess::value(x.clone()));
}
}
Ok(futures::stream::iter(values_vec_deque).to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Every;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Every {})
}
}

View File

@@ -0,0 +1,73 @@
use crate::commands::command::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{CommandAction, ReturnSuccess, Signature};
pub struct Exit;
#[async_trait]
impl WholeStreamCommand for Exit {
fn name(&self) -> &str {
"exit"
}
fn signature(&self) -> Signature {
Signature::build("exit").switch("now", "exit out of the shell immediately", Some('n'))
}
fn usage(&self) -> &str {
"Exit the current shell (or all shells)"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
exit(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Exit the current shell",
example: "exit",
result: None,
},
Example {
description: "Exit all shells (exiting Nu)",
example: "exit --now",
result: None,
},
]
}
}
pub async fn exit(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let command_action = if args.call_info.args.has("now") {
CommandAction::Exit
} else {
CommandAction::LeaveShell
};
Ok(OutputStream::one(ReturnSuccess::action(command_action)))
}
#[cfg(test)]
mod tests {
use super::Exit;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Exit {})
}
}

View File

@@ -0,0 +1,82 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct First;
#[derive(Deserialize)]
pub struct FirstArgs {
rows: Option<Tagged<usize>>,
}
#[async_trait]
impl WholeStreamCommand for First {
fn name(&self) -> &str {
"first"
}
fn signature(&self) -> Signature {
Signature::build("first").optional(
"rows",
SyntaxShape::Int,
"starting from the front, the number of rows to return",
)
}
fn usage(&self) -> &str {
"Show only the first number of rows."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
first(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Return the first item of a list/table",
example: "echo [1 2 3] | first",
result: Some(vec![UntaggedValue::int(1).into()]),
},
Example {
description: "Return the first 2 items of a list/table",
example: "echo [1 2 3] | first 2",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
]),
},
]
}
}
async fn first(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (FirstArgs { rows }, input) = args.process(&registry).await?;
let rows_desired = if let Some(quantity) = rows {
*quantity
} else {
1
};
Ok(input.take(rows_desired).to_output_stream())
}
#[cfg(test)]
mod tests {
use super::First;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(First {})
}
}

View File

@@ -0,0 +1,163 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::evaluate::evaluate_baseline_expr;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
use std::borrow::Borrow;
pub struct Format;
#[derive(Deserialize)]
pub struct FormatArgs {
pattern: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for Format {
fn name(&self) -> &str {
"format"
}
fn signature(&self) -> Signature {
Signature::build("format").required(
"pattern",
SyntaxShape::String,
"the pattern to output. Eg) \"{foo}: {bar}\"",
)
}
fn usage(&self) -> &str {
"Format columns into a string using a simple pattern."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
format_command(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Print filenames with their sizes",
example: "ls | format '{name}: {size}'",
result: None,
}]
}
}
async fn format_command(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = Arc::new(registry.clone());
let scope = Arc::new(args.call_info.scope.clone());
let (FormatArgs { pattern }, input) = args.process(&registry).await?;
let format_pattern = format(&pattern);
let commands = Arc::new(format_pattern);
Ok(input
.then(move |value| {
let mut output = String::new();
let commands = commands.clone();
let registry = registry.clone();
let scope = scope.clone();
async move {
for command in &*commands {
match command {
FormatCommand::Text(s) => {
output.push_str(&s);
}
FormatCommand::Column(c) => {
// FIXME: use the correct spans
let full_column_path = nu_parser::parse_full_column_path(
&(c.to_string()).spanned(Span::unknown()),
&*registry,
);
let result = evaluate_baseline_expr(
&full_column_path.0,
&registry,
&value,
&scope.vars,
&scope.env,
)
.await;
if let Ok(c) = result {
output
.push_str(&value::format_leaf(c.borrow()).plain_string(100_000))
} else {
// That column doesn't match, so don't emit anything
}
}
}
}
ReturnSuccess::value(UntaggedValue::string(output).into_untagged_value())
}
})
.to_output_stream())
}
#[derive(Debug)]
enum FormatCommand {
Text(String),
Column(String),
}
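// Split the pattern into alternating literal text and {column} segments,
// e.g. "{name}: {size}" becomes [Column(name), Text(": "), Column(size)].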
fn format(input: &str) -> Vec<FormatCommand> {
let mut output = vec![];
let mut loop_input = input.chars();
loop {
let mut before = String::new();
while let Some(c) = loop_input.next() {
if c == '{' {
break;
}
before.push(c);
}
if !before.is_empty() {
output.push(FormatCommand::Text(before.to_string()));
}
// Look for column as we're now at one
let mut column = String::new();
while let Some(c) = loop_input.next() {
if c == '}' {
break;
}
column.push(c);
}
if !column.is_empty() {
output.push(FormatCommand::Column(column.to_string()));
}
if before.is_empty() && column.is_empty() {
break;
}
}
output
}
#[cfg(test)]
mod tests {
use super::Format;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Format {})
}
}

View File

@@ -0,0 +1,45 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct From;
#[async_trait]
impl WholeStreamCommand for From {
fn name(&self) -> &str {
"from"
}
fn signature(&self) -> Signature {
Signature::build("from")
}
fn usage(&self) -> &str {
"Parse content (string or binary) as a table (input format based on subcommand, like csv, ini, json, toml)"
}
async fn run(
&self,
_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(crate::commands::help::get_help(&From, &registry))
.into_value(Tag::unknown()),
)))
}
}
#[cfg(test)]
mod tests {
use super::From;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(From {})
}
}

View File

@@ -0,0 +1,244 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use bson::{decode_document, spec::BinarySubtype, Bson};
use nu_errors::{ExpectedRange, ShellError};
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
use nu_source::SpannedItem;
use std::str::FromStr;
pub struct FromBSON;
#[async_trait]
impl WholeStreamCommand for FromBSON {
fn name(&self) -> &str {
"from bson"
}
fn signature(&self) -> Signature {
Signature::build("from bson")
}
fn usage(&self) -> &str {
"Parse binary as .bson and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_bson(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Convert bson data to a table",
example: "open file.bin | from bson",
result: None,
}]
}
}
fn bson_array(input: &[Bson], tag: Tag) -> Result<Vec<Value>, ShellError> {
let mut out = vec![];
for value in input {
out.push(convert_bson_value_to_nu_value(value, &tag)?);
}
Ok(out)
}
fn convert_bson_value_to_nu_value(v: &Bson, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let tag = tag.into();
let span = tag.span;
Ok(match v {
Bson::FloatingPoint(n) => UntaggedValue::Primitive(Primitive::from(*n)).into_value(&tag),
Bson::String(s) => {
UntaggedValue::Primitive(Primitive::String(String::from(s))).into_value(&tag)
}
Bson::Array(a) => UntaggedValue::Table(bson_array(a, tag.clone())?).into_value(&tag),
Bson::Document(doc) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
for (k, v) in doc.iter() {
collected.insert_value(k.clone(), convert_bson_value_to_nu_value(v, &tag)?);
}
collected.into_value()
}
Bson::Boolean(b) => UntaggedValue::Primitive(Primitive::Boolean(*b)).into_value(&tag),
Bson::Null => UntaggedValue::Primitive(Primitive::Nothing).into_value(&tag),
Bson::RegExp(r, opts) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
collected.insert_value(
"$regex".to_string(),
UntaggedValue::Primitive(Primitive::String(String::from(r))).into_value(&tag),
);
collected.insert_value(
"$options".to_string(),
UntaggedValue::Primitive(Primitive::String(String::from(opts))).into_value(&tag),
);
collected.into_value()
}
Bson::I32(n) => UntaggedValue::int(*n).into_value(&tag),
Bson::I64(n) => UntaggedValue::int(*n).into_value(&tag),
Bson::Decimal128(n) => {
// TODO: this really isn't great, and we should update this to do a higher
// fidelity translation
let decimal = BigDecimal::from_str(&format!("{}", n)).map_err(|_| {
ShellError::range_error(
ExpectedRange::BigDecimal,
&n.spanned(span),
"converting BSON Decimal128 to BigDecimal".to_owned(),
)
})?;
UntaggedValue::Primitive(Primitive::Decimal(decimal)).into_value(&tag)
}
Bson::JavaScriptCode(js) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
collected.insert_value(
"$javascript".to_string(),
UntaggedValue::Primitive(Primitive::String(String::from(js))).into_value(&tag),
);
collected.into_value()
}
Bson::JavaScriptCodeWithScope(js, doc) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
collected.insert_value(
"$javascript".to_string(),
UntaggedValue::Primitive(Primitive::String(String::from(js))).into_value(&tag),
);
collected.insert_value(
"$scope".to_string(),
convert_bson_value_to_nu_value(&Bson::Document(doc.to_owned()), tag)?,
);
collected.into_value()
}
Bson::TimeStamp(ts) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
collected.insert_value(
"$timestamp".to_string(),
UntaggedValue::int(*ts).into_value(&tag),
);
collected.into_value()
}
Bson::Binary(bst, bytes) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
collected.insert_value(
"$binary_subtype".to_string(),
match bst {
BinarySubtype::UserDefined(u) => UntaggedValue::int(*u),
_ => {
UntaggedValue::Primitive(Primitive::String(binary_subtype_to_string(*bst)))
}
}
.into_value(&tag),
);
collected.insert_value(
"$binary".to_string(),
UntaggedValue::Primitive(Primitive::Binary(bytes.to_owned())).into_value(&tag),
);
collected.into_value()
}
Bson::ObjectId(obj_id) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
collected.insert_value(
"$object_id".to_string(),
UntaggedValue::Primitive(Primitive::String(obj_id.to_hex())).into_value(&tag),
);
collected.into_value()
}
Bson::UtcDatetime(dt) => UntaggedValue::Primitive(Primitive::Date(*dt)).into_value(&tag),
Bson::Symbol(s) => {
let mut collected = TaggedDictBuilder::new(tag.clone());
collected.insert_value(
"$symbol".to_string(),
UntaggedValue::Primitive(Primitive::String(String::from(s))).into_value(&tag),
);
collected.into_value()
}
})
}
fn binary_subtype_to_string(bst: BinarySubtype) -> String {
match bst {
BinarySubtype::Generic => "generic",
BinarySubtype::Function => "function",
BinarySubtype::BinaryOld => "binary_old",
BinarySubtype::UuidOld => "uuid_old",
BinarySubtype::Uuid => "uuid",
BinarySubtype::Md5 => "md5",
_ => unreachable!(),
}
.to_string()
}
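// decode_document expects an io::Read, so the raw bytes are wrapped in a
// small cursor type that tracks how far they have been consumed.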
#[derive(Debug)]
struct BytesReader {
pos: usize,
inner: Vec<u8>,
}
impl BytesReader {
fn new(bytes: Vec<u8>) -> BytesReader {
BytesReader {
pos: 0,
inner: bytes,
}
}
}
impl std::io::Read for BytesReader {
fn read(&mut self, buf: &mut [u8]) -> std::io::Result<usize> {
let src: &mut &[u8] = &mut self.inner[self.pos..].as_ref();
let diff = src.read(buf)?;
self.pos += diff;
Ok(diff)
}
}
pub fn from_bson_bytes_to_value(bytes: Vec<u8>, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let mut docs = Vec::new();
let mut b_reader = BytesReader::new(bytes);
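// A BSON stream may contain several concatenated documents; keep decoding
// until the reader is exhausted and collect them into a single array.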
while let Ok(v) = decode_document(&mut b_reader) {
docs.push(Bson::Document(v));
}
convert_bson_value_to_nu_value(&Bson::Array(docs), tag)
}
async fn from_bson(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let bytes = input.collect_binary(tag.clone()).await?;
match from_bson_bytes_to_value(bytes.item, tag.clone()) {
Ok(x) => Ok(OutputStream::one(ReturnSuccess::value(x))),
Err(_) => Err(ShellError::labeled_error_with_secondary(
"Could not parse as BSON",
"input cannot be parsed as BSON",
tag.clone(),
"value originates from here",
bytes.tag,
)),
}
}
#[cfg(test)]
mod tests {
use super::FromBSON;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromBSON {})
}
}

View File

@@ -0,0 +1,119 @@
use crate::commands::from_delimited_data::from_delimited_data;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, Signature, SyntaxShape, UntaggedValue, Value};
pub struct FromCSV;
#[derive(Deserialize)]
pub struct FromCSVArgs {
headerless: bool,
separator: Option<Value>,
}
#[async_trait]
impl WholeStreamCommand for FromCSV {
fn name(&self) -> &str {
"from csv"
}
fn signature(&self) -> Signature {
Signature::build("from csv")
.named(
"separator",
SyntaxShape::String,
"a character to separate columns, defaults to ','",
Some('s'),
)
.switch(
"headerless",
"don't treat the first row as column names",
None,
)
}
fn usage(&self) -> &str {
"Parse text as .csv and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_csv(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Convert comma-separated data to a table",
example: "open data.txt | from csv",
result: None,
},
Example {
description: "Convert comma-separated data to a table, ignoring headers",
example: "open data.txt | from csv --headerless",
result: None,
},
Example {
description: "Convert semicolon-separated data to a table",
example: "open data.txt | from csv --separator ';'",
result: None,
},
]
}
}
async fn from_csv(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (
FromCSVArgs {
headerless,
separator,
},
input,
) = args.process(&registry).await?;
let sep = match separator {
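// Accept a single-character separator; the literal string "\t" is
// special-cased so a tab can be passed on the command line.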
Some(Value {
value: UntaggedValue::Primitive(Primitive::String(s)),
tag,
..
}) => {
if s == r"\t" {
'\t'
} else {
let vec_s: Vec<char> = s.chars().collect();
if vec_s.len() != 1 {
return Err(ShellError::labeled_error(
"Expected a single separator char from --separator",
"requires a single character string input",
tag,
));
};
vec_s[0]
}
}
_ => ',',
};
from_delimited_data(headerless, sep, "CSV", input, name).await
}
#[cfg(test)]
mod tests {
use super::FromCSV;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromCSV {})
}
}

View File

@@ -0,0 +1,101 @@
use crate::prelude::*;
use csv::{ErrorKind, ReaderBuilder};
use nu_errors::ShellError;
use nu_protocol::{TaggedDictBuilder, UntaggedValue, Value};
fn from_delimited_string_to_value(
s: String,
headerless: bool,
separator: char,
tag: impl Into<Tag>,
) -> Result<Value, csv::Error> {
let mut reader = ReaderBuilder::new()
.has_headers(!headerless)
.delimiter(separator as u8)
.from_reader(s.as_bytes());
let tag = tag.into();
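// With --headerless the first row is data, so synthetic Column1..ColumnN
// names are generated from its width; otherwise it supplies the headers.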
let headers = if headerless {
(1..=reader.headers()?.len())
.map(|i| format!("Column{}", i))
.collect::<Vec<String>>()
} else {
reader.headers()?.iter().map(String::from).collect()
};
let mut rows = vec![];
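// Each field is parsed as an integer, then as a decimal, before falling
// back to a plain string.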
for row in reader.records() {
let mut tagged_row = TaggedDictBuilder::new(&tag);
for (value, header) in row?.iter().zip(headers.iter()) {
if let Ok(i) = value.parse::<i64>() {
tagged_row.insert_value(header, UntaggedValue::int(i).into_value(&tag))
} else if let Ok(f) = value.parse::<f64>() {
tagged_row.insert_value(header, UntaggedValue::decimal(f).into_value(&tag))
} else {
tagged_row.insert_value(header, UntaggedValue::string(value).into_value(&tag))
}
}
rows.push(tagged_row.into_value());
}
Ok(UntaggedValue::Table(rows).into_value(&tag))
}
pub async fn from_delimited_data(
headerless: bool,
sep: char,
format_name: &'static str,
input: InputStream,
name: Tag,
) -> Result<OutputStream, ShellError> {
let name_tag = name;
let concat_string = input.collect_string(name_tag.clone()).await?;
match from_delimited_string_to_value(concat_string.item, headerless, sep, name_tag.clone()) {
Ok(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => Ok(futures::stream::iter(list).to_output_stream()),
x => Ok(OutputStream::one(x)),
},
Err(err) => {
let line_one = match pretty_csv_error(err) {
Some(pretty) => format!("Could not parse as {} ({})", format_name, pretty),
None => format!("Could not parse as {}", format_name),
};
let line_two = format!("input cannot be parsed as {}", format_name);
Err(ShellError::labeled_error_with_secondary(
line_one,
line_two,
name_tag.clone(),
"value originates from here",
concat_string.tag,
))
}
}
}
fn pretty_csv_error(err: csv::Error) -> Option<String> {
match err.kind() {
ErrorKind::UnequalLengths {
pos,
expected_len,
len,
} => {
if let Some(pos) = pos {
Some(format!(
"Line {}: expected {} fields, found {}",
pos.line(),
expected_len,
len
))
} else {
Some(format!("Expected {} fields, found {}", expected_len, len))
}
}
ErrorKind::Seek => Some("Internal error while parsing csv".to_string()),
_ => None,
}
}

View File

@@ -0,0 +1,140 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ::eml_parser::eml::*;
use ::eml_parser::EmlParser;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue};
use nu_source::Tagged;
pub struct FromEML;
const DEFAULT_BODY_PREVIEW: usize = 50;
#[derive(Deserialize, Clone)]
pub struct FromEMLArgs {
#[serde(rename(deserialize = "preview-body"))]
preview_body: Option<Tagged<usize>>,
}
#[async_trait]
impl WholeStreamCommand for FromEML {
fn name(&self) -> &str {
"from eml"
}
fn signature(&self) -> Signature {
Signature::build("from eml").named(
"preview-body",
SyntaxShape::Int,
"How many bytes of the body to preview",
Some('b'),
)
}
fn usage(&self) -> &str {
"Parse text as .eml and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_eml(args, registry).await
}
}
fn emailaddress_to_value(tag: &Tag, email_address: &EmailAddress) -> TaggedDictBuilder {
let mut dict = TaggedDictBuilder::with_capacity(tag, 2);
let (n, a) = match email_address {
EmailAddress::AddressOnly { address } => {
(UntaggedValue::nothing(), UntaggedValue::string(address))
}
EmailAddress::NameAndEmailAddress { name, address } => {
(UntaggedValue::string(name), UntaggedValue::string(address))
}
};
dict.insert_untagged("Name", n);
dict.insert_untagged("Address", a);
dict
}
fn headerfieldvalue_to_value(tag: &Tag, value: &HeaderFieldValue) -> UntaggedValue {
use HeaderFieldValue::*;
match value {
SingleEmailAddress(address) => emailaddress_to_value(tag, address).into_untagged_value(),
MultipleEmailAddresses(addresses) => UntaggedValue::Table(
addresses
.iter()
.map(|a| emailaddress_to_value(tag, a).into_value())
.collect(),
),
Unstructured(s) => UntaggedValue::string(s),
Empty => UntaggedValue::nothing(),
}
}
async fn from_eml(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let registry = registry.clone();
let (eml_args, input): (FromEMLArgs, _) = args.process(&registry).await?;
let value = input.collect_string(tag.clone()).await?;
let body_preview = eml_args
.preview_body
.map(|b| b.item)
.unwrap_or(DEFAULT_BODY_PREVIEW);
let eml = EmlParser::from_string(value.item)
.with_body_preview(body_preview)
.parse()
.map_err(|_| {
ShellError::labeled_error(
"Could not parse .eml file",
"could not parse .eml file",
&tag,
)
})?;
let mut dict = TaggedDictBuilder::new(&tag);
if let Some(subj) = eml.subject {
dict.insert_untagged("Subject", UntaggedValue::string(subj));
}
if let Some(from) = eml.from {
dict.insert_untagged("From", headerfieldvalue_to_value(&tag, &from));
}
if let Some(to) = eml.to {
dict.insert_untagged("To", headerfieldvalue_to_value(&tag, &to));
}
for HeaderField { name, value } in eml.headers.iter() {
dict.insert_untagged(name, headerfieldvalue_to_value(&tag, &value));
}
if let Some(body) = eml.body {
dict.insert_untagged("Body", UntaggedValue::string(body));
}
Ok(OutputStream::one(ReturnSuccess::value(dict.into_value())))
}
#[cfg(test)]
mod tests {
use super::FromEML;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromEML {})
}
}

View File

@@ -0,0 +1,259 @@
extern crate ical;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ical::parser::ical::component::*;
use ical::property::Property;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
use std::io::BufReader;
pub struct FromIcs;
#[async_trait]
impl WholeStreamCommand for FromIcs {
fn name(&self) -> &str {
"from ics"
}
fn signature(&self) -> Signature {
Signature::build("from ics")
}
fn usage(&self) -> &str {
"Parse text as .ics and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_ics(args, registry).await
}
}
async fn from_ics(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let input_string = input.collect_string(tag.clone()).await?.item;
let input_bytes = input_string.as_bytes();
let buf_reader = BufReader::new(input_bytes);
let parser = ical::IcalParser::new(buf_reader);
// TODO: it should be possible to make this a stream, but the some of the lifetime requirements make this tricky.
// Pre-computing for now
let mut output = vec![];
for calendar in parser {
match calendar {
Ok(c) => output.push(ReturnSuccess::value(calendar_to_value(c, tag.clone()))),
Err(_) => output.push(Err(ShellError::labeled_error(
"Could not parse as .ics",
"input cannot be parsed as .ics",
tag.clone(),
))),
}
}
Ok(futures::stream::iter(output).to_output_stream())
}
fn calendar_to_value(calendar: IcalCalendar, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(calendar.properties, tag.clone()),
);
row.insert_untagged("events", events_to_value(calendar.events, tag.clone()));
row.insert_untagged("alarms", alarms_to_value(calendar.alarms, tag.clone()));
row.insert_untagged("to-Dos", todos_to_value(calendar.todos, tag.clone()));
row.insert_untagged(
"journals",
journals_to_value(calendar.journals, tag.clone()),
);
row.insert_untagged(
"free-busys",
free_busys_to_value(calendar.free_busys, tag.clone()),
);
row.insert_untagged("timezones", timezones_to_value(calendar.timezones, tag));
row.into_value()
}
fn events_to_value(events: Vec<IcalEvent>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&events
.into_iter()
.map(|event| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(event.properties, tag.clone()),
);
row.insert_untagged("alarms", alarms_to_value(event.alarms, tag.clone()));
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn alarms_to_value(alarms: Vec<IcalAlarm>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&alarms
.into_iter()
.map(|alarm| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(alarm.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn todos_to_value(todos: Vec<IcalTodo>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&todos
.into_iter()
.map(|todo| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(todo.properties, tag.clone()),
);
row.insert_untagged("alarms", alarms_to_value(todo.alarms, tag.clone()));
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn journals_to_value(journals: Vec<IcalJournal>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&journals
.into_iter()
.map(|journal| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(journal.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn free_busys_to_value(free_busys: Vec<IcalFreeBusy>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&free_busys
.into_iter()
.map(|free_busy| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(free_busy.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn timezones_to_value(timezones: Vec<IcalTimeZone>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&timezones
.into_iter()
.map(|timezone| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(timezone.properties, tag.clone()),
);
row.insert_untagged(
"transitions",
timezone_transitions_to_value(timezone.transitions, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn timezone_transitions_to_value(
transitions: Vec<IcalTimeZoneTransition>,
tag: Tag,
) -> UntaggedValue {
UntaggedValue::table(
&transitions
.into_iter()
.map(|transition| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(transition.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn properties_to_value(properties: Vec<Property>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&properties
.into_iter()
.map(|prop| {
let mut row = TaggedDictBuilder::new(tag.clone());
let name = UntaggedValue::string(prop.name);
let value = match prop.value {
Some(val) => UntaggedValue::string(val),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
let params = match prop.params {
Some(param_list) => params_to_value(param_list, tag.clone()).into(),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
row.insert_untagged("name", name);
row.insert_untagged("value", value);
row.insert_untagged("params", params);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn params_to_value(params: Vec<(String, Vec<String>)>, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag);
for (param_name, param_values) in params {
let values: Vec<Value> = param_values.into_iter().map(|val| val.into()).collect();
let values = UntaggedValue::table(&values);
row.insert_untagged(param_name, values);
}
row.into_value()
}
#[cfg(test)]
mod tests {
use super::FromIcs;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromIcs {})
}
}

View File

@ -0,0 +1,105 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, Signature, TaggedDictBuilder, UntaggedValue, Value};
use std::collections::HashMap;
pub struct FromINI;
#[async_trait]
impl WholeStreamCommand for FromINI {
fn name(&self) -> &str {
"from ini"
}
fn signature(&self) -> Signature {
Signature::build("from ini")
}
fn usage(&self) -> &str {
"Parse text as .ini and create table"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_ini(args, registry).await
}
}
fn convert_ini_second_to_nu_value(v: &HashMap<String, String>, tag: impl Into<Tag>) -> Value {
let mut second = TaggedDictBuilder::new(tag);
for (key, value) in v.iter() {
second.insert_untagged(key.clone(), Primitive::String(value.clone()));
}
second.into_value()
}
fn convert_ini_top_to_nu_value(
v: &HashMap<String, HashMap<String, String>>,
tag: impl Into<Tag>,
) -> Value {
let tag = tag.into();
let mut top_level = TaggedDictBuilder::new(tag.clone());
for (key, value) in v.iter() {
top_level.insert_value(
key.clone(),
convert_ini_second_to_nu_value(value, tag.clone()),
);
}
top_level.into_value()
}
pub fn from_ini_string_to_value(
s: String,
tag: impl Into<Tag>,
) -> Result<Value, serde_ini::de::Error> {
let v: HashMap<String, HashMap<String, String>> = serde_ini::from_str(&s)?;
Ok(convert_ini_top_to_nu_value(&v, tag))
}
async fn from_ini(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let concat_string = input.collect_string(tag.clone()).await?;
match from_ini_string_to_value(concat_string.item, tag.clone()) {
Ok(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => Ok(futures::stream::iter(list).to_output_stream()),
x => Ok(OutputStream::one(x)),
},
Err(_) => Err(ShellError::labeled_error_with_secondary(
"Could not parse as INI",
"input cannot be parsed as INI",
&tag,
"value originates from here",
concat_string.tag,
)),
}
}
#[cfg(test)]
mod tests {
use super::FromINI;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromINI {})
}
}
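A minimal sketch (not part of the committed diff), assuming Tag::unknown() from the prelude as the other from_* tests do, of how from_ini_string_to_value could be exercised in the tests module above:

#[test]
fn parses_sections_into_nested_rows() {
    // One section with one key; serde_ini maps this to HashMap<String, HashMap<String, String>>.
    let input = "[package]\nname=nu\n".to_string();
    let parsed = from_ini_string_to_value(input, Tag::unknown());
    // Only asserts that parsing succeeds; the exact row layout comes from TaggedDictBuilder.
    assert!(parsed.is_ok());
}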

View File

@ -0,0 +1,153 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
pub struct FromJSON;
#[derive(Deserialize)]
pub struct FromJSONArgs {
objects: bool,
}
#[async_trait]
impl WholeStreamCommand for FromJSON {
fn name(&self) -> &str {
"from json"
}
fn signature(&self) -> Signature {
Signature::build("from json").switch(
"objects",
"treat each line as a separate value",
Some('o'),
)
}
fn usage(&self) -> &str {
"Parse text as .json and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_json(args, registry).await
}
}
fn convert_json_value_to_nu_value(v: &serde_hjson::Value, tag: impl Into<Tag>) -> Value {
let tag = tag.into();
match v {
serde_hjson::Value::Null => UntaggedValue::Primitive(Primitive::Nothing).into_value(&tag),
serde_hjson::Value::Bool(b) => UntaggedValue::boolean(*b).into_value(&tag),
serde_hjson::Value::F64(n) => UntaggedValue::decimal(*n).into_value(&tag),
serde_hjson::Value::U64(n) => UntaggedValue::int(*n).into_value(&tag),
serde_hjson::Value::I64(n) => UntaggedValue::int(*n).into_value(&tag),
serde_hjson::Value::String(s) => {
UntaggedValue::Primitive(Primitive::String(String::from(s))).into_value(&tag)
}
serde_hjson::Value::Array(a) => UntaggedValue::Table(
a.iter()
.map(|x| convert_json_value_to_nu_value(x, &tag))
.collect(),
)
.into_value(tag),
serde_hjson::Value::Object(o) => {
let mut collected = TaggedDictBuilder::new(&tag);
for (k, v) in o.iter() {
collected.insert_value(k.clone(), convert_json_value_to_nu_value(v, &tag));
}
collected.into_value()
}
}
}
pub fn from_json_string_to_value(s: String, tag: impl Into<Tag>) -> serde_hjson::Result<Value> {
let v: serde_hjson::Value = serde_hjson::from_str(&s)?;
Ok(convert_json_value_to_nu_value(&v, tag))
}
async fn from_json(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_tag = args.call_info.name_tag.clone();
let registry = registry.clone();
let (FromJSONArgs { objects }, input) = args.process(&registry).await?;
let concat_string = input.collect_string(name_tag.clone()).await?;
let string_clone: Vec<_> = concat_string.item.lines().map(|x| x.to_string()).collect();
if objects {
Ok(
futures::stream::iter(string_clone.into_iter().filter_map(move |json_str| {
if json_str.is_empty() {
return None;
}
match from_json_string_to_value(json_str, &name_tag) {
Ok(x) => Some(ReturnSuccess::value(x)),
Err(e) => {
let mut message = "Could not parse as JSON (".to_string();
message.push_str(&e.to_string());
message.push_str(")");
Some(Err(ShellError::labeled_error_with_secondary(
message,
"input cannot be parsed as JSON",
name_tag.clone(),
"value originates from here",
concat_string.tag.clone(),
)))
}
}
}))
.to_output_stream(),
)
} else {
match from_json_string_to_value(concat_string.item, name_tag.clone()) {
Ok(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => Ok(
futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
.to_output_stream(),
),
x => Ok(OutputStream::one(ReturnSuccess::value(x))),
},
Err(e) => {
let mut message = "Could not parse as JSON (".to_string();
message.push_str(&e.to_string());
message.push_str(")");
Ok(OutputStream::one(Err(
ShellError::labeled_error_with_secondary(
message,
"input cannot be parsed as JSON",
name_tag,
"value originates from here",
concat_string.tag,
),
)))
}
}
}
}
#[cfg(test)]
mod tests {
use super::FromJSON;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromJSON {})
}
}
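A minimal sketch (not part of the committed diff), again assuming Tag::unknown() from the prelude, of calling from_json_string_to_value directly; an object becomes a single row, while a top-level array becomes a Table and is streamed row by row in from_json above.

#[test]
fn parses_json_object_into_row() {
    let parsed = from_json_string_to_value(
        r#"{ "name": "nu", "version": 16 }"#.to_string(),
        Tag::unknown(),
    )
    .expect("valid JSON should parse");
    // `parsed` carries "name" and "version" columns; an input like "[1, 2, 3]"
    // would instead yield an UntaggedValue::Table of three int values.
    assert!(matches!(parsed.value, UntaggedValue::Row(_)));
}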

View File

@ -0,0 +1,111 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::TaggedListBuilder;
use calamine::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue};
use std::io::Cursor;
pub struct FromODS;
#[derive(Deserialize)]
pub struct FromODSArgs {
headerless: bool,
}
#[async_trait]
impl WholeStreamCommand for FromODS {
fn name(&self) -> &str {
"from ods"
}
fn signature(&self) -> Signature {
Signature::build("from ods").switch(
"headerless",
"don't treat the first row as column names",
None,
)
}
fn usage(&self) -> &str {
"Parse OpenDocument Spreadsheet(.ods) data and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_ods(args, registry).await
}
}
async fn from_ods(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let registry = registry.clone();
let (
FromODSArgs {
headerless: _headerless,
},
input,
) = args.process(&registry).await?;
let bytes = input.collect_binary(tag.clone()).await?;
let buf: Cursor<Vec<u8>> = Cursor::new(bytes.item);
let mut ods = Ods::<_>::new(buf).map_err(|_| {
ShellError::labeled_error("Could not load ods file", "could not load ods file", &tag)
})?;
let mut dict = TaggedDictBuilder::new(&tag);
let sheet_names = ods.sheet_names().to_owned();
for sheet_name in &sheet_names {
let mut sheet_output = TaggedListBuilder::new(&tag);
if let Some(Ok(current_sheet)) = ods.worksheet_range(sheet_name) {
for row in current_sheet.rows() {
let mut row_output = TaggedDictBuilder::new(&tag);
for (i, cell) in row.iter().enumerate() {
let value = match cell {
DataType::Empty => UntaggedValue::nothing(),
DataType::String(s) => UntaggedValue::string(s),
DataType::Float(f) => UntaggedValue::decimal(*f),
DataType::Int(i) => UntaggedValue::int(*i),
DataType::Bool(b) => UntaggedValue::boolean(*b),
_ => UntaggedValue::nothing(),
};
row_output.insert_untagged(&format!("Column{}", i), value);
}
sheet_output.push_untagged(row_output.into_untagged_value());
}
dict.insert_untagged(sheet_name, sheet_output.into_untagged_value());
} else {
return Err(ShellError::labeled_error(
"Could not load sheet",
"could not load sheet",
&tag,
));
}
}
Ok(OutputStream::one(ReturnSuccess::value(dict.into_value())))
}
#[cfg(test)]
mod tests {
use super::FromODS;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromODS {})
}
}

View File

@ -1,68 +1,71 @@
use crate::commands::WholeStreamCommand;
use crate::data::{Primitive, TaggedDictBuilder, Value};
use crate::errors::ShellError;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, Signature, TaggedDictBuilder, UntaggedValue, Value};
use rusqlite::{types::ValueRef, Connection, Row, NO_PARAMS};
use std::io::Write;
use std::path::Path;
pub struct FromSQLite;
#[async_trait]
impl WholeStreamCommand for FromSQLite {
fn name(&self) -> &str {
"from-sqlite"
"from sqlite"
}
fn signature(&self) -> Signature {
Signature::build("from-sqlite")
Signature::build("from sqlite")
}
fn usage(&self) -> &str {
"Parse binary data as sqlite .db and create table."
}
fn run(
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_sqlite(args, registry)
from_sqlite(args, registry).await
}
}
pub struct FromDB;
#[async_trait]
impl WholeStreamCommand for FromDB {
fn name(&self) -> &str {
"from-db"
"from db"
}
fn signature(&self) -> Signature {
Signature::build("from-db")
Signature::build("from db")
}
fn usage(&self) -> &str {
"Parse binary data as db and create table."
}
fn run(
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_sqlite(args, registry)
from_sqlite(args, registry).await
}
}
pub fn convert_sqlite_file_to_nu_value(
path: &Path,
tag: impl Into<Tag> + Clone,
) -> Result<Tagged<Value>, rusqlite::Error> {
) -> Result<Value, rusqlite::Error> {
let conn = Connection::open(path)?;
let mut meta_out = Vec::new();
let mut meta_stmt = conn.prepare("select name from sqlite_master where type='table'")?;
let mut meta_rows = meta_stmt.query(NO_PARAMS)?;
while let Some(meta_row) = meta_rows.next()? {
let table_name: String = meta_row.get(0)?;
let mut meta_dict = TaggedDictBuilder::new(tag.clone());
@ -72,48 +75,54 @@ pub fn convert_sqlite_file_to_nu_value(
while let Some(table_row) = table_rows.next()? {
out.push(convert_sqlite_row_to_nu_value(table_row, tag.clone())?)
}
meta_dict.insert_tagged(
meta_dict.insert_value(
"table_name".to_string(),
Value::Primitive(Primitive::String(table_name)).tagged(tag.clone()),
UntaggedValue::Primitive(Primitive::String(table_name)).into_value(tag.clone()),
);
meta_dict.insert_tagged("table_values", Value::Table(out).tagged(tag.clone()));
meta_out.push(meta_dict.into_tagged_value());
meta_dict.insert_value(
"table_values",
UntaggedValue::Table(out).into_value(tag.clone()),
);
meta_out.push(meta_dict.into_value());
}
let tag = tag.into();
Ok(Value::Table(meta_out).tagged(tag))
Ok(UntaggedValue::Table(meta_out).into_value(tag))
}
fn convert_sqlite_row_to_nu_value(
row: &Row,
tag: impl Into<Tag> + Clone,
) -> Result<Tagged<Value>, rusqlite::Error> {
) -> Result<Value, rusqlite::Error> {
let mut collected = TaggedDictBuilder::new(tag.clone());
for (i, c) in row.columns().iter().enumerate() {
collected.insert_tagged(
collected.insert_value(
c.name().to_string(),
convert_sqlite_value_to_nu_value(row.get_raw(i), tag.clone()),
);
}
return Ok(collected.into_tagged_value());
Ok(collected.into_value())
}
fn convert_sqlite_value_to_nu_value(value: ValueRef, tag: impl Into<Tag> + Clone) -> Tagged<Value> {
fn convert_sqlite_value_to_nu_value(value: ValueRef, tag: impl Into<Tag> + Clone) -> Value {
match value {
ValueRef::Null => Value::Primitive(Primitive::String(String::from(""))).tagged(tag),
ValueRef::Integer(i) => Value::number(i).tagged(tag),
ValueRef::Real(f) => Value::number(f).tagged(tag),
t @ ValueRef::Text(_) => {
// this unwrap is safe because we know the ValueRef is Text.
Value::Primitive(Primitive::String(t.as_str().unwrap().to_string())).tagged(tag)
ValueRef::Null => {
UntaggedValue::Primitive(Primitive::String(String::from(""))).into_value(tag)
}
ValueRef::Blob(u) => Value::binary(u.to_owned()).tagged(tag),
ValueRef::Integer(i) => UntaggedValue::int(i).into_value(tag),
ValueRef::Real(f) => UntaggedValue::decimal(f).into_value(tag),
ValueRef::Text(s) => {
// this unwrap is safe because we know the ValueRef is Text.
UntaggedValue::Primitive(Primitive::String(String::from_utf8_lossy(s).to_string()))
.into_value(tag)
}
ValueRef::Blob(u) => UntaggedValue::binary(u.to_owned()).into_value(tag),
}
}
pub fn from_sqlite_bytes_to_value(
mut bytes: Vec<u8>,
tag: impl Into<Tag> + Clone,
) -> Result<Tagged<Value>, std::io::Error> {
) -> Result<Value, std::io::Error> {
// FIXME: should probably write a sqlite virtual filesystem
// that will allow us to use bytes as a file to avoid this
// write out, but this will require C code. Might be
@ -126,48 +135,47 @@ pub fn from_sqlite_bytes_to_value(
}
}
fn from_sqlite(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let args = args.evaluate_once(registry)?;
async fn from_sqlite(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let stream = async_stream_block! {
let values: Vec<Tagged<Value>> = input.values.collect().await;
let bytes = input.collect_binary(tag.clone()).await?;
for value in values {
let value_tag = value.tag();
match value.item {
Value::Primitive(Primitive::Binary(vb)) =>
match from_sqlite_bytes_to_value(vb, tag) {
Ok(x) => match x {
Tagged { item: Value::Table(list), .. } => {
for l in list {
yield ReturnSuccess::value(l);
}
}
_ => yield ReturnSuccess::value(x),
}
Err(_) => {
yield Err(ShellError::labeled_error_with_secondary(
"Could not parse as SQLite",
"input cannot be parsed as SQLite",
tag,
"value originates from here",
value_tag,
))
}
}
_ => yield Err(ShellError::labeled_error_with_secondary(
"Expected a string from pipeline",
"requires string input",
tag,
"value originates from here",
value_tag,
)),
match from_sqlite_bytes_to_value(bytes.item, tag.clone()) {
Ok(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => Ok(futures::stream::iter(list).to_output_stream()),
_ => Ok(OutputStream::one(x)),
},
Err(err) => {
println!("{:?}", err);
}
Err(ShellError::labeled_error_with_secondary(
"Could not parse as SQLite",
"input cannot be parsed as SQLite",
&tag,
"value originates from here",
bytes.tag,
))
}
};
Ok(stream.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::FromSQLite;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromSQLite {})
}
}

View File

@ -0,0 +1,513 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
Primitive, ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value,
};
use nu_source::Tagged;
pub struct FromSSV;
#[derive(Deserialize)]
pub struct FromSSVArgs {
headerless: bool,
#[serde(rename(deserialize = "aligned-columns"))]
aligned_columns: bool,
#[serde(rename(deserialize = "minimum-spaces"))]
minimum_spaces: Option<Tagged<usize>>,
}
const STRING_REPRESENTATION: &str = "from ssv";
const DEFAULT_MINIMUM_SPACES: usize = 2;
#[async_trait]
impl WholeStreamCommand for FromSSV {
fn name(&self) -> &str {
STRING_REPRESENTATION
}
fn signature(&self) -> Signature {
Signature::build(STRING_REPRESENTATION)
.switch(
"headerless",
"don't treat the first row as column names",
None,
)
.switch("aligned-columns", "assume columns are aligned", Some('a'))
.named(
"minimum-spaces",
SyntaxShape::Int,
"the minimum spaces to separate columns",
Some('m'),
)
}
fn usage(&self) -> &str {
"Parse text as space-separated values and create a table. The default minimum number of spaces counted as a separator is 2."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_ssv(args, registry).await
}
}
enum HeaderOptions<'a> {
WithHeaders(&'a str),
WithoutHeaders,
}
fn parse_aligned_columns<'a>(
lines: impl Iterator<Item = &'a str>,
headers: HeaderOptions,
separator: &str,
) -> Vec<Vec<(String, String)>> {
fn construct<'a>(
lines: impl Iterator<Item = &'a str>,
headers: Vec<(String, usize)>,
) -> Vec<Vec<(String, String)>> {
lines
.map(|l| {
headers
.iter()
.enumerate()
.map(|(i, (header_name, start_position))| {
let val = match headers.get(i + 1) {
Some((_, end)) => {
if *end < l.len() {
l.get(*start_position..*end)
} else {
l.get(*start_position..)
}
}
None => l.get(*start_position..),
}
.unwrap_or("")
.trim()
.into();
(header_name.clone(), val)
})
.collect()
})
.collect()
}
let find_indices = |line: &str| {
let values = line
.split(&separator)
.map(str::trim)
.filter(|s| !s.is_empty());
values
.fold(
(0, vec![]),
|(current_pos, mut indices), value| match line[current_pos..].find(value) {
None => (current_pos, indices),
Some(index) => {
let absolute_index = current_pos + index;
indices.push(absolute_index);
(absolute_index + value.len(), indices)
}
},
)
.1
};
let parse_with_headers = |lines, headers_raw: &str| {
let indices = find_indices(headers_raw);
let headers = headers_raw
.split(&separator)
.map(str::trim)
.filter(|s| !s.is_empty())
.map(String::from)
.zip(indices);
let columns = headers.collect::<Vec<(String, usize)>>();
construct(lines, columns)
};
let parse_without_headers = |ls: Vec<&str>| {
let mut indices = ls
.iter()
.flat_map(|s| find_indices(*s))
.collect::<Vec<usize>>();
indices.sort();
indices.dedup();
let headers: Vec<(String, usize)> = indices
.iter()
.enumerate()
.map(|(i, position)| (format!("Column{}", i + 1), *position))
.collect();
construct(ls.iter().map(|s| s.to_owned()), headers)
};
match headers {
HeaderOptions::WithHeaders(headers_raw) => parse_with_headers(lines, headers_raw),
HeaderOptions::WithoutHeaders => parse_without_headers(lines.collect()),
}
}
fn parse_separated_columns<'a>(
lines: impl Iterator<Item = &'a str>,
headers: HeaderOptions,
separator: &str,
) -> Vec<Vec<(String, String)>> {
fn collect<'a>(
headers: Vec<String>,
rows: impl Iterator<Item = &'a str>,
separator: &str,
) -> Vec<Vec<(String, String)>> {
rows.map(|r| {
headers
.iter()
.zip(r.split(separator).map(str::trim).filter(|s| !s.is_empty()))
.map(|(a, b)| (a.to_owned(), b.to_owned()))
.collect()
})
.collect()
}
let parse_with_headers = |lines, headers_raw: &str| {
let headers = headers_raw
.split(&separator)
.map(str::trim)
.map(str::to_owned)
.filter(|s| !s.is_empty())
.collect();
collect(headers, lines, separator)
};
let parse_without_headers = |ls: Vec<&str>| {
let num_columns = ls.iter().map(|r| r.len()).max().unwrap_or(0);
let headers = (1..=num_columns)
.map(|i| format!("Column{}", i))
.collect::<Vec<String>>();
collect(headers, ls.into_iter(), separator)
};
match headers {
HeaderOptions::WithHeaders(headers_raw) => parse_with_headers(lines, headers_raw),
HeaderOptions::WithoutHeaders => parse_without_headers(lines.collect()),
}
}
fn string_to_table(
s: &str,
headerless: bool,
aligned_columns: bool,
split_at: usize,
) -> Vec<Vec<(String, String)>> {
let mut lines = s.lines().filter(|l| !l.trim().is_empty());
let separator = " ".repeat(std::cmp::max(split_at, 1));
let (ls, header_options) = if headerless {
(lines, HeaderOptions::WithoutHeaders)
} else {
match lines.next() {
Some(header) => (lines, HeaderOptions::WithHeaders(header)),
None => return vec![],
}
};
let f = if aligned_columns {
parse_aligned_columns
} else {
parse_separated_columns
};
f(ls, header_options, &separator)
}
fn from_ssv_string_to_value(
s: &str,
headerless: bool,
aligned_columns: bool,
split_at: usize,
tag: impl Into<Tag>,
) -> Option<Value> {
let tag = tag.into();
let rows = string_to_table(s, headerless, aligned_columns, split_at)
.iter()
.map(|row| {
let mut tagged_dict = TaggedDictBuilder::new(&tag);
for (col, entry) in row {
tagged_dict.insert_value(
col,
UntaggedValue::Primitive(Primitive::String(String::from(entry)))
.into_value(&tag),
)
}
tagged_dict.into_value()
})
.collect();
Some(UntaggedValue::Table(rows).into_value(&tag))
}
async fn from_ssv(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let registry = registry.clone();
let (
FromSSVArgs {
headerless,
aligned_columns,
minimum_spaces,
},
input,
) = args.process(&registry).await?;
let concat_string = input.collect_string(name.clone()).await?;
let split_at = match minimum_spaces {
Some(number) => number.item,
None => DEFAULT_MINIMUM_SPACES,
};
Ok(
match from_ssv_string_to_value(
&concat_string.item,
headerless,
aligned_columns,
split_at,
name.clone(),
) {
Some(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
.to_output_stream(),
x => OutputStream::one(ReturnSuccess::value(x)),
},
None => {
return Err(ShellError::labeled_error_with_secondary(
"Could not parse as SSV",
"input cannot be parsed ssv",
&name,
"value originates from here",
&concat_string.tag,
));
}
},
)
}
#[cfg(test)]
mod tests {
use super::*;
fn owned(x: &str, y: &str) -> (String, String) {
(String::from(x), String::from(y))
}
#[test]
fn it_trims_empty_and_whitespace_only_lines() {
let input = r#"
a b
1 2
3 4
"#;
let result = string_to_table(input, false, true, 1);
assert_eq!(
result,
vec![
vec![owned("a", "1"), owned("b", "2")],
vec![owned("a", "3"), owned("b", "4")]
]
);
}
#[test]
fn it_deals_with_single_column_input() {
let input = r#"
a
1
2
"#;
let result = string_to_table(input, false, true, 1);
assert_eq!(result, vec![vec![owned("a", "1")], vec![owned("a", "2")]]);
}
#[test]
fn it_uses_first_row_as_data_when_headerless() {
let input = r#"
a b
1 2
3 4
"#;
let result = string_to_table(input, true, true, 1);
assert_eq!(
result,
vec![
vec![owned("Column1", "a"), owned("Column2", "b")],
vec![owned("Column1", "1"), owned("Column2", "2")],
vec![owned("Column1", "3"), owned("Column2", "4")]
]
);
}
#[test]
fn it_allows_a_predefined_number_of_spaces() {
let input = r#"
column a column b
entry 1 entry number 2
3 four
"#;
let result = string_to_table(input, false, true, 3);
assert_eq!(
result,
vec![
vec![
owned("column a", "entry 1"),
owned("column b", "entry number 2")
],
vec![owned("column a", "3"), owned("column b", "four")]
]
);
}
#[test]
fn it_trims_remaining_separator_space() {
let input = r#"
colA colB colC
val1 val2 val3
"#;
let trimmed = |s: &str| s.trim() == s;
let result = string_to_table(input, false, true, 2);
assert!(result
.iter()
.all(|row| row.iter().all(|(a, b)| trimmed(a) && trimmed(b))));
}
#[test]
fn it_keeps_empty_columns() {
let input = r#"
colA col B col C
val2 val3
val4 val 5 val 6
val7 val8
"#;
let result = string_to_table(input, false, true, 2);
assert_eq!(
result,
vec![
vec![
owned("colA", ""),
owned("col B", "val2"),
owned("col C", "val3")
],
vec![
owned("colA", "val4"),
owned("col B", "val 5"),
owned("col C", "val 6")
],
vec![
owned("colA", "val7"),
owned("col B", ""),
owned("col C", "val8")
],
]
);
}
#[test]
fn it_can_produce_an_empty_stream_for_header_only_input() {
let input = "colA col B";
let result = string_to_table(input, false, true, 2);
let expected: Vec<Vec<(String, String)>> = vec![];
assert_eq!(expected, result);
}
#[test]
fn it_uses_the_full_final_column() {
let input = r#"
colA col B
val1 val2 trailing value that should be included
"#;
let result = string_to_table(input, false, true, 2);
assert_eq!(
result,
vec![vec![
owned("colA", "val1"),
owned("col B", "val2 trailing value that should be included"),
]]
);
}
#[test]
fn it_handles_empty_values_when_headerless_and_aligned_columns() {
let input = r#"
a multi-word value b d
1 3-3 4
last
"#;
let result = string_to_table(input, true, true, 2);
assert_eq!(
result,
vec![
vec![
owned("Column1", "a multi-word value"),
owned("Column2", "b"),
owned("Column3", ""),
owned("Column4", "d"),
owned("Column5", "")
],
vec![
owned("Column1", "1"),
owned("Column2", ""),
owned("Column3", "3-3"),
owned("Column4", "4"),
owned("Column5", "")
],
vec![
owned("Column1", ""),
owned("Column2", ""),
owned("Column3", ""),
owned("Column4", ""),
owned("Column5", "last")
],
]
);
}
#[test]
fn input_is_parsed_correctly_if_either_option_works() {
let input = r#"
docker-registry docker-registry=default docker-registry=default 172.30.78.158 5000/TCP
kubernetes component=apiserver,provider=kubernetes <none> 172.30.0.2 443/TCP
kubernetes-ro component=apiserver,provider=kubernetes <none> 172.30.0.1 80/TCP
"#;
let aligned_columns_headerless = string_to_table(input, true, true, 2);
let separator_headerless = string_to_table(input, true, false, 2);
let aligned_columns_with_headers = string_to_table(input, false, true, 2);
let separator_with_headers = string_to_table(input, false, false, 2);
assert_eq!(aligned_columns_headerless, separator_headerless);
assert_eq!(aligned_columns_with_headers, separator_with_headers);
}
#[test]
fn examples_work_as_expected() {
use super::FromSSV;
use crate::examples::test as test_examples;
test_examples(FromSSV {})
}
}

View File

@ -0,0 +1,110 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
pub struct FromTOML;
#[async_trait]
impl WholeStreamCommand for FromTOML {
fn name(&self) -> &str {
"from toml"
}
fn signature(&self) -> Signature {
Signature::build("from toml")
}
fn usage(&self) -> &str {
"Parse text as .toml and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_toml(args, registry).await
}
}
pub fn convert_toml_value_to_nu_value(v: &toml::Value, tag: impl Into<Tag>) -> Value {
let tag = tag.into();
match v {
toml::Value::Boolean(b) => UntaggedValue::boolean(*b).into_value(tag),
toml::Value::Integer(n) => UntaggedValue::int(*n).into_value(tag),
toml::Value::Float(n) => UntaggedValue::decimal(*n).into_value(tag),
toml::Value::String(s) => {
UntaggedValue::Primitive(Primitive::String(String::from(s))).into_value(tag)
}
toml::Value::Array(a) => UntaggedValue::Table(
a.iter()
.map(|x| convert_toml_value_to_nu_value(x, &tag))
.collect(),
)
.into_value(tag),
toml::Value::Datetime(dt) => {
UntaggedValue::Primitive(Primitive::String(dt.to_string())).into_value(tag)
}
toml::Value::Table(t) => {
let mut collected = TaggedDictBuilder::new(&tag);
for (k, v) in t.iter() {
collected.insert_value(k.clone(), convert_toml_value_to_nu_value(v, &tag));
}
collected.into_value()
}
}
}
pub fn from_toml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value, toml::de::Error> {
let v: toml::Value = s.parse::<toml::Value>()?;
Ok(convert_toml_value_to_nu_value(&v, tag))
}
pub async fn from_toml(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let concat_string = input.collect_string(tag.clone()).await?;
Ok(
match from_toml_string_to_value(concat_string.item, tag.clone()) {
Ok(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
.to_output_stream(),
x => OutputStream::one(ReturnSuccess::value(x)),
},
Err(_) => {
return Err(ShellError::labeled_error_with_secondary(
"Could not parse as TOML",
"input cannot be parsed as TOML",
&tag,
"value originates from here",
concat_string.tag,
))
}
},
)
}
#[cfg(test)]
mod tests {
use super::FromTOML;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromTOML {})
}
}
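A minimal sketch (not part of the committed diff), assuming Tag::unknown() from the prelude, of from_toml_string_to_value on a small document; the top-level TOML table becomes a row, and TOML arrays become an UntaggedValue::Table as in convert_toml_value_to_nu_value above.

#[test]
fn parses_toml_table_into_row() {
    let input = "title = \"nu\"\ncount = 16\n".to_string();
    let value = from_toml_string_to_value(input, Tag::unknown())
        .expect("valid TOML should parse");
    // The document's top-level table is built with TaggedDictBuilder, so it comes back as a row.
    assert!(matches!(value.value, UntaggedValue::Row(_)));
}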

View File

@ -0,0 +1,62 @@
use crate::commands::from_delimited_data::from_delimited_data;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::Signature;
pub struct FromTSV;
#[derive(Deserialize)]
pub struct FromTSVArgs {
headerless: bool,
}
#[async_trait]
impl WholeStreamCommand for FromTSV {
fn name(&self) -> &str {
"from tsv"
}
fn signature(&self) -> Signature {
Signature::build("from tsv").switch(
"headerless",
"don't treat the first row as column names",
None,
)
}
fn usage(&self) -> &str {
"Parse text as .tsv and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_tsv(args, registry).await
}
}
async fn from_tsv(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (FromTSVArgs { headerless }, input) = args.process(&registry).await?;
from_delimited_data(headerless, '\t', "TSV", input, name).await
}
#[cfg(test)]
mod tests {
use super::FromTSV;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromTSV {})
}
}

View File

@ -0,0 +1,74 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue};
pub struct FromURL;
#[async_trait]
impl WholeStreamCommand for FromURL {
fn name(&self) -> &str {
"from url"
}
fn signature(&self) -> Signature {
Signature::build("from url")
}
fn usage(&self) -> &str {
"Parse url-encoded string as a table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_url(args, registry).await
}
}
async fn from_url(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let concat_string = input.collect_string(tag.clone()).await?;
let result = serde_urlencoded::from_str::<Vec<(String, String)>>(&concat_string.item);
match result {
Ok(result) => {
let mut row = TaggedDictBuilder::new(tag);
for (k, v) in result {
row.insert_untagged(k, UntaggedValue::string(v));
}
Ok(OutputStream::one(ReturnSuccess::value(row.into_value())))
}
_ => Err(ShellError::labeled_error_with_secondary(
"String not compatible with url-encoding",
"input not url-encoded",
tag,
"value originates from here",
concat_string.tag,
)),
}
}
#[cfg(test)]
mod tests {
use super::FromURL;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromURL {})
}
}
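A minimal sketch (not part of the committed diff) of the serde_urlencoded call that from_url relies on; the resulting (key, value) pairs are then folded into a single row by the TaggedDictBuilder loop above.

#[test]
fn decodes_query_string_into_pairs() {
    let pairs: Vec<(String, String)> =
        serde_urlencoded::from_str("name=nu&version=0.16.0").expect("valid url-encoded input");
    assert_eq!(pairs[0], ("name".to_string(), "nu".to_string()));
    assert_eq!(pairs[1], ("version".to_string(), "0.16.0".to_string()));
}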

View File

@ -0,0 +1,123 @@
extern crate ical;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ical::parser::vcard::component::*;
use ical::property::Property;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
use std::io::BufReader;
pub struct FromVcf;
#[async_trait]
impl WholeStreamCommand for FromVcf {
fn name(&self) -> &str {
"from vcf"
}
fn signature(&self) -> Signature {
Signature::build("from vcf")
}
fn usage(&self) -> &str {
"Parse text as .vcf and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_vcf(args, registry).await
}
}
async fn from_vcf(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let input_string = input.collect_string(tag.clone()).await?.item;
let input_bytes = input_string.as_bytes();
let buf_reader = BufReader::new(input_bytes);
let parser = ical::VcardParser::new(buf_reader);
let mut values_vec_deque = VecDeque::new();
for contact in parser {
match contact {
Ok(c) => {
values_vec_deque.push_back(ReturnSuccess::value(contact_to_value(c, tag.clone())))
}
Err(_) => {
return Err(ShellError::labeled_error(
"Could not parse as .vcf",
"input cannot be parsed as .vcf",
tag.clone(),
))
}
}
}
Ok(futures::stream::iter(values_vec_deque).to_output_stream())
}
fn contact_to_value(contact: VcardContact, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged("properties", properties_to_value(contact.properties, tag));
row.into_value()
}
fn properties_to_value(properties: Vec<Property>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&properties
.into_iter()
.map(|prop| {
let mut row = TaggedDictBuilder::new(tag.clone());
let name = UntaggedValue::string(prop.name);
let value = match prop.value {
Some(val) => UntaggedValue::string(val),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
let params = match prop.params {
Some(param_list) => params_to_value(param_list, tag.clone()).into(),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
row.insert_untagged("name", name);
row.insert_untagged("value", value);
row.insert_untagged("params", params);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn params_to_value(params: Vec<(String, Vec<String>)>, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag);
for (param_name, param_values) in params {
let values: Vec<Value> = param_values.into_iter().map(|val| val.into()).collect();
let values = UntaggedValue::table(&values);
row.insert_untagged(param_name, values);
}
row.into_value()
}
#[cfg(test)]
mod tests {
use super::FromVcf;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromVcf {})
}
}
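A minimal sketch (not part of the committed diff) of params_to_value as defined above; the (name, values) pairs mirror what the ical crate exposes in Property.params, and Tag::unknown() is assumed from the prelude.

#[test]
fn builds_params_row_from_name_value_pairs() {
    let params = vec![("TYPE".to_string(), vec!["HOME".to_string(), "VOICE".to_string()])];
    let row = params_to_value(params, Tag::unknown());
    // Produces a row with a single "TYPE" column holding a table of the two string values.
    assert!(matches!(row.value, UntaggedValue::Row(_)));
}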

View File

@ -0,0 +1,111 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::TaggedListBuilder;
use calamine::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue};
use std::io::Cursor;
pub struct FromXLSX;
#[derive(Deserialize)]
pub struct FromXLSXArgs {
headerless: bool,
}
#[async_trait]
impl WholeStreamCommand for FromXLSX {
fn name(&self) -> &str {
"from xlsx"
}
fn signature(&self) -> Signature {
Signature::build("from xlsx").switch(
"headerless",
"don't treat the first row as column names",
None,
)
}
fn usage(&self) -> &str {
"Parse binary Excel(.xlsx) data and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_xlsx(args, registry).await
}
}
async fn from_xlsx(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let registry = registry.clone();
let (
FromXLSXArgs {
headerless: _headerless,
},
input,
) = args.process(&registry).await?;
let value = input.collect_binary(tag.clone()).await?;
let buf: Cursor<Vec<u8>> = Cursor::new(value.item);
let mut xls = Xlsx::<_>::new(buf).map_err(|_| {
ShellError::labeled_error("Could not load xlsx file", "could not load xlsx file", &tag)
})?;
let mut dict = TaggedDictBuilder::new(&tag);
let sheet_names = xls.sheet_names().to_owned();
for sheet_name in &sheet_names {
let mut sheet_output = TaggedListBuilder::new(&tag);
if let Some(Ok(current_sheet)) = xls.worksheet_range(sheet_name) {
for row in current_sheet.rows() {
let mut row_output = TaggedDictBuilder::new(&tag);
for (i, cell) in row.iter().enumerate() {
let value = match cell {
DataType::Empty => UntaggedValue::nothing(),
DataType::String(s) => UntaggedValue::string(s),
DataType::Float(f) => UntaggedValue::decimal(*f),
DataType::Int(i) => UntaggedValue::int(*i),
DataType::Bool(b) => UntaggedValue::boolean(*b),
_ => UntaggedValue::nothing(),
};
row_output.insert_untagged(&format!("Column{}", i), value);
}
sheet_output.push_untagged(row_output.into_untagged_value());
}
dict.insert_untagged(sheet_name, sheet_output.into_untagged_value());
} else {
return Err(ShellError::labeled_error(
"Could not load sheet",
"could not load sheet",
&tag,
));
}
}
Ok(OutputStream::one(ReturnSuccess::value(dict.into_value())))
}
#[cfg(test)]
mod tests {
use super::FromXLSX;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromXLSX {})
}
}
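The cell conversion here is identical to the one in from ods above. A sketch only (not part of the committed diff) of how that shared match could read as a standalone helper, using the same calamine DataType variants the diff already relies on:

fn cell_to_untagged_value(cell: &DataType) -> UntaggedValue {
    match cell {
        DataType::Empty => UntaggedValue::nothing(),
        DataType::String(s) => UntaggedValue::string(s),
        DataType::Float(f) => UntaggedValue::decimal(*f),
        DataType::Int(i) => UntaggedValue::int(*i),
        DataType::Bool(b) => UntaggedValue::boolean(*b),
        // Dates, durations and error cells fall back to nothing, as in the diff.
        _ => UntaggedValue::nothing(),
    }
}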

View File

@ -0,0 +1,314 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
pub struct FromXML;
#[async_trait]
impl WholeStreamCommand for FromXML {
fn name(&self) -> &str {
"from xml"
}
fn signature(&self) -> Signature {
Signature::build("from xml")
}
fn usage(&self) -> &str {
"Parse text as .xml and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_xml(args, registry).await
}
}
fn from_attributes_to_value(attributes: &[roxmltree::Attribute], tag: impl Into<Tag>) -> Value {
let tag = tag.into();
let mut collected = TaggedDictBuilder::new(tag);
for a in attributes {
collected.insert_untagged(String::from(a.name()), UntaggedValue::string(a.value()));
}
collected.into_value()
}
fn from_node_to_value<'a, 'd>(n: &roxmltree::Node<'a, 'd>, tag: impl Into<Tag>) -> Value {
let tag = tag.into();
if n.is_element() {
let name = n.tag_name().name().trim().to_string();
let mut children_values = vec![];
for c in n.children() {
children_values.push(from_node_to_value(&c, &tag));
}
let children_values: Vec<Value> = children_values
.into_iter()
.filter(|x| match x {
Value {
value: UntaggedValue::Primitive(Primitive::String(f)),
..
} => {
!f.trim().is_empty() // non-whitespace characters?
}
_ => true,
})
.collect();
let mut collected = TaggedDictBuilder::new(&tag);
let attribute_value: Value = from_attributes_to_value(&n.attributes(), &tag);
let mut row = TaggedDictBuilder::new(&tag);
row.insert_untagged(
String::from("children"),
UntaggedValue::Table(children_values),
);
row.insert_untagged(String::from("attributes"), attribute_value);
collected.insert_untagged(name, row.into_value());
collected.into_value()
} else if n.is_comment() {
UntaggedValue::string("<comment>").into_value(tag)
} else if n.is_pi() {
UntaggedValue::string("<processing_instruction>").into_value(tag)
} else if n.is_text() {
match n.text() {
Some(text) => UntaggedValue::string(text).into_value(tag),
None => UntaggedValue::string("<error>").into_value(tag),
}
} else {
UntaggedValue::string("<unknown>").into_value(tag)
}
}
fn from_document_to_value(d: &roxmltree::Document, tag: impl Into<Tag>) -> Value {
from_node_to_value(&d.root_element(), tag)
}
pub fn from_xml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value, roxmltree::Error> {
let parsed = roxmltree::Document::parse(&s)?;
Ok(from_document_to_value(&parsed, tag))
}
async fn from_xml(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let concat_string = input.collect_string(tag.clone()).await?;
Ok(
match from_xml_string_to_value(concat_string.item, tag.clone()) {
Ok(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
.to_output_stream(),
x => OutputStream::one(ReturnSuccess::value(x)),
},
Err(_) => {
return Err(ShellError::labeled_error_with_secondary(
"Could not parse as XML",
"input cannot be parsed as XML",
&tag,
"value originates from here",
&concat_string.tag,
))
}
},
)
}
#[cfg(test)]
mod tests {
use crate::commands::from_xml;
use indexmap::IndexMap;
use nu_protocol::{UntaggedValue, Value};
use nu_source::*;
fn string(input: impl Into<String>) -> Value {
UntaggedValue::string(input.into()).into_untagged_value()
}
fn row(entries: IndexMap<String, Value>) -> Value {
UntaggedValue::row(entries).into_untagged_value()
}
fn table(list: &[Value]) -> Value {
UntaggedValue::table(list).into_untagged_value()
}
fn parse(xml: &str) -> Result<Value, roxmltree::Error> {
from_xml::from_xml_string_to_value(xml.to_string(), Tag::unknown())
}
#[test]
fn parses_empty_element() -> Result<(), roxmltree::Error> {
let source = "<nu></nu>";
assert_eq!(
parse(source)?,
row(indexmap! {
"nu".into() => row(indexmap! {
"children".into() => table(&[]),
"attributes".into() => row(indexmap! {})
})
})
);
Ok(())
}
#[test]
fn parses_element_with_text() -> Result<(), roxmltree::Error> {
let source = "<nu>La era de los tres caballeros</nu>";
assert_eq!(
parse(source)?,
row(indexmap! {
"nu".into() => row(indexmap! {
"children".into() => table(&[string("La era de los tres caballeros")]),
"attributes".into() => row(indexmap! {})
})
})
);
Ok(())
}
#[test]
fn parses_element_with_elements() -> Result<(), roxmltree::Error> {
let source = "\
<nu>
<dev>Andrés</dev>
<dev>Jonathan</dev>
<dev>Yehuda</dev>
</nu>";
assert_eq!(
parse(source)?,
row(indexmap! {
"nu".into() => row(indexmap! {
"children".into() => table(&[
row(indexmap! {
"dev".into() => row(indexmap! {
"children".into() => table(&[string("Andrés")]),
"attributes".into() => row(indexmap! {})
})
}),
row(indexmap! {
"dev".into() => row(indexmap! {
"children".into() => table(&[string("Jonathan")]),
"attributes".into() => row(indexmap! {})
})
}),
row(indexmap! {
"dev".into() => row(indexmap! {
"children".into() => table(&[string("Yehuda")]),
"attributes".into() => row(indexmap! {})
})
})
]),
"attributes".into() => row(indexmap! {})
})
})
);
Ok(())
}
#[test]
fn parses_element_with_attribute() -> Result<(), roxmltree::Error> {
let source = "\
<nu version=\"2.0\">
</nu>";
assert_eq!(
parse(source)?,
row(indexmap! {
"nu".into() => row(indexmap! {
"children".into() => table(&[]),
"attributes".into() => row(indexmap! {
"version".into() => string("2.0")
})
})
})
);
Ok(())
}
#[test]
fn parses_element_with_attribute_and_element() -> Result<(), roxmltree::Error> {
let source = "\
<nu version=\"2.0\">
<version>2.0</version>
</nu>";
assert_eq!(
parse(source)?,
row(indexmap! {
"nu".into() => row(indexmap! {
"children".into() => table(&[
row(indexmap! {
"version".into() => row(indexmap! {
"children".into() => table(&[string("2.0")]),
"attributes".into() => row(indexmap! {})
})
})
]),
"attributes".into() => row(indexmap! {
"version".into() => string("2.0")
})
})
})
);
Ok(())
}
#[test]
fn parses_element_with_multiple_attributes() -> Result<(), roxmltree::Error> {
let source = "\
<nu version=\"2.0\" age=\"25\">
</nu>";
assert_eq!(
parse(source)?,
row(indexmap! {
"nu".into() => row(indexmap! {
"children".into() => table(&[]),
"attributes".into() => row(indexmap! {
"version".into() => string("2.0"),
"age".into() => string("25")
})
})
})
);
Ok(())
}
#[test]
fn examples_work_as_expected() {
use super::FromXML;
use crate::examples::test as test_examples;
test_examples(FromXML {})
}
}

View File

@ -0,0 +1,218 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, Signature, TaggedDictBuilder, UntaggedValue, Value};
pub struct FromYAML;
#[async_trait]
impl WholeStreamCommand for FromYAML {
fn name(&self) -> &str {
"from yaml"
}
fn signature(&self) -> Signature {
Signature::build("from yaml")
}
fn usage(&self) -> &str {
"Parse text as .yaml/.yml and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_yaml(args, registry).await
}
}
pub struct FromYML;
#[async_trait]
impl WholeStreamCommand for FromYML {
fn name(&self) -> &str {
"from yml"
}
fn signature(&self) -> Signature {
Signature::build("from yml")
}
fn usage(&self) -> &str {
"Parse text as .yaml/.yml and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_yaml(args, registry).await
}
}
fn convert_yaml_value_to_nu_value(
v: &serde_yaml::Value,
tag: impl Into<Tag>,
) -> Result<Value, ShellError> {
let tag = tag.into();
let err_not_compatible_number = ShellError::labeled_error(
"Expected a compatible number",
"expected a compatible number",
&tag,
);
Ok(match v {
serde_yaml::Value::Bool(b) => UntaggedValue::boolean(*b).into_value(tag),
serde_yaml::Value::Number(n) if n.is_i64() => {
UntaggedValue::int(n.as_i64().ok_or_else(|| err_not_compatible_number)?).into_value(tag)
}
serde_yaml::Value::Number(n) if n.is_f64() => {
UntaggedValue::decimal(n.as_f64().ok_or_else(|| err_not_compatible_number)?)
.into_value(tag)
}
serde_yaml::Value::String(s) => UntaggedValue::string(s).into_value(tag),
serde_yaml::Value::Sequence(a) => {
let result: Result<Vec<Value>, ShellError> = a
.iter()
.map(|x| convert_yaml_value_to_nu_value(x, &tag))
.collect();
UntaggedValue::Table(result?).into_value(tag)
}
serde_yaml::Value::Mapping(t) => {
let mut collected = TaggedDictBuilder::new(&tag);
for (k, v) in t.iter() {
// A ShellError that we re-use multiple times in the Mapping scenario
let err_unexpected_map = ShellError::labeled_error(
format!("Unexpected YAML:\nKey: {:?}\nValue: {:?}", k, v),
"unexpected",
tag.clone(),
);
match (k, v) {
(serde_yaml::Value::String(k), _) => {
collected.insert_value(k.clone(), convert_yaml_value_to_nu_value(v, &tag)?);
}
// Hard-code fix for cases where "v" is a string without quotations with double curly braces
// e.g. k = value
// value: {{ something }}
// Strangely, serde_yaml returns
// "value" -> Mapping(Mapping { map: {Mapping(Mapping { map: {String("something"): Null} }): Null} })
(serde_yaml::Value::Mapping(m), serde_yaml::Value::Null) => {
return m
.iter()
.take(1)
.collect_vec()
.first()
.and_then(|e| match e {
(serde_yaml::Value::String(s), serde_yaml::Value::Null) => Some(
UntaggedValue::string("{{ ".to_owned() + &s + " }}")
.into_value(tag),
),
_ => None,
})
.ok_or(err_unexpected_map);
}
(_, _) => {
return Err(err_unexpected_map);
}
}
}
collected.into_value()
}
serde_yaml::Value::Null => UntaggedValue::Primitive(Primitive::Nothing).into_value(tag),
x => unimplemented!("Unsupported yaml case: {:?}", x),
})
}
pub fn from_yaml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let tag = tag.into();
let v: serde_yaml::Value = serde_yaml::from_str(&s).map_err(|x| {
ShellError::labeled_error(
format!("Could not load yaml: {}", x),
"could not load yaml from text",
&tag,
)
})?;
Ok(convert_yaml_value_to_nu_value(&v, tag)?)
}
async fn from_yaml(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let concat_string = input.collect_string(tag.clone()).await?;
match from_yaml_string_to_value(concat_string.item, tag.clone()) {
Ok(x) => match x {
Value {
value: UntaggedValue::Table(list),
..
} => Ok(futures::stream::iter(list).to_output_stream()),
x => Ok(OutputStream::one(x)),
},
Err(_) => Err(ShellError::labeled_error_with_secondary(
"Could not parse as YAML",
"input cannot be parsed as YAML",
&tag,
"value originates from here",
&concat_string.tag,
)),
}
}
#[cfg(test)]
mod tests {
use super::*;
use nu_plugin::row;
use nu_plugin::test_helpers::value::string;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromYAML {})
}
#[test]
fn test_problematic_yaml() {
struct TestCase {
description: &'static str,
input: &'static str,
expected: Result<Value, ShellError>,
}
let tt: Vec<TestCase> = vec![
TestCase {
description: "Double Curly Braces With Quotes",
input: r#"value: "{{ something }}""#,
expected: Ok(row!["value".to_owned() => string("{{ something }}")]),
},
TestCase {
description: "Double Curly Braces Without Quotes",
input: r#"value: {{ something }}"#,
expected: Ok(row!["value".to_owned() => string("{{ something }}")]),
},
];
for tc in tt.into_iter() {
let actual = from_yaml_string_to_value(tc.input.to_owned(), Tag::default());
if actual.is_err() {
assert!(
tc.expected.is_err(),
"actual is Err for test:\nTest Description {}\nErr: {:?}",
tc.description,
actual
);
} else {
assert_eq!(actual, tc.expected, "{}", tc.description);
}
}
}
}

View File

@ -0,0 +1,263 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use indexmap::set::IndexSet;
use log::trace;
use nu_errors::ShellError;
use nu_protocol::{
did_you_mean, ColumnPath, PathMember, Primitive, ReturnSuccess, Signature, SyntaxShape,
UnspannedPathMember, UntaggedValue, Value,
};
use nu_source::span_for_spanned_list;
use nu_value_ext::get_data_by_column_path;
pub struct Get;
#[derive(Deserialize)]
pub struct GetArgs {
rest: Vec<ColumnPath>,
}
#[async_trait]
impl WholeStreamCommand for Get {
fn name(&self) -> &str {
"get"
}
fn signature(&self) -> Signature {
Signature::build("get").rest(
SyntaxShape::ColumnPath,
"optionally return additional data by path",
)
}
fn usage(&self) -> &str {
"Open given cells as text."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
get(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Extract the name of files as a list",
example: "ls | get name",
result: None,
},
Example {
description: "Extract the cpu list from the sys information",
example: "sys | get cpu",
result: None,
},
]
}
}
pub fn get_column_path(path: &ColumnPath, obj: &Value) -> Result<Value, ShellError> {
let fields = path.clone();
get_data_by_column_path(
obj,
path,
Box::new(move |(obj_source, column_path_tried, error)| {
let path_members_span = span_for_spanned_list(fields.members().iter().map(|p| p.span));
match &obj_source.value {
UntaggedValue::Table(rows) => match column_path_tried {
PathMember {
unspanned: UnspannedPathMember::String(column),
..
} => {
let primary_label = format!("There isn't a column named '{}'", &column);
let suggestions: IndexSet<_> = rows
.iter()
.filter_map(|r| did_you_mean(&r, &column_path_tried))
.map(|s| s[0].1.to_owned())
.collect();
let mut existing_columns: IndexSet<_> = IndexSet::default();
let mut names: Vec<String> = vec![];
for row in rows {
for field in row.data_descriptors() {
if !existing_columns.contains(&field[..]) {
existing_columns.insert(field.clone());
names.push(field);
}
}
}
if names.is_empty() {
return ShellError::labeled_error_with_secondary(
"Unknown column",
primary_label,
column_path_tried.span,
"Appears to contain rows. Try indexing instead.",
column_path_tried.span.since(path_members_span),
);
} else {
return ShellError::labeled_error_with_secondary(
"Unknown column",
primary_label,
column_path_tried.span,
format!(
"Perhaps you meant '{}'? Columns available: {}",
suggestions
.iter()
.map(|x| x.to_owned())
.collect::<Vec<String>>()
.join(","),
names.join(",")
),
column_path_tried.span.since(path_members_span),
);
};
}
PathMember {
unspanned: UnspannedPathMember::Int(idx),
..
} => {
let total = rows.len();
let secondary_label = if total == 1 {
"The table only has 1 row".to_owned()
} else {
format!("The table only has {} rows (0 to {})", total, total - 1)
};
return ShellError::labeled_error_with_secondary(
"Row not found",
format!("There isn't a row indexed at {}", idx),
column_path_tried.span,
secondary_label,
column_path_tried.span.since(path_members_span),
);
}
},
UntaggedValue::Row(columns) => match column_path_tried {
PathMember {
unspanned: UnspannedPathMember::String(column),
..
} => {
let primary_label = format!("There isn't a column named '{}'", &column);
if let Some(suggestions) = did_you_mean(&obj_source, column_path_tried) {
return ShellError::labeled_error_with_secondary(
"Unknown column",
primary_label,
column_path_tried.span,
format!(
"Perhaps you meant '{}'? Columns available: {}",
suggestions[0].1,
&obj_source.data_descriptors().join(",")
),
column_path_tried.span.since(path_members_span),
);
}
}
PathMember {
unspanned: UnspannedPathMember::Int(idx),
..
} => {
return ShellError::labeled_error_with_secondary(
"No rows available",
format!("A row at '{}' can't be indexed.", &idx),
column_path_tried.span,
format!(
"Appears to contain columns. Columns available: {}",
columns.keys().join(",")
),
column_path_tried.span.since(path_members_span),
)
}
},
_ => {}
}
if let Some(suggestions) = did_you_mean(&obj_source, column_path_tried) {
return ShellError::labeled_error(
"Unknown column",
format!("did you mean '{}'?", suggestions[0].1),
column_path_tried.span.since(path_members_span),
);
}
error
}),
)
}
pub async fn get(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (GetArgs { rest: mut fields }, mut input) = args.process(&registry).await?;
if fields.is_empty() {
let vec = input.drain_vec().await;
let descs = nu_protocol::merge_descriptors(&vec);
Ok(futures::stream::iter(descs.into_iter().map(ReturnSuccess::value)).to_output_stream())
} else {
let member = fields.remove(0);
trace!("get {:?} {:?}", member, fields);
Ok(input
.map(move |item| {
let member = vec![member.clone()];
let column_paths = vec![&member, &fields]
.into_iter()
.flatten()
.collect::<Vec<&ColumnPath>>();
let mut output = vec![];
for path in column_paths {
let res = get_column_path(&path, &item);
match res {
Ok(got) => match got {
Value {
value: UntaggedValue::Table(rows),
..
} => {
for item in rows {
output.push(ReturnSuccess::value(item.clone()));
}
}
Value {
value: UntaggedValue::Primitive(Primitive::Nothing),
..
} => {}
other => output.push(ReturnSuccess::value(other.clone())),
},
Err(reason) => output.push(ReturnSuccess::value(
UntaggedValue::Error(reason).into_untagged_value(),
)),
}
}
futures::stream::iter(output)
})
.flatten()
.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::Get;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Get {})
}
}

View File

@ -0,0 +1,281 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use indexmap::indexmap;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use nu_value_ext::as_string;
pub struct GroupBy;
#[derive(Deserialize)]
pub struct GroupByArgs {
column_name: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for GroupBy {
fn name(&self) -> &str {
"group-by"
}
fn signature(&self) -> Signature {
Signature::build("group-by").optional(
"column_name",
SyntaxShape::String,
"the name of the column to group by",
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the table rows grouped by the column given."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
group_by(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Group items by type",
example: r#"ls | group-by type"#,
result: None,
},
Example {
description: "Group items by their value",
example: "echo [1 3 1 3 2 1 1] | group-by",
result: Some(vec![UntaggedValue::row(indexmap! {
"1".to_string() => UntaggedValue::Table(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(1).into(),
UntaggedValue::int(1).into(),
UntaggedValue::int(1).into(),
]).into(),
"3".to_string() => UntaggedValue::Table(vec![
UntaggedValue::int(3).into(),
UntaggedValue::int(3).into(),
]).into(),
"2".to_string() => UntaggedValue::Table(vec![
UntaggedValue::int(2).into(),
]).into(),
})
.into()]),
},
]
}
}
enum Grouper {
ByColumn(Option<Tagged<String>>),
}
pub async fn group_by(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (GroupByArgs { column_name }, input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
return Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
));
}
let values = UntaggedValue::table(&values).into_value(&name);
match group(&column_name, &values, name) {
Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
Err(reason) => Err(reason),
}
}
pub fn suggestions(tried: Tagged<&str>, for_value: &Value) -> ShellError {
let possibilities = for_value.data_descriptors();
let mut possible_matches: Vec<_> = possibilities
.iter()
.map(|x| (natural::distance::levenshtein_distance(x, &tried), x))
.collect();
possible_matches.sort();
if !possible_matches.is_empty() {
ShellError::labeled_error(
"Unknown column",
format!("did you mean '{}'?", possible_matches[0].1),
tried.tag(),
)
} else {
ShellError::labeled_error(
"Unknown column",
"row does not contain this column",
tried.tag(),
)
}
}
pub fn group(
column_name: &Option<Tagged<String>>,
values: &Value,
tag: impl Into<Tag>,
) -> Result<Value, ShellError> {
let name = tag.into();
let grouper = if let Some(column_name) = column_name {
Grouper::ByColumn(Some(column_name.clone()))
} else {
Grouper::ByColumn(None)
};
match grouper {
Grouper::ByColumn(Some(column_name)) => {
let block = Box::new(move |row: &Value| {
match row.get_data_by_key(column_name.borrow_spanned()) {
Some(group_key) => Ok(as_string(&group_key)?),
None => Err(suggestions(column_name.borrow_tagged(), &row)),
}
});
crate::utils::data::group(&values, &Some(block), &name)
}
Grouper::ByColumn(None) => {
let block = Box::new(move |row: &Value| match as_string(row) {
Ok(group_key) => Ok(group_key),
Err(reason) => Err(reason),
});
crate::utils::data::group(&values, &Some(block), &name)
}
}
}
#[cfg(test)]
mod tests {
use super::group;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{UntaggedValue, Value};
use nu_source::*;
fn string(input: impl Into<String>) -> Value {
UntaggedValue::string(input.into()).into_untagged_value()
}
fn row(entries: IndexMap<String, Value>) -> Value {
UntaggedValue::row(entries).into_untagged_value()
}
fn table(list: &[Value]) -> Value {
UntaggedValue::table(list).into_untagged_value()
}
fn nu_releases_committers() -> Vec<Value> {
vec![
row(
indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")},
),
row(
indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")},
),
row(
indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")},
),
row(
indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")},
),
row(
indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")},
),
row(
indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")},
),
row(
indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")},
),
row(
indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")},
),
row(
indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")},
),
]
}
#[test]
fn groups_table_by_date_column() -> Result<(), ShellError> {
let for_key = Some(String::from("date").tagged_unknown());
let sample = table(&nu_releases_committers());
assert_eq!(
group(&for_key, &sample, Tag::unknown())?,
row(indexmap! {
"August 23-2019".into() => table(&[
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")}),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")}),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")})
]),
"October 10-2019".into() => table(&[
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")}),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")}),
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")})
]),
"Sept 24-2019".into() => table(&[
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")}),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")}),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")})
]),
})
);
Ok(())
}
#[test]
fn groups_table_by_country_column() -> Result<(), ShellError> {
let for_key = Some(String::from("country").tagged_unknown());
let sample = table(&nu_releases_committers());
assert_eq!(
group(&for_key, &sample, Tag::unknown())?,
row(indexmap! {
"EC".into() => table(&[
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")}),
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")}),
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")})
]),
"NZ".into() => table(&[
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")}),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")}),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")})
]),
"US".into() => table(&[
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")}),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")}),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")}),
]),
})
);
Ok(())
}
#[test]
fn examples_work_as_expected() {
use super::GroupBy;
use crate::examples::test as test_examples;
test_examples(GroupBy {})
}
}
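
As a reading aid, here is a minimal, self-contained sketch of the bucketing step that crate::utils::data::group performs for group-by: a key-extraction closure is applied to each row and rows are collected into buckets keyed by the extracted string, mirroring the two Grouper::ByColumn branches above. The Row alias, the group_rows helper, and the sample data are illustrative only, not the crate's actual API.

use std::collections::BTreeMap;

// Hypothetical stand-in for a table row: a (column, value) list.
type Row = Vec<(String, String)>;

// Bucket rows by whatever string the closure extracts from each row,
// propagating an error if a row is missing the requested key.
fn group_rows<F>(rows: &[Row], key_of: F) -> Result<BTreeMap<String, Vec<Row>>, String>
where
    F: Fn(&Row) -> Result<String, String>,
{
    let mut groups: BTreeMap<String, Vec<Row>> = BTreeMap::new();
    for row in rows {
        let key = key_of(row)?;
        groups.entry(key).or_default().push(row.clone());
    }
    Ok(groups)
}

fn main() -> Result<(), String> {
    let rows: Vec<Row> = vec![
        vec![("name".into(), "AR".into()), ("country".into(), "EC".into())],
        vec![("name".into(), "JT".into()), ("country".into(), "NZ".into())],
        vec![("name".into(), "YK".into()), ("country".into(), "US".into())],
        vec![("name".into(), "AR".into()), ("country".into(), "EC".into())],
    ];
    // Same shape as Grouper::ByColumn(Some(..)): look the key up per row.
    let by_country = group_rows(&rows, |row| {
        row.iter()
            .find(|(k, _)| k == "country")
            .map(|(_, v)| v.clone())
            .ok_or_else(|| "row does not contain this column".to_string())
    })?;
    assert_eq!(by_country["EC"].len(), 2);
    println!("{:?}", by_country.keys().collect::<Vec<_>>());
    Ok(())
}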

View File

@ -0,0 +1,187 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct GroupByDate;
#[derive(Deserialize)]
pub struct GroupByDateArgs {
column_name: Option<Tagged<String>>,
format: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for GroupByDate {
fn name(&self) -> &str {
"group-by date"
}
fn signature(&self) -> Signature {
Signature::build("group-by date")
.optional(
"column_name",
SyntaxShape::String,
"the name of the column to group by",
)
.named(
"format",
SyntaxShape::String,
"Specify date and time formatting",
Some('f'),
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the table rows grouped by the column given."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
group_by_date(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Group files by type",
example: "ls | group-by date --format '%d/%m/%Y'",
result: None,
}]
}
}
enum Grouper {
ByDate(Option<Tagged<String>>),
}
enum GroupByColumn {
Name(Option<Tagged<String>>),
}
pub async fn group_by_date(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (
GroupByDateArgs {
column_name,
format,
},
input,
) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
))
} else {
let values = UntaggedValue::table(&values).into_value(&name);
let grouper_column = if let Some(column_name) = column_name {
GroupByColumn::Name(Some(column_name))
} else {
GroupByColumn::Name(None)
};
let grouper_date = if let Some(date_format) = format {
Grouper::ByDate(Some(date_format))
} else {
Grouper::ByDate(None)
};
match (grouper_date, grouper_column) {
(Grouper::ByDate(None), GroupByColumn::Name(None)) => {
let block = Box::new(move |row: &Value| row.format("%Y-%b-%d"));
match crate::utils::data::group(&values, &Some(block), &name) {
Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
Err(err) => Err(err),
}
}
(Grouper::ByDate(None), GroupByColumn::Name(Some(column_name))) => {
let block = Box::new(move |row: &Value| {
let group_key = match row.get_data_by_key(column_name.borrow_spanned()) {
Some(group_key) => Ok(group_key),
None => Err(suggestions(column_name.borrow_tagged(), &row)),
};
group_key?.format("%Y-%b-%d")
});
match crate::utils::data::group(&values, &Some(block), &name) {
Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
Err(err) => Err(err),
}
}
(Grouper::ByDate(Some(fmt)), GroupByColumn::Name(None)) => {
let block = Box::new(move |row: &Value| row.format(&fmt));
match crate::utils::data::group(&values, &Some(block), &name) {
Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
Err(err) => Err(err),
}
}
(Grouper::ByDate(Some(fmt)), GroupByColumn::Name(Some(column_name))) => {
let block = Box::new(move |row: &Value| {
let group_key = match row.get_data_by_key(column_name.borrow_spanned()) {
Some(group_key) => Ok(group_key),
None => Err(suggestions(column_name.borrow_tagged(), &row)),
};
group_key?.format(&fmt)
});
match crate::utils::data::group(&values, &Some(block), &name) {
Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
Err(err) => Err(err),
}
}
}
}
}
pub fn suggestions(tried: Tagged<&str>, for_value: &Value) -> ShellError {
let possibilities = for_value.data_descriptors();
let mut possible_matches: Vec<_> = possibilities
.iter()
.map(|x| (natural::distance::levenshtein_distance(x, &tried), x))
.collect();
possible_matches.sort();
if !possible_matches.is_empty() {
ShellError::labeled_error(
"Unknown column",
format!("did you mean '{}'?", possible_matches[0].1),
tried.tag(),
)
} else {
ShellError::labeled_error(
"Unknown column",
"row does not contain this column",
tried.tag(),
)
}
}
#[cfg(test)]
mod tests {
use super::GroupByDate;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(GroupByDate {})
}
}
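
The suggestions() helper above (duplicated in both group-by files) implements a "did you mean" hint by ranking the known column names by edit distance to the name that was tried and reporting the closest one. A self-contained sketch of that ranking follows; the crate uses natural::distance::levenshtein_distance, so the small levenshtein() below is only a stand-in that lets the sketch compile on its own.

// Classic two-row dynamic-programming edit distance.
fn levenshtein(a: &str, b: &str) -> usize {
    let a: Vec<char> = a.chars().collect();
    let b: Vec<char> = b.chars().collect();
    let mut prev: Vec<usize> = (0..=b.len()).collect();
    let mut curr = vec![0; b.len() + 1];
    for (i, ca) in a.iter().enumerate() {
        curr[0] = i + 1;
        for (j, cb) in b.iter().enumerate() {
            let cost = if ca == cb { 0 } else { 1 };
            curr[j + 1] = (prev[j + 1] + 1).min(curr[j] + 1).min(prev[j] + cost);
        }
        std::mem::swap(&mut prev, &mut curr);
    }
    prev[b.len()]
}

// Rank the candidate columns by distance and suggest the closest, as
// suggestions() does with possible_matches.sort().
fn did_you_mean(tried: &str, columns: &[&str]) -> Option<String> {
    let mut ranked: Vec<(usize, &&str)> = columns
        .iter()
        .map(|c| (levenshtein(c, tried), c))
        .collect();
    ranked.sort();
    ranked.first().map(|(_, c)| format!("did you mean '{}'?", c))
}

fn main() {
    let columns = ["name", "country", "date"];
    // A typo'd column lookup gets the closest real column suggested back.
    assert_eq!(
        did_you_mean("contry", &columns),
        Some("did you mean 'country'?".to_string())
    );
    println!("{:?}", did_you_mean("contry", &columns));
}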

View File

@ -0,0 +1,114 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::stream::StreamExt;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::Dictionary;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue, Value};
pub struct Headers;
#[async_trait]
impl WholeStreamCommand for Headers {
fn name(&self) -> &str {
"headers"
}
fn signature(&self) -> Signature {
Signature::build("headers")
}
fn usage(&self) -> &str {
"Use the first row of the table as column names"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
headers(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Create headers for a raw string",
example: r#"echo "a b c|1 2 3" | split row "|" | split column " " | headers"#,
result: None,
}]
}
}
pub async fn headers(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let input = args.input;
let rows: Vec<Value> = input.collect().await;
if rows.is_empty() {
return Err(ShellError::untagged_runtime_error(
"Couldn't find headers, was the input a properly formatted, non-empty table?",
));
}
//the headers are the first row in the table
let headers: Vec<String> = match &rows[0].value {
UntaggedValue::Row(d) => {
Ok(d.entries
.iter()
.map(|(k, v)| {
match v.as_string() {
Ok(s) => s,
Err(_) => {
//If a cell that should contain a header name is empty, we name the column Column[index]
match d.entries.get_full(k) {
Some((index, _, _)) => format!("Column{}", index),
None => "unknownColumn".to_string(),
}
}
}
})
.collect())
}
_ => Err(ShellError::unexpected_eof(
"Could not get headers, is the table empty?",
rows[0].tag.span,
)),
}?;
Ok(
futures::stream::iter(rows.into_iter().skip(1).map(move |r| {
//Each row is a dictionary with the headers as keys
match &r.value {
UntaggedValue::Row(d) => {
let mut entries = IndexMap::new();
for (i, (_, v)) in d.entries.iter().enumerate() {
entries.insert(headers[i].clone(), v.clone());
}
Ok(ReturnSuccess::Value(
UntaggedValue::Row(Dictionary { entries }).into_value(r.tag.clone()),
))
}
_ => Err(ShellError::unexpected_eof(
"Couldn't iterate through rows, was the input a properly formatted table?",
r.tag.span,
)),
}
}))
.to_output_stream(),
)
}
#[cfg(test)]
mod tests {
use super::Headers;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Headers {})
}
}
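
A minimal sketch of what headers does, stripped of the stream plumbing: the first row becomes the column names, header cells with no usable name fall back to Column{index}, and every following row is re-keyed against those names. The plain Vec<Vec<String>> table and the promote_headers helper are illustrative; the real command operates on UntaggedValue::Row dictionaries.

fn promote_headers(table: &[Vec<String>]) -> Result<Vec<Vec<(String, String)>>, String> {
    let first = table
        .first()
        .ok_or_else(|| "Couldn't find headers, was the input a non-empty table?".to_string())?;
    let headers: Vec<String> = first
        .iter()
        .enumerate()
        .map(|(i, cell)| {
            if cell.is_empty() {
                format!("Column{}", i) // unnamed cell: fall back to a positional name
            } else {
                cell.clone()
            }
        })
        .collect();
    // Re-key every remaining row against the promoted header names.
    Ok(table[1..]
        .iter()
        .map(|row| headers.iter().cloned().zip(row.iter().cloned()).collect())
        .collect())
}

fn main() -> Result<(), String> {
    let table = vec![
        vec!["a".to_string(), "".to_string(), "c".to_string()],
        vec!["1".to_string(), "2".to_string(), "3".to_string()],
    ];
    let rows = promote_headers(&table)?;
    // The empty header cell in position 1 was renamed Column1.
    assert_eq!(rows[0][1], ("Column1".to_string(), "2".to_string()));
    println!("{:?}", rows);
    Ok(())
}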

View File

@ -0,0 +1,324 @@
use crate::commands::WholeStreamCommand;
use crate::data::command_dict;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
NamedType, PositionalType, ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder,
UntaggedValue,
};
use nu_source::{SpannedItem, Tagged};
use nu_value_ext::get_data_by_key;
pub struct Help;
#[derive(Deserialize)]
pub struct HelpArgs {
rest: Vec<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for Help {
fn name(&self) -> &str {
"help"
}
fn signature(&self) -> Signature {
Signature::build("help").rest(SyntaxShape::String, "the name of command to get help on")
}
fn usage(&self) -> &str {
"Display help information about commands."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
help(args, registry).await
}
}
async fn help(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (HelpArgs { rest }, ..) = args.process(&registry).await?;
if !rest.is_empty() {
if rest[0].item == "commands" {
let mut sorted_names = registry.names();
sorted_names.sort();
Ok(
futures::stream::iter(sorted_names.into_iter().filter_map(move |cmd| {
// If it's a subcommand, don't list it during the commands list
if cmd.contains(' ') {
return None;
}
let mut short_desc = TaggedDictBuilder::new(name.clone());
let document_tag = rest[0].tag.clone();
let value = command_dict(
match registry.get_command(&cmd).ok_or_else(|| {
ShellError::labeled_error(
format!("Could not load {}", cmd),
"could not load command",
document_tag,
)
}) {
Ok(ok) => ok,
Err(err) => return Some(Err(err)),
},
name.clone(),
);
short_desc.insert_untagged("name", cmd);
short_desc.insert_untagged(
"description",
match match get_data_by_key(&value, "usage".spanned_unknown()).ok_or_else(
|| {
ShellError::labeled_error(
"Expected a usage key",
"expected a 'usage' key",
&value.tag,
)
},
) {
Ok(ok) => ok,
Err(err) => return Some(Err(err)),
}
.as_string()
{
Ok(ok) => ok,
Err(err) => return Some(Err(err)),
},
);
Some(ReturnSuccess::value(short_desc.into_value()))
}))
.to_output_stream(),
)
} else if rest.len() == 2 {
// Check for a subcommand
let command_name = format!("{} {}", rest[0].item, rest[1].item);
if let Some(command) = registry.get_command(&command_name) {
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(get_help(command.stream_command(), &registry))
.into_value(Tag::unknown()),
)))
} else {
Ok(OutputStream::empty())
}
} else if let Some(command) = registry.get_command(&rest[0].item) {
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(get_help(command.stream_command(), &registry))
.into_value(Tag::unknown()),
)))
} else {
Err(ShellError::labeled_error(
"Can't find command (use 'help commands' for full list)",
"can't find command",
rest[0].tag.span,
))
}
} else {
let msg = r#"Welcome to Nushell.
Here are some tips to help you get started.
* help commands - list all available commands
* help <command name> - display help about a particular command
Nushell works on the idea of a "pipeline". Pipelines are commands connected with the '|' character.
Each stage in the pipeline works together to load, parse, and display information to you.
[Examples]
List the files in the current directory, sorted by size:
ls | sort-by size
Get information about the current system:
sys | get host
Get the processes on your system actively using CPU:
ps | where cpu > 0
You can also learn more at https://www.nushell.sh/book/"#;
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(msg).into_value(Tag::unknown()),
)))
}
}
#[allow(clippy::cognitive_complexity)]
pub fn get_help(cmd: &dyn WholeStreamCommand, registry: &CommandRegistry) -> String {
let cmd_name = cmd.name();
let signature = cmd.signature();
let mut long_desc = String::new();
long_desc.push_str(&cmd.usage());
long_desc.push_str("\n");
let mut subcommands = String::new();
for name in registry.names() {
if name.starts_with(&format!("{} ", cmd_name)) {
let subcommand = registry.get_command(&name).expect("This shouldn't happen");
subcommands.push_str(&format!(" {} - {}\n", name, subcommand.usage()));
}
}
let mut one_liner = String::new();
one_liner.push_str(&signature.name);
one_liner.push_str(" ");
for positional in &signature.positional {
match &positional.0 {
PositionalType::Mandatory(name, _m) => {
one_liner.push_str(&format!("<{}> ", name));
}
PositionalType::Optional(name, _o) => {
one_liner.push_str(&format!("({}) ", name));
}
}
}
if signature.rest_positional.is_some() {
one_liner.push_str(" ...args");
}
if !subcommands.is_empty() {
one_liner.push_str("<subcommand> ");
}
if !signature.named.is_empty() {
one_liner.push_str("{flags} ");
}
long_desc.push_str(&format!("\nUsage:\n > {}\n", one_liner));
if !subcommands.is_empty() {
long_desc.push_str("\nSubcommands:\n");
long_desc.push_str(&subcommands);
}
if !signature.positional.is_empty() || signature.rest_positional.is_some() {
long_desc.push_str("\nParameters:\n");
for positional in &signature.positional {
match &positional.0 {
PositionalType::Mandatory(name, _m) => {
long_desc.push_str(&format!(" <{}> {}\n", name, positional.1));
}
PositionalType::Optional(name, _o) => {
long_desc.push_str(&format!(" ({}) {}\n", name, positional.1));
}
}
}
if let Some(rest_positional) = &signature.rest_positional {
long_desc.push_str(&format!(" ...args: {}\n", rest_positional.1));
}
}
if !signature.named.is_empty() {
long_desc.push_str(&get_flags_section(&signature))
}
let palette = crate::shell::palette::DefaultPalette {};
let examples = cmd.examples();
if !examples.is_empty() {
long_desc.push_str("\nExamples:");
}
for example in examples {
long_desc.push_str("\n");
long_desc.push_str(" ");
long_desc.push_str(example.description);
let colored_example =
crate::shell::helper::Painter::paint_string(example.example, registry, &palette);
long_desc.push_str(&format!("\n > {}\n", colored_example));
}
long_desc.push_str("\n");
long_desc
}
fn get_flags_section(signature: &Signature) -> String {
let mut long_desc = String::new();
long_desc.push_str("\nFlags:\n");
for (flag, ty) in &signature.named {
let msg = match ty.0 {
NamedType::Switch(s) => {
if let Some(c) = s {
format!(
" -{}, --{}{} {}\n",
c,
flag,
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
} else {
format!(
" --{}{} {}\n",
flag,
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
}
}
NamedType::Mandatory(s, m) => {
if let Some(c) = s {
format!(
" -{}, --{} <{}> (required parameter){} {}\n",
c,
flag,
m.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
} else {
format!(
" --{} <{}> (required parameter){} {}\n",
flag,
m.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
}
}
NamedType::Optional(s, o) => {
if let Some(c) = s {
format!(
" -{}, --{} <{}>{} {}\n",
c,
flag,
o.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
} else {
format!(
" --{} <{}>{} {}\n",
flag,
o.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
}
}
};
long_desc.push_str(&msg);
}
long_desc
}
#[cfg(test)]
mod tests {
use super::Help;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Help {})
}
}
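
A compact sketch of how get_help() assembles the one-line usage string: mandatory positionals render as <name>, optional ones as (name), a rest argument as ...args, and named flags as a trailing {flags}. The Positional enum and usage_line helper are stand-ins for nu_protocol's PositionalType and the string building inside get_help, and subcommand handling is left out.

enum Positional {
    Mandatory(&'static str),
    Optional(&'static str),
}

fn usage_line(name: &str, positional: &[Positional], has_rest: bool, has_flags: bool) -> String {
    let mut line = String::new();
    line.push_str(name);
    line.push(' ');
    for p in positional {
        match p {
            Positional::Mandatory(n) => line.push_str(&format!("<{}> ", n)),
            Positional::Optional(n) => line.push_str(&format!("({}) ", n)),
        }
    }
    if has_rest {
        line.push_str("...args ");
    }
    if has_flags {
        line.push_str("{flags} ");
    }
    format!("Usage:\n > {}", line.trim_end())
}

fn main() {
    // histogram has one mandatory positional plus a rest argument, no flags.
    let line = usage_line(
        "histogram",
        &[Positional::Mandatory("column_name")],
        true,
        false,
    );
    assert_eq!(line, "Usage:\n > histogram <column_name> ...args");
    println!("{}", line);
}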

View File

@ -0,0 +1,266 @@
use crate::commands::group_by::group;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::utils::data_processing::{columns_sorted, evaluate, map_max, reduce, t_sort};
use nu_errors::ShellError;
use nu_protocol::{
Primitive, ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value,
};
use nu_source::Tagged;
use num_traits::{ToPrimitive, Zero};
pub struct Histogram;
#[derive(Deserialize)]
pub struct HistogramArgs {
column_name: Tagged<String>,
rest: Vec<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for Histogram {
fn name(&self) -> &str {
"histogram"
}
fn signature(&self) -> Signature {
Signature::build("histogram")
.required(
"column_name",
SyntaxShape::String,
"the name of the column to graph by",
)
.rest(
SyntaxShape::String,
"column name to give the histogram's frequency column",
)
}
fn usage(&self) -> &str {
"Creates a new table with a histogram based on the column name passed in."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
histogram(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Get a histogram for the types of files",
example: "ls | histogram type",
result: None,
},
Example {
description:
"Get a histogram for the types of files, with frequency column named count",
example: "ls | histogram type count",
result: None,
},
Example {
description: "Get a histogram for a list of numbers",
example: "echo [1 2 3 1 1 1 2 2 1 1] | histogram",
result: None,
},
]
}
}
pub async fn histogram(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (HistogramArgs { column_name, rest }, input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
let values = UntaggedValue::table(&values).into_value(&name);
let groups = group(&Some(column_name.clone()), &values, &name)?;
let group_labels = columns_sorted(Some(column_name.clone()), &groups, &name);
let sorted = t_sort(Some(column_name.clone()), None, &groups, &name)?;
let evaled = evaluate(&sorted, None, &name)?;
let reduced = reduce(&evaled, None, &name)?;
let maxima = map_max(&reduced, None, &name)?;
let percents = percentages(&reduced, maxima, &name)?;
match percents {
Value {
value: UntaggedValue::Table(datasets),
..
} => {
let mut idx = 0;
let column_names_supplied: Vec<_> = rest.iter().map(|f| f.item.clone()).collect();
let frequency_column_name = if column_names_supplied.is_empty() {
"frequency".to_string()
} else {
column_names_supplied[0].clone()
};
let column = (*column_name).clone();
let count_column_name = "count".to_string();
let count_shell_error = ShellError::labeled_error(
"Unable to load group count",
"unabled to load group count",
&name,
);
let mut count_values: Vec<u64> = Vec::new();
for table_entry in reduced.table_entries() {
match table_entry {
Value {
value: UntaggedValue::Table(list),
..
} => {
for i in list {
if let Ok(count) = i.value.clone().into_value(&name).as_u64() {
count_values.push(count);
} else {
return Err(count_shell_error);
}
}
}
_ => {
return Err(count_shell_error);
}
}
}
if let Value {
value: UntaggedValue::Table(start),
..
} = datasets.get(0).ok_or_else(|| {
ShellError::labeled_error(
"Unable to load dataset",
"unabled to load dataset",
&name,
)
})? {
let start = start.clone();
Ok(
futures::stream::iter(start.into_iter().map(move |percentage| {
let mut fact = TaggedDictBuilder::new(&name);
let value: Tagged<String> = group_labels
.get(idx)
.ok_or_else(|| {
ShellError::labeled_error(
"Unable to load group labels",
"unabled to load group labels",
&name,
)
})?
.clone();
fact.insert_value(
&column,
UntaggedValue::string(value.item).into_value(value.tag),
);
fact.insert_untagged(
&count_column_name,
UntaggedValue::int(count_values[idx]),
);
if let Value {
value: UntaggedValue::Primitive(Primitive::Int(ref num)),
ref tag,
} = percentage
{
let string = std::iter::repeat("*")
.take(num.to_i32().ok_or_else(|| {
ShellError::labeled_error(
"Expected a number",
"expected a number",
tag,
)
})? as usize)
.collect::<String>();
fact.insert_untagged(
&frequency_column_name,
UntaggedValue::string(string),
);
}
idx += 1;
ReturnSuccess::value(fact.into_value())
}))
.to_output_stream(),
)
} else {
Ok(OutputStream::empty())
}
}
_ => Ok(OutputStream::empty()),
}
}
fn percentages(values: &Value, max: Value, tag: impl Into<Tag>) -> Result<Value, ShellError> {
let tag = tag.into();
let results: Value = match values {
Value {
value: UntaggedValue::Table(datasets),
..
} => {
let datasets: Vec<_> = datasets
.iter()
.map(|subsets| match subsets {
Value {
value: UntaggedValue::Table(data),
..
} => {
let data = data
.iter()
.map(|d| match d {
Value {
value: UntaggedValue::Primitive(Primitive::Int(n)),
..
} => {
let max = match &max {
Value {
value: UntaggedValue::Primitive(Primitive::Int(maxima)),
..
} => maxima.clone(),
_ => Zero::zero(),
};
let n = (n * 100) / max;
UntaggedValue::int(n).into_value(&tag)
}
_ => UntaggedValue::int(0).into_value(&tag),
})
.collect::<Vec<_>>();
UntaggedValue::Table(data).into_value(&tag)
}
_ => UntaggedValue::Table(vec![]).into_value(&tag),
})
.collect();
UntaggedValue::Table(datasets).into_value(&tag)
}
other => other.clone(),
};
Ok(results)
}
#[cfg(test)]
mod tests {
use super::Histogram;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Histogram {})
}
}
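
The two numeric steps behind histogram, shown without the Value wrapping: each group count is scaled to a percentage of the largest count ((n * 100) / max, as in percentages() above), and that percentage is rendered as a row of '*' characters. The counts below are made up; the real command derives them via group/t_sort/reduce.

fn percentages(counts: &[u64]) -> Vec<u64> {
    // guard against an empty input or an all-zero maximum before dividing
    let max = counts.iter().copied().max().unwrap_or(1).max(1);
    counts.iter().map(|n| (n * 100) / max).collect()
}

fn bars(percentages: &[u64]) -> Vec<String> {
    // one '*' per percentage point, like the std::iter::repeat("*") above
    percentages
        .iter()
        .map(|p| std::iter::repeat('*').take(*p as usize).collect())
        .collect()
}

fn main() {
    // e.g. counts per file type from `ls | histogram type`
    let counts = [6u64, 3, 1];
    let pct = percentages(&counts);
    assert_eq!(pct, vec![100, 50, 16]);
    for (count, bar) in counts.iter().zip(bars(&pct)) {
        println!("{:>3} {}", count, bar);
    }
}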

View File

@ -0,0 +1,67 @@
use crate::cli::History as HistoryFile;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
use std::fs::File;
use std::io::{BufRead, BufReader};
pub struct History;
#[async_trait]
impl WholeStreamCommand for History {
fn name(&self) -> &str {
"history"
}
fn signature(&self) -> Signature {
Signature::build("history")
}
fn usage(&self) -> &str {
"Display command history."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
history(args, registry)
}
}
fn history(args: CommandArgs, _registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag;
let history_path = HistoryFile::path();
let file = File::open(history_path);
if let Ok(file) = file {
let reader = BufReader::new(file);
let output = reader.lines().filter_map(move |line| match line {
Ok(line) => Some(ReturnSuccess::value(
UntaggedValue::string(line).into_value(tag.clone()),
)),
Err(_) => None,
});
Ok(futures::stream::iter(output).to_output_stream())
} else {
Err(ShellError::labeled_error(
"Could not open history",
"history file could not be opened",
tag,
))
}
}
#[cfg(test)]
mod tests {
use super::History;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(History {})
}
}

View File

@ -0,0 +1,83 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_value_ext::ValueExt;
pub struct Insert;
#[derive(Deserialize)]
pub struct InsertArgs {
column: ColumnPath,
value: Value,
}
#[async_trait]
impl WholeStreamCommand for Insert {
fn name(&self) -> &str {
"insert"
}
fn signature(&self) -> Signature {
Signature::build("insert")
.required(
"column",
SyntaxShape::ColumnPath,
"the column name to insert",
)
.required(
"value",
SyntaxShape::String,
"the value to give the cell(s)",
)
}
fn usage(&self) -> &str {
"Insert a new column with a given value."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
insert(args, registry).await
}
}
async fn insert(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (InsertArgs { column, value }, input) = args.process(&registry).await?;
Ok(input
.map(move |row| match row {
Value {
value: UntaggedValue::Row(_),
..
} => match row.insert_data_at_column_path(&column, value.clone()) {
Ok(v) => Ok(ReturnSuccess::Value(v)),
Err(err) => Err(err),
},
Value { tag, .. } => Err(ShellError::labeled_error(
"Unrecognized type in stream",
"original value",
tag,
)),
})
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Insert;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Insert {})
}
}

View File

@ -0,0 +1,211 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use nu_value_ext::ValueExt;
enum IsEmptyFor {
Value,
RowWithFieldsAndFallback(Vec<Tagged<ColumnPath>>, Value),
RowWithField(Tagged<ColumnPath>),
RowWithFieldAndFallback(Box<Tagged<ColumnPath>>, Value),
}
pub struct IsEmpty;
#[derive(Deserialize)]
pub struct IsEmptyArgs {
rest: Vec<Value>,
}
#[async_trait]
impl WholeStreamCommand for IsEmpty {
fn name(&self) -> &str {
"empty?"
}
fn signature(&self) -> Signature {
Signature::build("empty?").rest(
SyntaxShape::Any,
"the names of the columns to check emptiness followed by the replacement value.",
)
}
fn usage(&self) -> &str {
"Checks emptiness. The last value is the replacement value for any empty column(s) given to check against the table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
is_empty(args, registry).await
}
}
async fn is_empty(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (IsEmptyArgs { rest }, input) = args.process(&registry).await?;
Ok(input
.map(move |value| {
let value_tag = value.tag();
let action = if rest.len() <= 2 {
let field = rest.get(0);
let replacement_if_true = rest.get(1);
match (field, replacement_if_true) {
(Some(field), Some(replacement_if_true)) => {
IsEmptyFor::RowWithFieldAndFallback(
Box::new(field.as_column_path()?),
replacement_if_true.clone(),
)
}
(Some(field), None) => IsEmptyFor::RowWithField(field.as_column_path()?),
(_, _) => IsEmptyFor::Value,
}
} else {
// let no_args = vec![];
let mut arguments = rest.iter().rev();
let replacement_if_true = match arguments.next() {
Some(arg) => arg.clone(),
None => UntaggedValue::boolean(value.is_empty()).into_value(&value_tag),
};
IsEmptyFor::RowWithFieldsAndFallback(
arguments
.map(|a| a.as_column_path())
.filter_map(Result::ok)
.collect(),
replacement_if_true,
)
};
match action {
IsEmptyFor::Value => Ok(ReturnSuccess::Value(
UntaggedValue::boolean(value.is_empty()).into_value(value_tag),
)),
IsEmptyFor::RowWithFieldsAndFallback(fields, default) => {
let mut out = value;
for field in fields.iter() {
let val = crate::commands::get::get_column_path(&field, &out)?;
let emptiness_value = match out {
obj
@
Value {
value: UntaggedValue::Row(_),
..
} => {
if val.is_empty() {
match obj.replace_data_at_column_path(&field, default.clone()) {
Some(v) => Ok(v),
None => Err(ShellError::labeled_error(
"empty? could not find place to check emptiness",
"column name",
&field.tag,
)),
}
} else {
Ok(obj)
}
}
_ => Err(ShellError::labeled_error(
"Unrecognized type in stream",
"original value",
&value_tag,
)),
};
out = emptiness_value?;
}
Ok(ReturnSuccess::Value(out))
}
IsEmptyFor::RowWithField(field) => {
let val = crate::commands::get::get_column_path(&field, &value)?;
match &value {
obj
@
Value {
value: UntaggedValue::Row(_),
..
} => {
if val.is_empty() {
match obj.replace_data_at_column_path(
&field,
UntaggedValue::boolean(true).into_value(&value_tag),
) {
Some(v) => Ok(ReturnSuccess::Value(v)),
None => Err(ShellError::labeled_error(
"empty? could not find place to check emptiness",
"column name",
&field.tag,
)),
}
} else {
Ok(ReturnSuccess::Value(value))
}
}
_ => Err(ShellError::labeled_error(
"Unrecognized type in stream",
"original value",
&value_tag,
)),
}
}
IsEmptyFor::RowWithFieldAndFallback(field, default) => {
let val = crate::commands::get::get_column_path(&field, &value)?;
match &value {
obj
@
Value {
value: UntaggedValue::Row(_),
..
} => {
if val.is_empty() {
match obj.replace_data_at_column_path(&field, default) {
Some(v) => Ok(ReturnSuccess::Value(v)),
None => Err(ShellError::labeled_error(
"empty? could not find place to check emptiness",
"column name",
&field.tag,
)),
}
} else {
Ok(ReturnSuccess::Value(value))
}
}
_ => Err(ShellError::labeled_error(
"Unrecognized type in stream",
"original value",
&value_tag,
)),
}
}
}
})
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::IsEmpty;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(IsEmpty {})
}
}

View File

@ -0,0 +1,84 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct Keep;
#[derive(Deserialize)]
pub struct KeepArgs {
rows: Option<Tagged<usize>>,
}
#[async_trait]
impl WholeStreamCommand for Keep {
fn name(&self) -> &str {
"keep"
}
fn signature(&self) -> Signature {
Signature::build("keep").optional(
"rows",
SyntaxShape::Int,
"starting from the front, the number of rows to keep",
)
}
fn usage(&self) -> &str {
"Keep the number of rows only"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
keep(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Keep the first row",
example: "echo [1 2 3] | keep",
result: Some(vec![UntaggedValue::int(1).into()]),
},
Example {
description: "Keep the first four rows",
example: "echo [1 2 3 4 5] | keep 4",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
UntaggedValue::int(3).into(),
UntaggedValue::int(4).into(),
]),
},
]
}
}
async fn keep(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (KeepArgs { rows }, input) = args.process(&registry).await?;
let rows_desired = if let Some(quantity) = rows {
*quantity
} else {
1
};
Ok(input.take(rows_desired).to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Keep;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Keep {})
}
}

View File

@ -0,0 +1,123 @@
use crate::commands::WholeStreamCommand;
use crate::evaluate::evaluate_baseline_expr;
use crate::prelude::*;
use log::trace;
use nu_errors::ShellError;
use nu_protocol::{hir::ClassifiedCommand, Signature, SyntaxShape, UntaggedValue, Value};
pub struct KeepUntil;
#[async_trait]
impl WholeStreamCommand for KeepUntil {
fn name(&self) -> &str {
"keep-until"
}
fn signature(&self) -> Signature {
Signature::build("keep-until")
.required(
"condition",
SyntaxShape::Math,
"the condition that must be met to stop keeping rows",
)
.filter()
}
fn usage(&self) -> &str {
"Keeps rows until the condition matches."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = Arc::new(registry.clone());
let scope = Arc::new(args.call_info.scope.clone());
let call_info = args.evaluate_once(&registry).await?;
let block = call_info.args.expect_nth(0)?.clone();
let condition = Arc::new(match block {
Value {
value: UntaggedValue::Block(block),
tag,
} => {
if block.block.len() != 1 {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
match block.block[0].list.get(0) {
Some(item) => match item {
ClassifiedCommand::Expr(expr) => expr.clone(),
_ => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
},
None => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
}
}
Value { tag, .. } => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
});
Ok(call_info
.input
.take_while(move |item| {
let condition = condition.clone();
let registry = registry.clone();
let scope = scope.clone();
let item = item.clone();
trace!("ITEM = {:?}", item);
async move {
let result = evaluate_baseline_expr(
&*condition,
&registry,
&item,
&scope.vars,
&scope.env,
)
.await;
trace!("RESULT = {:?}", result);
match result {
Ok(ref v) if v.is_true() => false,
_ => true,
}
}
})
.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::KeepUntil;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(KeepUntil {})
}
}

View File

@ -0,0 +1,123 @@
use crate::commands::WholeStreamCommand;
use crate::evaluate::evaluate_baseline_expr;
use crate::prelude::*;
use log::trace;
use nu_errors::ShellError;
use nu_protocol::{hir::ClassifiedCommand, Signature, SyntaxShape, UntaggedValue, Value};
pub struct KeepWhile;
#[async_trait]
impl WholeStreamCommand for KeepWhile {
fn name(&self) -> &str {
"keep-while"
}
fn signature(&self) -> Signature {
Signature::build("keep-while")
.required(
"condition",
SyntaxShape::Math,
"the condition that must be met to keep rows",
)
.filter()
}
fn usage(&self) -> &str {
"Keeps rows while the condition matches."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = Arc::new(registry.clone());
let scope = Arc::new(args.call_info.scope.clone());
let call_info = args.evaluate_once(&registry).await?;
let block = call_info.args.expect_nth(0)?.clone();
let condition = Arc::new(match block {
Value {
value: UntaggedValue::Block(block),
tag,
} => {
if block.block.len() != 1 {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
match block.block[0].list.get(0) {
Some(item) => match item {
ClassifiedCommand::Expr(expr) => expr.clone(),
_ => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
},
None => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
}
}
Value { tag, .. } => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
});
Ok(call_info
.input
.take_while(move |item| {
let condition = condition.clone();
let registry = registry.clone();
let scope = scope.clone();
let item = item.clone();
trace!("ITEM = {:?}", item);
async move {
let result = evaluate_baseline_expr(
&*condition,
&registry,
&item,
&scope.vars,
&scope.env,
)
.await;
trace!("RESULT = {:?}", result);
match result {
Ok(ref v) if v.is_true() => true,
_ => false,
}
}
})
.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::KeepWhile;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(KeepWhile {})
}
}
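
keep-while and keep-until above are identical except for how the evaluated condition feeds take_while: keep-while keeps rows while the condition holds, keep-until keeps rows until it first becomes true (the Ok(ref v) if v.is_true() arms return true and false respectively). A plain-iterator sketch of that difference, using a simple integer predicate in place of the evaluated block:

fn keep_while(rows: &[i32], cond: impl Fn(&i32) -> bool) -> Vec<i32> {
    // keep rows as long as the condition stays true
    rows.iter().copied().take_while(|r| cond(r)).collect()
}

fn keep_until(rows: &[i32], cond: impl Fn(&i32) -> bool) -> Vec<i32> {
    // same shape, predicate negated: stop at the first row where cond is true
    rows.iter().copied().take_while(|r| !cond(r)).collect()
}

fn main() {
    let rows = [1, 2, 3, 10, 4];
    assert_eq!(keep_while(&rows, |r| *r < 5), vec![1, 2, 3]);
    assert_eq!(keep_until(&rows, |r| *r >= 5), vec![1, 2, 3]);
    println!("both keep {:?}", keep_while(&rows, |r| *r < 5));
}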

View File

@ -0,0 +1,131 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape};
use nu_source::Tagged;
use std::process::{Command, Stdio};
pub struct Kill;
#[derive(Deserialize)]
pub struct KillArgs {
pub pid: Tagged<u64>,
pub rest: Vec<Tagged<u64>>,
pub force: Tagged<bool>,
pub quiet: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Kill {
fn name(&self) -> &str {
"kill"
}
fn signature(&self) -> Signature {
Signature::build("kill")
.required(
"pid",
SyntaxShape::Int,
"process id of process that is to be killed",
)
.rest(SyntaxShape::Int, "rest of processes to kill")
.switch("force", "forcefully kill the process", Some('f'))
.switch("quiet", "won't print anything to the console", Some('q'))
}
fn usage(&self) -> &str {
"Kill a process using the process id."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
kill(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Kill the pid using the most memory",
example: "ps | sort-by mem | last | kill $it.pid",
result: None,
},
Example {
description: "Force kill a given pid",
example: "kill --force 12345",
result: None,
},
]
}
}
async fn kill(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (
KillArgs {
pid,
rest,
force,
quiet,
},
..,
) = args.process(&registry).await?;
let mut cmd = if cfg!(windows) {
let mut cmd = Command::new("taskkill");
if *force {
cmd.arg("/F");
}
cmd.arg("/PID");
cmd.arg(pid.item().to_string());
// each pid must be written as `/PID 0`, otherwise
// taskkill will act like the unix `killall` command
for id in &rest {
cmd.arg("/PID");
cmd.arg(id.item().to_string());
}
cmd
} else {
let mut cmd = Command::new("kill");
if *force {
cmd.arg("-9");
}
cmd.arg(pid.item().to_string());
cmd.args(rest.iter().map(move |id| id.item().to_string()));
cmd
};
// pipe everything to null
if *quiet {
cmd.stdin(Stdio::null())
.stdout(Stdio::null())
.stderr(Stdio::null());
}
cmd.status().expect("failed to execute shell command");
Ok(OutputStream::empty())
}
#[cfg(test)]
mod tests {
use super::Kill;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Kill {})
}
}
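
A small sketch of how kill assembles the platform command line: on Windows, taskkill needs a /PID flag before every id (otherwise it acts like a blanket killall), while on unix, kill just takes the ids, with -9 prepended when --force is set. The kill_argv helper and the pids are made up for illustration, and the argv is only printed rather than spawned.

fn kill_argv(pids: &[u32], force: bool, windows: bool) -> Vec<String> {
    let mut argv: Vec<String> = Vec::new();
    if windows {
        argv.push("taskkill".into());
        if force {
            argv.push("/F".into());
        }
        // one /PID flag per process id
        for pid in pids {
            argv.push("/PID".into());
            argv.push(pid.to_string());
        }
    } else {
        argv.push("kill".into());
        if force {
            argv.push("-9".into());
        }
        argv.extend(pids.iter().map(|pid| pid.to_string()));
    }
    argv
}

fn main() {
    assert_eq!(
        kill_argv(&[12345, 678], true, false),
        vec!["kill", "-9", "12345", "678"]
    );
    // To actually run it, the argv would feed std::process::Command as in kill() above:
    // let mut cmd = std::process::Command::new(&argv[0]); cmd.args(&argv[1..]);
    println!("{:?}", kill_argv(&[12345], false, cfg!(windows)));
}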

View File

@ -0,0 +1,97 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct Last;
#[derive(Deserialize)]
pub struct LastArgs {
rows: Option<Tagged<u64>>,
}
#[async_trait]
impl WholeStreamCommand for Last {
fn name(&self) -> &str {
"last"
}
fn signature(&self) -> Signature {
Signature::build("last").optional(
"rows",
SyntaxShape::Number,
"starting from the back, the number of rows to return",
)
}
fn usage(&self) -> &str {
"Show only the last number of rows."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
last(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Get the last row",
example: "echo [1 2 3] | last",
result: Some(vec![Value::from(UntaggedValue::from(BigInt::from(3)))]),
},
Example {
description: "Get the last three rows",
example: "echo [1 2 3 4 5] | last 3",
result: Some(vec![
UntaggedValue::int(3).into(),
UntaggedValue::int(4).into(),
UntaggedValue::int(5).into(),
]),
},
]
}
}
async fn last(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (LastArgs { rows }, input) = args.process(&registry).await?;
let v: Vec<_> = input.into_vec().await;
let rows_desired = if let Some(quantity) = rows {
*quantity
} else {
1
};
let mut values_vec_deque = VecDeque::new();
let count = rows_desired as usize;
if count < v.len() {
let k = v.len() - count;
for x in v[k..].iter() {
values_vec_deque.push_back(ReturnSuccess::value(x.clone()));
}
}
Ok(futures::stream::iter(values_vec_deque).to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Last;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Last {})
}
}
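
The core of last is simply "drop everything except the trailing n items". A small sketch over a plain Vec; the saturating_sub here is the sketch's own guard for the case where more rows are requested than exist, in which case everything is kept.

fn last_n<T: Clone>(values: &[T], n: usize) -> Vec<T> {
    // skip everything before the final n items
    let skip = values.len().saturating_sub(n);
    values[skip..].to_vec()
}

fn main() {
    let v = [1, 2, 3, 4, 5];
    assert_eq!(last_n(&v, 3), vec![3, 4, 5]); // echo [1 2 3 4 5] | last 3
    assert_eq!(last_n(&v, 1), vec![5]);       // the default when no count is given
    assert_eq!(last_n(&v, 9), vec![1, 2, 3, 4, 5]);
    println!("{:?}", last_n(&v, 3));
}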

View File

@ -0,0 +1,176 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
pub struct Lines;
#[async_trait]
impl WholeStreamCommand for Lines {
fn name(&self) -> &str {
"lines"
}
fn signature(&self) -> Signature {
Signature::build("lines")
}
fn usage(&self) -> &str {
"Split single string into rows, one per line."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
lines(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Split multi-line string into lines",
example: r#"^echo "two\nlines" | lines"#,
result: None,
}]
}
}
fn ends_with_line_ending(st: &str) -> bool {
let mut temp = st.to_string();
let last = temp.pop();
if let Some(c) = last {
c == '\n'
} else {
false
}
}
async fn lines(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let leftover = Arc::new(vec![]);
let leftover_string = Arc::new(String::new());
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let name_span = tag.span;
let eos = futures::stream::iter(vec![
UntaggedValue::Primitive(Primitive::EndOfStream).into_untagged_value()
]);
Ok(args
.input
.chain(eos)
.map(move |item| {
let mut leftover = leftover.clone();
let mut leftover_string = leftover_string.clone();
match item {
Value {
value: UntaggedValue::Primitive(Primitive::String(st)),
..
} => {
let st = (&*leftover_string).clone() + &st;
if let Some(leftover) = Arc::get_mut(&mut leftover) {
leftover.clear();
}
let mut lines: Vec<String> = st.lines().map(|x| x.to_string()).collect();
if !ends_with_line_ending(&st) {
if let Some(last) = lines.pop() {
if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
leftover_string.clear();
leftover_string.push_str(&last);
}
} else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
leftover_string.clear();
}
} else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
leftover_string.clear();
}
let success_lines: Vec<_> = lines
.iter()
.map(|x| ReturnSuccess::value(UntaggedValue::line(x).into_untagged_value()))
.collect();
futures::stream::iter(success_lines)
}
Value {
value: UntaggedValue::Primitive(Primitive::Line(st)),
..
} => {
let st = (&*leftover_string).clone() + &st;
if let Some(leftover) = Arc::get_mut(&mut leftover) {
leftover.clear();
}
let mut lines: Vec<String> = st.lines().map(|x| x.to_string()).collect();
if !ends_with_line_ending(&st) {
if let Some(last) = lines.pop() {
if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
leftover_string.clear();
leftover_string.push_str(&last);
}
} else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
leftover_string.clear();
}
} else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
leftover_string.clear();
}
let success_lines: Vec<_> = lines
.iter()
.map(|x| ReturnSuccess::value(UntaggedValue::line(x).into_untagged_value()))
.collect();
futures::stream::iter(success_lines)
}
Value {
value: UntaggedValue::Primitive(Primitive::EndOfStream),
..
} => {
if !leftover.is_empty() {
let mut st = (&*leftover_string).clone();
if let Ok(extra) = String::from_utf8((&*leftover).clone()) {
st.push_str(&extra);
}
// futures::stream::iter(vec![ReturnSuccess::value(
// UntaggedValue::string(st).into_untagged_value(),
// )])
if !st.is_empty() {
futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::string(&*leftover_string).into_untagged_value(),
)])
} else {
futures::stream::iter(vec![])
}
} else {
futures::stream::iter(vec![])
}
}
Value {
tag: value_span, ..
} => futures::stream::iter(vec![Err(ShellError::labeled_error_with_secondary(
"Expected a string from pipeline",
"requires string input",
name_span,
"value originates from here",
value_span,
))]),
}
})
.flatten()
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Lines;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Lines {})
}
}
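
The subtle part of lines is that string input arrives in chunks, so a chunk may end mid-line: complete lines are emitted immediately and the trailing partial line is carried over and prepended to the next chunk (the leftover_string handling above). A synchronous sketch of that buffering, without the async stream plumbing; split_chunk and the sample chunks are illustrative only.

fn split_chunk(leftover: &mut String, chunk: &str) -> Vec<String> {
    // prepend whatever was held back from the previous chunk
    let combined = format!("{}{}", leftover, chunk);
    leftover.clear();
    let mut lines: Vec<String> = combined.lines().map(str::to_string).collect();
    // If the chunk did not end with a newline, its last "line" is incomplete:
    // hold it back until more input (or the end of the stream) arrives.
    if !combined.ends_with('\n') {
        if let Some(partial) = lines.pop() {
            leftover.push_str(&partial);
        }
    }
    lines
}

fn main() {
    let mut leftover = String::new();
    let mut out = Vec::new();
    for chunk in ["two\nli", "nes\nand a third"] {
        out.extend(split_chunk(&mut leftover, chunk));
    }
    if !leftover.is_empty() {
        out.push(leftover.clone()); // end of stream: flush whatever is left
    }
    assert_eq!(out, vec!["two", "lines", "and a third"]);
    println!("{:?}", out);
}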

View File

@ -0,0 +1,106 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct Ls;
#[derive(Deserialize)]
pub struct LsArgs {
pub path: Option<Tagged<PathBuf>>,
pub all: bool,
pub full: bool,
#[serde(rename = "short-names")]
pub short_names: bool,
#[serde(rename = "with-symlink-targets")]
pub with_symlink_targets: bool,
#[serde(rename = "du")]
pub du: bool,
}
#[async_trait]
impl WholeStreamCommand for Ls {
fn name(&self) -> &str {
"ls"
}
fn signature(&self) -> Signature {
Signature::build("ls")
.optional(
"path",
SyntaxShape::Pattern,
"a path to get the directory contents from",
)
.switch("all", "also show hidden files", Some('a'))
.switch(
"full",
"list all available columns for each entry",
Some('f'),
)
.switch(
"short-names",
"only print the file names and not the path",
Some('s'),
)
.switch(
"with-symlink-targets",
"display the paths to the target files that symlinks point to",
Some('w'),
)
.switch(
"du",
"display the apparent directory size in place of the directory metadata size",
Some('d'),
)
}
fn usage(&self) -> &str {
"View the contents of the current or given path."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let ctrl_c = args.ctrl_c.clone();
let shell_manager = args.shell_manager.clone();
let (args, _) = args.process(&registry).await?;
shell_manager.ls(args, name, ctrl_c)
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "List all files in the current directory",
example: "ls",
result: None,
},
Example {
description: "List all files in a subdirectory",
example: "ls subdir",
result: None,
},
Example {
description: "List all rust files",
example: "ls *.rs",
result: None,
},
]
}
}
#[cfg(test)]
mod tests {
use super::Ls;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Ls {})
}
}

View File

@ -45,8 +45,8 @@ macro_rules! command {
stringify!($config_name)
}
fn config(&self) -> $crate::parser::registry::Signature {
$crate::parser::registry::Signature {
fn config(&self) -> $nu_parser::registry::Signature {
$nu_parser::registry::Signature {
name: self.name().to_string(),
positional: vec![$($mandatory_positional)*],
rest_positional: false,
@ -54,13 +54,13 @@ macro_rules! command {
is_sink: false,
named: {
use $crate::parser::registry::NamedType;
use $nu_parser::registry::NamedType;
#[allow(unused_mut)]
let mut named: indexmap::IndexMap<String, NamedType> = indexmap::IndexMap::new();
$(
named.insert(stringify!($named_param).to_string(), $crate::parser::registry::NamedType::$named_kind);
named.insert(stringify!($named_param).to_string(), $nu_parser::registry::NamedType::$named_kind);
)*
named
@ -114,7 +114,7 @@ macro_rules! command {
$($extract)* {
use std::convert::TryInto;
$args.get(stringify!($param_name)).clone().try_into()?
$args.get(stringify!($param_name)).try_into()?
}
}
);
@ -164,7 +164,7 @@ macro_rules! command {
$($extract)* {
use std::convert::TryInto;
$args.get(stringify!($param_name)).clone().try_into()?
$args.get(stringify!($param_name)).try_into()?
}
}
);
@ -214,7 +214,7 @@ macro_rules! command {
$($extract)* {
use std::convert::TryInto;
$args.get(stringify!($param_name)).clone().try_into()?
$args.get(stringify!($param_name)).try_into()?
}
}
);
@ -250,7 +250,7 @@ macro_rules! command {
Rest { $($rest)* }
Signature {
name: $config_name,
mandatory_positional: vec![ $($mandatory_positional)* $crate::parser::registry::PositionalType::mandatory_block(
mandatory_positional: vec![ $($mandatory_positional)* $nu_parser::registry::PositionalType::mandatory_block(
stringify!($param_name)
), ],
optional_positional: vec![ $($optional_positional)* ],
@ -305,7 +305,7 @@ macro_rules! command {
Rest { $($rest)* }
Signature {
name: $config_name,
mandatory_positional: vec![ $($mandatory_positional)* $crate::parser::registry::PositionalType::mandatory(
mandatory_positional: vec![ $($mandatory_positional)* $nu_parser::registry::PositionalType::mandatory(
stringify!($param_name), <$param_kind>::syntax_type()
), ],
optional_positional: vec![ $($optional_positional)* ],

View File

@ -0,0 +1,84 @@
use crate::commands::WholeStreamCommand;
use crate::data::value;
use crate::prelude::*;
use crate::utils::data_processing::map_max;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use num_traits::cast::ToPrimitive;
pub struct MapMaxBy;
#[derive(Deserialize)]
pub struct MapMaxByArgs {
column_name: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for MapMaxBy {
fn name(&self) -> &str {
"map-max-by"
}
fn signature(&self) -> Signature {
Signature::build("map-max-by").named(
"column_name",
SyntaxShape::String,
"the name of the column to map-max the table's rows",
Some('c'),
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the tables rows maxed by the column given."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
map_max_by(args, registry).await
}
}
pub async fn map_max_by(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (MapMaxByArgs { column_name }, mut input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
))
} else {
let map_by_column = if let Some(column_to_map) = column_name {
Some(column_to_map.item().clone())
} else {
None
};
match map_max(&values[0], map_by_column, name) {
Ok(table_maxed) => Ok(OutputStream::one(ReturnSuccess::value(table_maxed))),
Err(err) => Err(err),
}
}
}
#[cfg(test)]
mod tests {
use super::MapMaxBy;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(MapMaxBy {})
}
}

Some files were not shown because too many files have changed in this diff