* Creation of FilesystemShell with custom location may fail
* Replace ichwh with which
* fix: add ichwh again since it cannot be completely replaced
* fix: replace one more use of which
For example, when running the following:
crates/nu-cli/src
nushell currently parses this as an external command. Before running the command, we check to see if
it's a directory. If it is, we "auto cd" into that directory, otherwise we go through normal
external processing.
If we put a trailing slash on it though, shells typically interpret that as "user is explicitly
referencing directory". So
crates/nu-cli/src/
should not be interpreted as "run an external command". We intercept a trailing slash in the head
position of a command in a pipeline as such, and inject a `cd` internal command.
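A minimal sketch of the rule described above, using a hypothetical helper name (the real check lives in nu's pipeline handling):
```
// Hypothetical helper: a head word with a trailing slash is an explicit
// directory reference, so rewrite it to an internal `cd` instead of
// treating it as an external command.
fn rewrite_head(head: &str) -> String {
    if head.ends_with('/') || head.ends_with('\\') {
        format!("cd {}", head)
    } else {
        head.to_string()
    }
}

fn main() {
    assert_eq!(rewrite_head("crates/nu-cli/src/"), "cd crates/nu-cli/src/");
    assert_eq!(rewrite_head("cargo"), "cargo");
}
```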
* make trim apply to strings contained in tables (also at deeper nesting
levels), not just top-level strings
* remove unnecessary clone (thanks clippy)
* from-eml initial ver
* Adding tests for `from-eml`
* Add eml to prepares_and_decorates_filesystem_source_files
* Sort the file order
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
* fix: absolutize path against its parent if it was a symlink.
On Linux this happens because Rust calls readlink but doesn't canonicalize the resultant path.
* feat: playground function to create symlinks
* fix: use playground dirs
* feat: test for #1631, shift tests names
* refactor: expand_path and expand_ndots now work for any string.
* refactor: refactor tests and add new ones.
* refactor: convert expanded to owned string
* feat: pub export of expand_ndots
* feat: add completion for ndots in fs-shell
* Fix --headerless option of to-csv and to-tsv
Before to-csv --headerless split the "headerfull" output on lines,
skipped the first, and then concatenated them together. That meant
that all remaining output would be put on the same line, without
any delimiter, making it unusable. Now we replace the range of the
first line with the empty string, leaving the rest of the output
unchanged (see the sketch after this list).
* Remove extra space in indentation of to_delimited_data
* Add --separator <string> argument to to-csv
This functionality was already present, but wasn't exposed
to the command.
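A minimal sketch of the --headerless fix described above (hypothetical helper, not the actual to_delimited_data code):
```
// Blank out the header line in place instead of re-joining the remaining
// lines without any delimiter (the old, broken behaviour).
fn strip_header_line(mut output: String) -> String {
    if let Some(end) = output.find('\n') {
        // Replace the range of the first (header) line with the empty string.
        output.replace_range(..=end, "");
    }
    output
}

fn main() {
    let csv = String::from("name,size\nfoo,1\nbar,2\n");
    assert_eq!(strip_header_line(csv), "foo,1\nbar,2\n");
}
```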
* Refactor InputStream and affected commands.
First, making `values` private and leaning on the `Stream` implementation makes
consumers of `InputStream` less likely to have to change in the future, if we
change what an `InputStream` is internally (see the sketch after this list).
Second, we're dropping `Option<InputStream>` as the input to pipelines,
internals, and externals. Instead, `InputStream.is_empty` can be used to check
for "emptiness". Empty streams are typically only ever used as the first input
to a pipeline.
* Add run_external internal command.
We want to push external commands closer to internal commands, eventually
eliminating the concept of "external" completely. This means we can consolidate
a couple of things:
- Variable evaluation (for example, `$it`, `$nu`, alias vars)
- Behaviour of whole stream vs per-item external execution
It should also make it easier for us to start introducing argument signatures
for external commands.
* Update run_external.rs
* Update run_external.rs
* Update run_external.rs
* Update run_external.rs
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
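A hedged sketch of the resulting shape of `InputStream` (field names and bounds are illustrative, not the real nu-cli definition):
```
use std::pin::Pin;
use std::task::{Context, Poll};

use futures::stream::{BoxStream, Stream, StreamExt};

// `values` stays private; consumers go through the `Stream` impl, and an
// explicit emptiness flag replaces passing Option<InputStream> around.
pub struct InputStream<T> {
    values: BoxStream<'static, T>,
    empty: bool,
}

impl<T: Send + 'static> InputStream<T> {
    pub fn empty() -> Self {
        InputStream {
            values: futures::stream::empty().boxed(),
            empty: true,
        }
    }

    pub fn is_empty(&self) -> bool {
        self.empty
    }
}

impl<T> Stream for InputStream<T> {
    type Item = T;

    fn poll_next(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Option<T>> {
        // Delegate to the private inner stream.
        self.values.poll_next_unpin(cx)
    }
}
```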
* Better error message
* Use custom canonicalize in FileStructure build
* Better glob error in ls
* Use custom canonicalize, remove some duplicate code in cd.
* Enable recursive copying with patterns.
* Change test to fit new error message
* Test recursive with glob pattern
* Show that no matches were found in cp
* Fix typo in error message
* Change old canonicalize usage, follow newest changes
* Expand n dots early where tilde was also expanded.
* Remove normalize, not needed.
New function absolutize; it doesn't follow links nor check existence.
Renamed canonicalize_existing to canonicalize; it works as expected.
* Remove normalize usages, change canonicalize.
* Treat strings as paths
* Making Commands match what UntaggedValue needs
* WIP
* WIP
* WIP
* Moved to expressions for conditions
* Add 'each' command to use command blocks
* More cleanup
* Add test for 'each'
* Instead use an expression block
* New 'path' module under nu-cli.
Added normalize and canonicalize methods.
Added some unit tests.
* Replace old usages of normalize and canonicalize.
* Fix reading symlinks and existence logic.
* Better explained
1. Fixed a bug where `--all` wasn't showing files at the root directories.
2. More use of `Result`'s `map` and `map_err` methods.
3. Making tables be homogeneous so one can, for example, `get directories`.
Previously, if the user didn't have the appropriate permissions to execute the
binary/script, they would see "command not found", which is confusing.
This commit eliminates the `which` crate in favour of `ichwh`, which deals
better with permissions by not dealing with them at all! This is closer to the
behaviour of `which` in many shells. Permission checks are then left up to the
caller to deal with.
Mostly, making more use of `map` and `map_err` in `Result`. One benefit
is that at least one location had duplicated logic for how to map the
error, which is no longer the case after this commit.
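A hypothetical illustration of the pattern (not the exact call sites touched here):
```
use std::fs;
use std::path::Path;

// The error conversion lives in one map_err instead of being duplicated at
// every place the result is consumed.
fn read_trimmed(path: &Path) -> Result<String, String> {
    fs::read_to_string(path)
        .map(|contents| contents.trim().to_string())
        .map_err(|err| format!("could not read {}: {}", path.display(), err))
}

fn main() {
    match read_trimmed(Path::new("Cargo.toml")) {
        Ok(contents) => println!("{} bytes", contents.len()),
        Err(message) => eprintln!("{}", message),
    }
}
```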
An interruptible stream can query an `AtomicBool`. If that bool is true,
the stream will no longer produce any values.
Also introducing the `Interruptible` trait, which extends any `Stream`
with the `interruptible` function, to simplify the construction and
allow chaining.
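A hedged sketch of the shape of this change (not the exact nu source):
```
use std::pin::Pin;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::task::{Context, Poll};

use futures::stream::Stream;

// Wrap a stream together with a shared AtomicBool and stop yielding values
// once the flag is set.
pub struct InterruptibleStream<S> {
    inner: S,
    ctrl_c: Arc<AtomicBool>,
}

impl<S: Stream + Unpin> Stream for InterruptibleStream<S> {
    type Item = S::Item;

    fn poll_next(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Option<Self::Item>> {
        if self.ctrl_c.load(Ordering::SeqCst) {
            // The interrupt flag was set (e.g. by a Ctrl-C handler): end the stream.
            Poll::Ready(None)
        } else {
            Pin::new(&mut self.inner).poll_next(cx)
        }
    }
}

// Extension trait so any Stream can be wrapped via `.interruptible(flag)`.
pub trait Interruptible: Stream + Sized {
    fn interruptible(self, ctrl_c: Arc<AtomicBool>) -> InterruptibleStream<Self> {
        InterruptibleStream { inner: self, ctrl_c }
    }
}

impl<S: Stream> Interruptible for S {}
```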
* headers plugin
* Remove plugin
* Add non-functioning headers command
* Add ability to extract headers from first row
* Refactor header extraction
* Rebuild indexmap with proper headers
* Rebuild result properly
* Compiling, probably wrapped too much?
* Refactoring
* Deal with case of empty header cell
* Deal with case of empty header cell
* Fix formatting
* Fix linting, attempt 2.
* Move whole_stream_command(Headers) to more appropriate section
* ... more linting
* Return Err(ShellError...) instead of panic, yield each row instead of entire table
* Insert Column[index] if no header info is found.
* Update error description
* Add initial test
* Add tests for headers command
* Lint test cases in headers
* Change ShellError for headers, Add sample_headers file to utils.rs
* Add empty sheet to test file
* Revert "Add empty sheet to test file"
This reverts commit a4bf38a31d.
* Show error message when given empty table
This makes the `binaries` function respect the `CARGO_TARGET_DIR` environment variable when set. If it's not present it falls back to the regular target directory used by Cargo.
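A minimal sketch of the lookup (the real `binaries` helper lives in nu's test support; the fallback path here is illustrative):
```
use std::env;
use std::path::PathBuf;

// Honour CARGO_TARGET_DIR when it is set, otherwise fall back to the
// default `target` directory used by Cargo.
fn target_dir() -> PathBuf {
    env::var("CARGO_TARGET_DIR")
        .map(PathBuf::from)
        .unwrap_or_else(|_| PathBuf::from("target"))
}

fn main() {
    println!("binaries live under {}", target_dir().join("debug").display());
}
```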
* Fix '/' and '..' not being valid mv targets
If `/` or `../` is specified as the destination for `mv`, it will fail with an error message saying it's not a valid destination. This fixes it to account for the fact that `Path::file_name` returns `None` when the file name evaluates to `/` or `..`. It will only take the slow(er) path if `Path::file_name` returns `None` in its initial check.
Fixes #1291
* Add test
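The edge case in question, illustrated with the standard library:
```
use std::path::Path;

fn main() {
    // Path::file_name returns None when the path terminates in ".." or is
    // the root, which is what the destination check now accounts for.
    assert_eq!(Path::new("/").file_name(), None);
    assert_eq!(Path::new("../").file_name(), None);
    assert_eq!(Path::new("foo/bar").file_name().unwrap(), "bar");
}
```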
* Add error message for csv parsing failures
* Add csv error prettifier
* Improve readability of the error
`Line 2: error` is easier to understand than `Line 2, error`.
* Remove unnecessary use of the format! macro
Replacing it with .to_string() fixes a clippy warning
* Improve consistency with JSON parsing errors
* Custom canonicalize method for FilesystemShell.
* Use custom canonicalize method.
Fixed missing import.
* Move function body to already impl body.
* Create test that aims to resolve.
* Utility function to detect hidden folders.
Implemented for Unix and Windows.
* Rename function argument.
* Revert "Rename function argument."
This reverts commit e7ab70f0f0.
* Add flag '--all/-a' to Ls
* Rename function argument.
* Check if flag '--all/-a' is present and path is hidden.
Replace match with map_err for glob result.
Remove redundancy in stream body.
Included comments on new stream body.
Replace async_stream::stream with async_stream::try_stream.
Minor tweaks to is_empty_dir.
Fix and refactor is_hidden_dir (see the sketch after this list).
* Fix "implicit" bool coercion
* Fixed clippy errors
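A rough sketch of the idea behind is_hidden_dir (not the exact nu-cli code; the Windows attribute constant is written out here for illustration):
```
use std::path::Path;

#[cfg(unix)]
fn is_hidden_dir(path: &Path) -> bool {
    // On Unix, hidden entries are dot-prefixed names.
    path.file_name()
        .and_then(|name| name.to_str())
        .map(|name| name.starts_with('.'))
        .unwrap_or(false)
}

#[cfg(windows)]
fn is_hidden_dir(path: &Path) -> bool {
    use std::os::windows::fs::MetadataExt;
    const FILE_ATTRIBUTE_HIDDEN: u32 = 0x2;
    // On Windows, check the HIDDEN bit in the directory's file attributes.
    path.metadata()
        .map(|metadata| metadata.file_attributes() & FILE_ATTRIBUTE_HIDDEN != 0)
        .unwrap_or(false)
}

fn main() {
    println!(".git hidden? {}", is_hidden_dir(Path::new(".git")));
}
```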
For some commands, like `which`, the -h flag would trigger an error asking for
missing required parameters instead of printing the help message, as it
does with --help. This commit adds a check in the command parser to
avoid that.
* Fix deleting named pipes
* Use std::os::unix::fs::FileTypeExt to show correct type for unix-specific fs objects; Fix formatting
Co-authored-by: Linards Kalvāns <linards.kalvans@twino.eu>
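A small sketch of how the unix-specific file types can be distinguished (illustrative labels, not ls's exact output):
```
use std::fs::FileType;

#[cfg(unix)]
fn describe(file_type: FileType) -> &'static str {
    use std::os::unix::fs::FileTypeExt;
    if file_type.is_dir() {
        "Dir"
    } else if file_type.is_symlink() {
        "Symlink"
    } else if file_type.is_fifo() {
        // Named pipes previously fell through to the generic file case.
        "Pipe"
    } else if file_type.is_socket() {
        "Socket"
    } else {
        "File"
    }
}

#[cfg(unix)]
fn main() -> std::io::Result<()> {
    for entry in std::fs::read_dir(".")? {
        let entry = entry?;
        println!("{:?}: {}", entry.file_name(), describe(entry.file_type()?));
    }
    Ok(())
}

#[cfg(not(unix))]
fn main() {}
```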
The extra newline character makes it hard to use nu as part of an
external processing pipeline, since the extra character could taint the
results. For example:
```
$ nu -c 'echo test | xxd'
00000000: 7465 7374 test
```
versus
```
nu -c 'echo test' | xxd
00000000: 7465 7374 0a test.
```
* WIP: move to bytes codec
* Progress on adding collect helpers
* Progress on adding collect helpers
* Add in line splitting back to lines
* Lines outputting line primitives
* Close to ready?
* Finish fixing lines
* clippy fixes
* fmt fixes
* removed unused code
* Cleanup a few bits
* Cleanup a few bits
* Cleanup a few more bits
* Fix failing test with corrected test case
* Fix and refactor cd for Filesystem Shell.
Reorder check conditions, don't check existence twice.
If building for unix, check the exec bit on the folder.
* Import PermissionsExt only on unix target.
* It seems that this is the correct way?
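A hedged sketch of the unix exec-bit check (not the exact cd implementation):
```
use std::path::Path;

#[cfg(unix)]
fn can_enter(dir: &Path) -> bool {
    use std::os::unix::fs::PermissionsExt;
    // A directory can only be entered if at least one execute bit is set.
    dir.metadata()
        .map(|metadata| metadata.permissions().mode() & 0o111 != 0)
        .unwrap_or(false)
}

#[cfg(not(unix))]
fn can_enter(dir: &Path) -> bool {
    dir.is_dir()
}

fn main() {
    println!("can enter /tmp: {}", can_enter(Path::new("/tmp")));
}
```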
This improves incremental build time when working on what was previously
the root package. For example, previously all plugins would be rebuilt
with a change to `src/commands/classified/external.rs`, but now only
`nu-cli` will have to be rebuilt (and anything that depends on it).
In particular, one thing that we can't (properly) do before this commit
is consuming an infinite input stream. For example:
```
yes | grep y | head -n10
```
will give 10 "y"s in most shells, but blocks indefinitely in nu. This PR
resolves that by doing blocking I/O in threads, and reducing the `await`
calls we currently have in our pipeline code.
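A rough sketch of the approach (a standard-library channel is used for illustration; nu's actual plumbing differs):
```
use std::io::{BufRead, BufReader, Read};
use std::sync::mpsc;
use std::thread;

// Do the blocking read on a dedicated thread and hand lines over a channel,
// so the consumer (e.g. `head -n10`) can stop early without the whole
// pipeline blocking on the producer.
fn stream_lines(reader: impl Read + Send + 'static) -> mpsc::Receiver<String> {
    let (sender, receiver) = mpsc::channel();
    thread::spawn(move || {
        for line in BufReader::new(reader).lines() {
            match line {
                // A closed receiver means downstream is done; stop reading.
                Ok(line) => {
                    if sender.send(line).is_err() {
                        break;
                    }
                }
                Err(_) => break,
            }
        }
    });
    receiver
}

fn main() {
    let input: &[u8] = b"y\ny\ny\n";
    for (i, line) in stream_lines(input).into_iter().take(2).enumerate() {
        println!("{}: {}", i, line);
    }
}
```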
* Add very basic documentation. Need to play with the rest of the API to figure out what it does
* Add some documentation to more of the Plugin API methods
* fmt
* typo fixes
* Change signature to take in short-hand flags
* update help information
* Parse short-hand flags as their long counterparts
* lints
* Modified a couple tests to use shorthand flags
* Add some docs for meta.rs
* add better explanation for Span merging
* Add some doc tests - not sure how to get them to run
* get rid of doc comments for the temporary method
* add doc test for is_unknown
* fmt
* Refactor pipeline ahead of block changes. Add '-c' commandline option
* Update pipelining an error value
* Fmt
* Clippy
* Add stdin redirect for -c flag
* Add stdin redirect for -c flag
* Upgrade futures, async-stream, and futures_codec
These were the last three dependencies on futures-preview. `nu` itself
is now fully dependent on `futures@0.3`, as opposed to `futures-preview`
alpha.
Because the update to `futures` from `0.3.0-alpha.19` to `0.3.0` removed
the `Stream` implementation of `VecDeque` ([changelog][changelog]), most
commands that convert a `VecDeque` to an `OutputStream` broke and had to
be fixed.
The current solution is to now convert `VecDeque`s to a `Stream` via
`futures::stream::iter` (see the sketch after this list). However, it may be useful for `futures` to
create an `IntoStream` trait, implemented on the `std::collections` (or
really any `IntoIterator`). If something like this happens, it may be
worthwhile to update the trait implementations on `OutputStream` and
refactor these commands again.
While upgrading `futures_codec`, we remove a custom implementation of
`LinesCodec`, as one has been added to the library. There's also a small
refactor to make the stream output more idiomatic.
[changelog]: https://github.com/rust-lang/futures-rs/blob/master/CHANGELOG.md#030---2019-11-5
* Upgrade sys & ps plugin dependencies
They were previously dependent on `futures-preview`, and `nu_plugin_ps`
was dependent on an old version of `futures-timer`.
* Remove dependency on futures-timer from nu
* Update Cargo.lock
* Fix formatting
* Revert fmt regressions
CI is still on 1.40.0, but the latest rustfmt v1.41.0 has changes to the
`val @ pattern` syntax, causing the linting job to fail.
* Fix clippy warnings
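A minimal sketch of the conversion pattern referenced above:
```
use std::collections::VecDeque;

use futures::executor::block_on;
use futures::stream::{self, StreamExt};

fn main() {
    let mut values = VecDeque::new();
    values.push_back("a");
    values.push_back("b");

    // futures 0.3: VecDeque is no longer a Stream itself, so build one from
    // its IntoIterator implementation instead.
    let stream = stream::iter(values);

    block_on(async move {
        let collected: Vec<_> = stream.collect().await;
        assert_eq!(collected, vec!["a", "b"]);
    });
}
```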
* add some notes into README for more elaboration
* rewrite the overview
* remove unused first line
* add last part about tracing and debugging
* change the wording to make it easier to read
* Add example of metadata system
* Add contact information as other helpful links
* compute directory sizes from contained files and directories
* De-lint
* Revert "De-lint"
This reverts commit 9df9fc07d777014fef8f5749a84b4e52e1ee652a.
* Revert "compute directory sizes from contained files and directories"
This reverts commit d43583e9aa20438bd613f78a36e641c9fd48cae3.
* Nu du command
* Nu du for you
* Add async support
* Lints
* so much bug fixing
* Switch to using `shell`
Switch to using the shell for subprocess to enable more natural shelling out.
* Update external.rs
* This is a test with .shell() for external
* El pollo loco's PR
* co co co
* Attempt to fix windows
* Fmt
* Less is more?
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
Restructure and streamline token expansion
The purpose of this commit is to streamline the token expansion code, by
removing aspects of the code that are no longer relevant, removing
pointless duplication, and eliminating the need to pass the same
arguments to `expand_syntax`.
The first big-picture change in this commit is that instead of a handful
of `expand_` functions, which take a TokensIterator and ExpandContext, a
smaller number of methods on the `TokensIterator` do the same job.
The second big-picture change in this commit is fully eliminating the
coloring traits, making coloring a responsibility of the base expansion
implementations. This also means that the coloring tracer is merged into
the expansion tracer, so you can follow a single expansion and see how
the expansion process produced colored tokens.
One side effect of this change is that the expander itself is marginally
more error-correcting. The error correction works by switching from
structured expansion to `BackoffColoringMode` when an unexpected token
is found, which guarantees that all spans of the source are colored, but
may not be the most optimal error recovery strategy.
That said, because `BackoffColoringMode` only extends as far as a
closing delimiter (`)`, `]`, `}`) or pipe (`|`), it does result in a
fairly granular correction strategy.
The current code still produces an `Err` (plus a complete list of
colored shapes) from the parsing process if any errors are encountered,
but this could easily be addressed now that the underlying expansion is
error-correcting.
This commit also colors any spans that are syntax errors in red, and
causes the parser to include some additional information about what
tokens were expected at any given point where an error was encountered,
so that completions and hinting could be more robust in the future.
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
* WIP --help works for PerItemCommands.
* De-linting
* Add more comments (#1228)
* Add some more docs
* More docs
* More docs
* More docs (#1229)
* Add some more docs
* More docs
* More docs
* Add more docs
* External commands: wrap values that contain spaces in quotes (#1214) (#1220)
* External commands: wrap values that contain spaces in quotes (#1214)
* Add fn's argument_contains_whitespace & add_quotes (#1214); see the sketch after this list
* Fix formatting with cargo fmt
* Don't wrap argument in quotes when $it is already quoted (#1214)
* Implement --help for internal commands
* Externals now spawn independently. (#1230)
This commit changes the way we shell out externals when using the `"$it"` argument. It also pipes each row to the external's stdin if no `"$it"` argument is present for external commands.
Further separation of logic (preparing the external's command arguments, getting the data for piping, emitting values, spawning processes) will give us a better idea of the lower-level details of external commands until we can find the right abstractions for making them more generic and unifying them with the pipeline calling logic of Nu's internals and externals.
* Poll externals quicker. (#1231)
* WIP --help works for PerItemCommands.
* De-linting
* Implement --help for internal commands
* Make having --help the default
* Update test to include new default switch
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Koenraad Verheyden <mail@koenraadverheyden.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
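A hedged sketch of the quoting rule from the external-commands change above (function names follow the commit; the surrounding call site is hypothetical):
```
fn argument_contains_whitespace(argument: &str) -> bool {
    argument.chars().any(|c| c.is_whitespace())
}

fn add_quotes(argument: &str) -> String {
    format!("\"{}\"", argument)
}

// Wrap an argument in quotes only when it contains whitespace and is not
// already quoted (e.g. an $it value that was quoted upstream).
fn quote_if_needed(argument: &str) -> String {
    let already_quoted = argument.starts_with('"') && argument.ends_with('"');
    if argument_contains_whitespace(argument) && !already_quoted {
        add_quotes(argument)
    } else {
        argument.to_string()
    }
}

fn main() {
    assert_eq!(quote_if_needed("plain"), "plain");
    assert_eq!(quote_if_needed("has spaces"), "\"has spaces\"");
    assert_eq!(quote_if_needed("\"already quoted\""), "\"already quoted\"");
}
```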
* Clippy fixes
* Finish converting to use clippy
* fix warnings in new master
* fix windows
* fix windows
Co-authored-by: Artem Vorotnikov <artem@vorotnikov.me>
* start playing with ways to use the uniq command
* WIP
* Got uniq working, but still need to figure out args issue and add tests
* Add some tests for uniq
* fmt
* remove commented out code
* Add documentation and some additional tests showing uniq values and rows. Also removed args TODO
* add changes that didn't get committed
* whoops, I didn't save the docs correctly...
* fmt
* Add a test for uniq with nested json
* Add another test
* Fix unique-ness when json keys are out of order and make the test json more complicated
* Manifests check. Ignore doctests for now.
* We continue with refactorings towards the separation of concerns between
crates. The common test helpers used by `nu_plugin_inc` and `nu_plugin_str`
have been refactored into `nu-plugin` value test helpers.
Inc also uses the new API for integration tests.
This commit contains two improvements:
- Support for a Range syntax (and a corresponding Range value)
- Work towards a signature syntax
Implementing the Range syntax resulted in cleaning up how operators in
the core syntax works. There are now two kinds of infix operators
- tight operators (`.` and `..`)
- loose operators
Tight operators may not be interspersed (`$it.left..$it.right` is a
syntax error). Loose operators require whitespace on both sides of the
operator, and can be arbitrarily interspersed. Precedence is left to
right in the core syntax.
Note that delimited syntax (like `( ... )` or `[ ... ]`) is a single
token node in the core syntax. A single token node can be parsed from
beginning to end in a context-free manner.
The rule for `.` is `<token node>.<member>`. The rule for `..` is
`<token node>..<token node>`.
Loose operators all have the same syntactic rule: `<token
node><space><loose op><space><token node>`.
The second aspect of this pull request is the beginning of support for a
signature syntax. Before implementing signatures, a necessary
prerequisite is for the core syntax to support multi-line programs.
That work establishes a few things:
- `;` and newlines are handled in the core grammar, and both count as
"separators"
- line comments begin with `#` and continue until the end of the line
In this commit, multi-token productions in the core grammar can use
separators interchangeably with spaces. However, I think we will
ultimately want a different rule preventing separators from occurring
before an infix operator, so that the end of a line is always
unambiguous. This would avoid gratuitous differences between modules and
repl usage.
We already effectively have this rule, because otherwise `x<newline> |
y` would be a single pipeline, but of course that wouldn't work.
Previously, external words accidentally used
ExpansionRule::new().allow_external_command(), when it should have been
ExpansionRule::new().allow_external_word().
External words are the broadest category in the parser, and are the
appropriate category for external arguments. This was just a mistake.
Adds modules for internal, external, and dynamic commands, as well as
the pipeline functionality. These are exported as their old names from
the classified module so as to keep its "interface" the same.
`left =~ right` returns true if left contains right, using Rust's
`String::contains`. `!~` is the negated version.
A new `apply_operator` function is added which decouples evaluation from
`Value::compare`. This returns a `Value` and opens the door to
implementing `+` for example, though it wouldn't be useful immediately.
The `operator!` macro had to be changed slightly as it would choke on
`~` in arguments.
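A minimal sketch of the new comparison semantics (hypothetical helper; in nu the real logic sits in apply_operator):
```
// `=~` is substring containment via String::contains; `!~` is its negation.
// Other operators are out of scope for this sketch.
fn apply_match_operator(left: &str, operator: &str, right: &str) -> Option<bool> {
    match operator {
        "=~" => Some(left.contains(right)),
        "!~" => Some(!left.contains(right)),
        _ => None,
    }
}

fn main() {
    assert_eq!(apply_match_operator("nushell", "=~", "shell"), Some(true));
    assert_eq!(apply_match_operator("nushell", "!~", "bash"), Some(true));
    assert_eq!(apply_match_operator("nushell", "==", "nushell"), None);
}
```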
This commit extracts five new crates:
- nu-source, which contains the core source-code handling logic in Nu,
including Text, Span, and also the pretty.rs-based debug logic
- nu-parser, which is the parser and expander logic
- nu-protocol, which is the bulk of the types and basic conveniences
used by plugins
- nu-errors, which contains ShellError, ParseError and error handling
conveniences
- nu-textview, which is the textview plugin extracted into a crate
One of the major consequences of this refactor is that it's no longer
possible to `impl X for Spanned<Y>` outside of the `nu-source` crate, so
a lot of types became more concrete (Value became a concrete type
instead of Spanned<Value>, for example).
This also turned a number of inherent methods in the main nu crate into
plain functions (impl Value {} became a bunch of functions in the
`value` namespace in `crate::data::value`).
This commit extracts Tag, Span, Text, as well as source-related debug
facilities into a new crate called nu_source.
This change is much bigger than one might have expected because the
previous code relied heavily on implementing inherent methods on
`Tagged<T>` and `Spanned<T>`, which is no longer possible.
As a result, this change creates more concrete types instead of using
`Tagged<T>`. One notable example: Tagged<Value> became Value, and Value
became UntaggedValue.
This change clarifies the intent of the code in many places, but it does
make it a big change.