* Working towards a PoC for wasm
* Move bson and sqlite to plugins
* proof of concept now working
* tests are green
* Add CI test for --no-default-features
* Fix some tests
* Fix clippy and windows build
* More fixes
* Fix the windows build
* Fix the windows test
* fix: absolutize path against its parent if it was a symlink.
On Linux this happens because Rust calls readlink but doesn't canonicalize the resultant path.
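As a minimal sketch of the idea (not the actual nu code; the helper name is made up): join a relative readlink target onto the symlink's parent directory before using it.
```
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

// Illustrative helper: fs::read_link returns the raw symlink target, which
// may be relative, so join it onto the link's parent directory to make it
// absolute.
fn absolutize_symlink(link: &Path) -> io::Result<PathBuf> {
    let target = fs::read_link(link)?;
    if target.is_absolute() {
        Ok(target)
    } else {
        let parent = link.parent().unwrap_or_else(|| Path::new("."));
        Ok(parent.join(target))
    }
}

fn main() -> io::Result<()> {
    // e.g. if /tmp/link points at ../etc/hosts, resolve it against /tmp
    let resolved = absolutize_symlink(Path::new("/tmp/link"))?;
    println!("{}", resolved.display());
    Ok(())
}
```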
* feat: playground function to create symlinks
* fix: use playground dirs
* feat: test for #1631, shift test names
* Making Commands match what UntaggedValue needs
* WIP
* WIP
* WIP
* Moved to expressions for conditions
* Add 'each' command to use command blocks
* More cleanup
* Add test for 'each'
* Instead use an expression block
This makes the `binaries` function respect the `CARGO_TARGET_DIR` environment variable when set. If it's not present, it falls back to the regular target directory used by Cargo.
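As a rough sketch of the lookup-with-fallback (the helper name and profile handling here are illustrative, not the actual `binaries` implementation):
```
use std::env;
use std::path::PathBuf;

// Sketch only: prefer CARGO_TARGET_DIR when set, otherwise fall back to
// Cargo's default `target` directory. (A real helper would presumably also
// append the build profile, e.g. `debug` or `release`.)
fn target_dir() -> PathBuf {
    match env::var_os("CARGO_TARGET_DIR") {
        Some(dir) => PathBuf::from(dir),
        None => PathBuf::from("target"),
    }
}

fn main() {
    println!("{}", target_dir().join("debug").display());
}
```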
This improves incremental build time when working on what was previously
the root package. For example, previously all plugins would be rebuilt
with a change to `src/commands/classified/external.rs`, but now only
`nu-cli` will have to be rebuilt (and anything that depends on it).
In particular, one thing that we can't (properly) do before this commit
is consuming an infinite input stream. For example:
```
yes | grep y | head -n10
```
gives 10 "y"s in most shells but blocks indefinitely in nu. This PR
resolves that by doing the blocking I/O on threads and reducing the
`await` calls we currently have in our pipeline code.
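As a minimal, self-contained sketch of the approach (not nu's actual pipeline code): do the blocking reads on a dedicated thread and hand lines to the consumer over a channel, so a downstream consumer can stop early without the reader stalling everything.
```
use std::io::{self, BufRead, BufReader};
use std::process::{Command, Stdio};
use std::sync::mpsc;
use std::thread;

fn main() -> io::Result<()> {
    // Spawn an "infinite" producer, like `yes` in the shell example above.
    let mut child = Command::new("yes").stdout(Stdio::piped()).spawn()?;
    let stdout = child.stdout.take().expect("child stdout");

    // Do the blocking reads on a separate thread; send lines over a channel.
    let (tx, rx) = mpsc::sync_channel::<String>(64);
    thread::spawn(move || {
        for line in BufReader::new(stdout).lines() {
            match line {
                Ok(line) => {
                    if tx.send(line).is_err() {
                        break; // receiver stopped consuming
                    }
                }
                Err(_) => break,
            }
        }
    });

    // The consumer takes only what it needs, like `head -n10`.
    for line in rx.iter().take(10) {
        println!("{line}");
    }

    // Stop the producer once we've taken what we need.
    child.kill().ok();
    child.wait().ok();
    Ok(())
}
```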
* Switch to using `shell`
Switch to using the shell for subprocesses to enable more natural shelling out.
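Roughly what routing an external through the system shell can look like with `std::process::Command` (an illustrative sketch with a made-up command line, not the code in `external.rs`; Windows would go through `cmd /C` instead):
```
use std::process::Command;

fn main() -> std::io::Result<()> {
    // Hypothetical command; going through the shell lets globs, pipes, etc.
    // behave the way users expect.
    let status = Command::new("sh")
        .arg("-c")
        .arg("ls *.toml | sort")
        .status()?;
    println!("exited with: {status}");
    Ok(())
}
```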
* Update external.rs
* This is a test with .shell() for external
* El pollo loco's PR
* co co co
* Attempt to fix windows
* Fmt
* Less is more?
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
Restructure and streamline token expansion
The purpose of this commit is to streamline the token expansion code by
removing aspects of the code that are no longer relevant, removing
pointless duplication, and eliminating the need to pass the same
arguments to `expand_syntax`.
The first big-picture change in this commit is that instead of a handful
of `expand_` functions, which take a TokensIterator and ExpandContext, a
smaller number of methods on the `TokensIterator` do the same job.
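Schematically, the change looks something like the sketch below; the types and method names are simplified stand-ins, not nu's real signatures.
```
// Simplified, hypothetical stand-ins for illustration only.
struct ExpandContext;
struct Shape;

struct TokensIterator {
    context: ExpandContext,
}

// Before (roughly): free functions that each take both pieces of state:
//   fn expand_expr(tokens: &mut TokensIterator, context: &ExpandContext) -> Shape
//   fn expand_bare(tokens: &mut TokensIterator, context: &ExpandContext) -> Shape

// After (roughly): the iterator carries the context, so callers stop
// threading the same arguments through every call.
impl TokensIterator {
    fn expand_expr(&mut self) -> Shape {
        let _ctx = &self.context; // would consult the context internally
        Shape
    }

    fn expand_bare(&mut self) -> Shape {
        Shape
    }
}

fn main() {
    let mut tokens = TokensIterator { context: ExpandContext };
    let _expr = tokens.expand_expr();
    let _bare = tokens.expand_bare();
}
```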
The second big-picture change in this commit is fully eliminating the
coloring traits, making coloring a responsibility of the base expansion
implementations. This also means that the coloring tracer is merged into
the expansion tracer, so you can follow a single expansion and see how
the expansion process produced colored tokens.
One side effect of this change is that the expander itself is marginally
more error-correcting. The error correction works by switching from
structured expansion to `BackoffColoringMode` when an unexpected token
is found, which guarantees that all spans of the source are colored, but
may not be the optimal error recovery strategy.
That said, because `BackoffColoringMode` only extends as far as a
closing delimiter (`)`, `]`, `}`) or pipe (`|`), it does result in a
fairly granular correction strategy.
The current code still produces an `Err` (plus a complete list of
colored shapes) from the parsing process if any errors are encountered,
but this could easily be addressed now that the underlying expansion is
error-correcting.
This commit also colors any spans that are syntax errors in red, and
causes the parser to include some additional information about what
tokens were expected at any given point where an error was encountered,
so that completions and hinting could be more robust in the future.
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
This commit changes the way we shell out to externals when using the `"$it"` argument. It also pipes each row to the external's stdin if no `"$it"` argument is present for external commands.
Further separation of logic (preparing the external's command arguments, getting the data for piping, emitting values, spawning processes) will give us a better view of the lower-level details of external commands until we can find the right abstractions to make them more generic and unify them with the pipeline calling logic of Nu's internals and externals.
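As a rough, self-contained sketch of the per-row piping idea (not nu's actual abstraction; the external command and rows are made up): when there is no `"$it"` placeholder to substitute, each incoming row is written to the spawned external's stdin.
```
use std::io::{self, Write};
use std::process::{Command, Stdio};

fn main() -> io::Result<()> {
    // Stand-ins for rows coming out of the pipeline.
    let rows = ["alpha", "beta", "gamma"];

    // Spawn the external with a piped stdin so each row can be written to it.
    let mut child = Command::new("sort").stdin(Stdio::piped()).spawn()?;

    let mut stdin = child.stdin.take().expect("child stdin");
    for row in rows {
        // One line per row, mirroring "pipe per row to the external's stdin".
        writeln!(stdin, "{row}")?;
    }
    drop(stdin); // close the pipe so the external sees EOF

    child.wait()?;
    Ok(())
}
```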