* plugin: basic from_mp4 implementation
This patch introduces a very basic implementation of from_mp4, with only
a few bits of metadata available. The rest of the available metadata
(more than half of it) will be included in a later patch
* Mp4: Almost all track metadata is implemented
The only metadata not yet implemented is duration; it hits an odd
issue that I will look into later
* Mp4: All metadata fields implemented
All metadata fields that can be retrieved are now retrieved, with the
exception of duration for both tracks and the file itself, since there
is still an issue there. That will be fixed in upcoming patches
* fix: UntaggedValue::duration() serializes correctly now
Prior to this patch, using UntaggedValue::duration() would produce an
invalid JSONRPC string when used over the protocol. This patch fixes
that issue
* Mp4: Duration fixed for file and tracks
* plugins: Add plugin extra to src/plugins
* Mp4: Replace unwrap() with expect()
* Fix: Remove test mp4 file
We've relied on `clap` for bootstrapping our CLI app, figuring out the positionals, flags, and other convenient facilities. Nu has been capable of solving this problem for quite some time. For this and many more reasons (including the build time `clap` adds), we start working with our own here.
* add query json plugin for experimentation
* add some error handling
* closer but Kind::Array is still horked
* unravel the table so the output looks right
* clippy
* added the ability to use gjson modifiers
With the current code it is possible to attach custom commands from
a custom binary, but only for interactive mode. This change makes
it possible to also customize the evaluation context for commands
and scripts.
* Revert "History, more test coverage improvements, and refactorings. (#3217)"
This reverts commit 8fc8fc89aa.
* Add tests
* Refactor .nu-env
* Change logic of Config write to logic of read()
* Fix reload always appends to old vars
* Fix reload always takes last_modified of global config
* Add reload_config in evaluation context
* Reload config after writing to it in cfg set / cfg set_into
* Add --no-history to cli options
* Use --no-history in tests
* Add comment about maybe_print_errors
* Get ctrl_exit var from context.global_config
* Use context.global_config in command "config"
* Add Readme in engine how env vars are now handled
* Update docs from autoenv command
* Move history_path from engine to nu_data
* Move load history out of if
* No let before return
* Add import for indexmap
Overall improvements to Nu. Among the changes here, we can also be more confident about incorporating `3041`. End-to-end tests checking that envs are properly exported to externals are not added here (since they're in the other PR)
A few things added in this PR (probably forgetting some too):
* no writes happen to history during test runs.
* environment syncing end to end coverage added.
* clean up / refactorings few areas.
* testing API for finer control (can write tests passing more than one pipeline)
* can pass environment variables in tests that nu will inherit when running.
* No longer needed.
* no longer under a module. No need to use super.
* Playground infrastructure (tests, etc.) additions.
A few things to note:
* Nu can be started with a custom configuration file (`nu --config-file /path/to/sample_config.toml`). Useful for mocking the configuration on test runs.
* When given a custom configuration file, Nu will save any changes to the file supplied appropriately.
* The `$nu.config-path` variable shows either the default configuration file or the custom one, if given.
* We can now run end-to-end tests with finer-grained control (currently, since this is baseline work, standard out only). This will allow checking things like exit status, asserting the contents in a given format, etc.
* Remove (for another PR)
* move commands, futures.rs, script.rs, utils
* move over maybe_print_errors
* add nu_command crate references to nu_cli
* in commands.rs open up to pub mod from pub(crate)
* nu-cli, nu-command, and nu tests are now passing
* cargo fmt
* clean up nu-cli/src/prelude.rs
* code cleanup
* for some reason lex.rs was not formatted; this may be causing my error
* remove mod completion from lib.rs which was not being used along with quickcheck macros
* add in allow unused imports
* comment out one failing external test; comment out one failing internal test
* revert commenting out failing tests; something else might be going on; someone with a windows machine should check and see what is going on with these failing windows tests
* Update Cargo.toml
Extend the optional features to nu-command
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
* Begin allowing comments and multiline scripts.
* clippy
* Finish moving to groups. Test pass
* Keep going
* WIP
* WIP
* BROKEN WIP
* WIP
* WIP
* Fix more tests
* WIP: alias starts working
* Broken WIP
* Broken WIP
* Variables begin to work
* captures start working
* A little better but needs fixed scope
* Shorthand env setting
* Update main merge
* Broken WIP
* WIP
* custom command parsing
* Custom commands start working
* Fix coloring and parsing of block
* Almost there
* Add some tests
* Add more param types
* Bump version
* Fix benchmark
* Fix stuff
* first step of making selector
* wip
* wip tests working
* probably good enough for a first pass
* oops, missed something.
* and something else...
* grrrr version errors
We introduce the `plugin` nu subcommand (`nu plugin`) with basic plugin
loading support. We can choose to load plugins from a directory. This was originally
introduced to make integration tests faster (by not loading any plugins on startup at all),
but `nu plugin --load some_path ; test_pipeline_that_uses_plugins_just_loaded` does not see them.
Therefore, a `nu_with_plugins!` macro for tests was introduced on top of nu's `--skip-plugins`
executable switch, which is now set to true when running the integration tests that use the `nu!` macro.
This changeset contains everything that a separate binary needs to
register its own commands (including the new help function). It is
very possible that this commit misses other pub use exports, but
the contained ones work for our use cases so far.
This improves incremental build time when working on what was previously
the root package. For example, previously all plugins would be rebuilt
with a change to `src/commands/classified/external.rs`, but now only
`nu-cli` will have to be rebuilt (and anything that depends on it).
In particular, one thing that we can't (properly) do before this commit
is consuming an infinite input stream. For example:
```
yes | grep y | head -n10
```
will give 10 "y"s in most shells, but blocks indefinitely in nu. This PR
resolves that by doing blocking I/O in threads, and reducing the `await`
calls we currently have in our pipeline code.
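As a rough illustration of the approach (a minimal sketch, not Nu's actual pipeline code; the helper name here is made up for the example), blocking reads happen on a dedicated thread and are forwarded over a `futures` channel so the async side can consume them as a stream and stop early:
```
// Sketch only: read an external command's stdout on a thread and expose it
// to async code as a stream.
use std::io::{BufRead, BufReader};
use std::process::{Command, Stdio};

use futures::channel::mpsc;
use futures::executor::block_on;
use futures::stream::StreamExt;

fn spawn_line_stream(mut cmd: Command) -> mpsc::UnboundedReceiver<String> {
    let (tx, rx) = mpsc::unbounded();
    std::thread::spawn(move || {
        let mut child = cmd
            .stdout(Stdio::piped())
            .spawn()
            .expect("failed to spawn external command");
        let stdout = child.stdout.take().expect("no stdout handle");
        for line in BufReader::new(stdout).lines().flatten() {
            // Stop reading as soon as the consumer hangs up (e.g. `head` is
            // done); this is what keeps `yes | ... | head -n10` from blocking.
            if tx.unbounded_send(line).is_err() {
                break;
            }
        }
    });
    rx
}

fn main() {
    block_on(async {
        let lines = spawn_line_stream(Command::new("yes"));
        let first_ten: Vec<String> = lines.take(10).collect().await;
        println!("{}", first_ten.join("\n"));
    });
}
```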
* Create a function to create an empty directory entry
* Print an empty directory entry if permission is denied
* Fix rustfmt whitespace issues.
* Made metadata optional for `dir_entry_dict`.
Removed `empty_dir_entry_dict` as it's not needed anymore.
* typo fixes
* Change signature to take in short-hand flags
* update help information
* Parse short-hand flags as their long counterparts
* lints
* Modified a couple tests to use shorthand flags
* Add block size to du
* Change blocks to physical size
* Use path instead of strings for file/directory names
* Why don't I just use paths instead of strings anyway?
* shorten physical size and apparent size to physical and apparent resp.
* Refactor pipeline ahead of block changes. Add '-c' commandline option
* Update pipelining an error value
* Fmt
* Clippy
* Add stdin redirect for -c flag
* Add stdin redirect for -c flag
* Fixed mv not throwing error when the source path was invalid
* Fixed failing test
* Fixed another lint error
* Fix $PATH conflicts in .gitpod.Dockerfile (#1349)
- Use the correct user for gitpod Dockerfile.
- Remove unneeded packages (curl, rustc) from gitpod Dockerfile.
* Added test to check for the error
* Fixed linting error
* Fixed mv not moving files on Windows. (#1342)
Move files correctly in windows.
* Fixed mv not throwing error when the source path was invalid
* Fixed failing test
* Fixed another lint error
* Added test to check for the error
* Fixed linting error
* Changed error message
* Typo and fixed test
Co-authored-by: Sean Hellum <seanhellum45@gmail.com>
* Upgrade futures, async-stream, and futures_codec
These were the last three dependencies on futures-preview. `nu` itself
is now fully dependent on `futures@0.3`, as opposed to `futures-preview`
alpha.
Because the update to `futures` from `0.3.0-alpha.19` to `0.3.0` removed
the `Stream` implementation of `VecDeque` ([changelog][changelog]), most
commands that convert a `VecDeque` to an `OutputStream` broke and had to
be fixed.
The current solution is to now convert `VecDeque`s to a `Stream` via
`futures::stream::iter`. However, it may be useful for `futures` to
create an `IntoStream` trait, implemented on the `std::collections` (or
really any `IntoIterator`). If something like this happens, it may be
worthwhile to update the trait implementations on `OutputStream` and
refactor these commands again.
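A tiny sketch of the pattern described above, assuming nothing beyond `futures` 0.3 itself:
```
// futures 0.3 dropped the Stream impl for VecDeque, so collections get
// wrapped with futures::stream::iter before being turned into a stream.
use std::collections::VecDeque;

use futures::executor::block_on;
use futures::stream::{self, StreamExt};

fn main() {
    let values: VecDeque<i32> = (1..=3).collect();

    // stream::iter accepts any IntoIterator, so this works for VecDeque,
    // Vec, or anything else a command happens to build up.
    let output = stream::iter(values);

    let collected: Vec<i32> = block_on(output.collect());
    assert_eq!(collected, vec![1, 2, 3]);
}
```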
While upgrading `futures_codec`, we remove a custom implementation of
`LinesCodec`, as one has been added to the library. There's also a small
refactor to make the stream output more idiomatic.
[changelog]: https://github.com/rust-lang/futures-rs/blob/master/CHANGELOG.md#030---2019-11-5
* Upgrade sys & ps plugin dependencies
They were previously dependent on `futures-preview`, and `nu_plugin_ps`
was dependent on an old version of `futures-timer`.
* Remove dependency on futures-timer from nu
* Update Cargo.lock
* Fix formatting
* Revert fmt regressions
CI is still on 1.40.0, but the latest rustfmt v1.41.0 has changes to the
`val @ pattern` syntax, causing the linting job to fail.
* Fix clippy warnings
* Add ability to have numbers in plugin name. Plugin must start with alphabetic char
* remove the requirement that the first character be alphabetic
* Update cli.rs
Going ahead and changing to plus to prevent issue notryanb found
* Update cli.rs
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
* Adding kill command, unclean code
* Removing old comments
* Added quiet option, supports variable number of ids
* Made it per_item_command, calling commands directly without the shell
Moves the state changes for setting and removing environment variables
into the context's host as opposed to calling `std::env::*` directly
from anywhere else.
Introduced FakeHost to prevent environment state changes leaking
between unit tests and causing random test failures.
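A rough sketch of the idea (the trait and method names here are illustrative assumptions, not Nu's actual host API):
```
// Environment mutations go through the context's host. Tests can plug in a
// FakeHost that keeps variables in memory instead of touching the real
// process environment, so state cannot leak between unit tests.
use std::collections::HashMap;

trait Host {
    fn env_set(&mut self, key: &str, value: &str);
    fn env_rm(&mut self, key: &str);
    fn env_get(&self, key: &str) -> Option<String>;
}

#[derive(Default)]
struct FakeHost {
    vars: HashMap<String, String>,
}

impl Host for FakeHost {
    fn env_set(&mut self, key: &str, value: &str) {
        self.vars.insert(key.to_string(), value.to_string());
    }
    fn env_rm(&mut self, key: &str) {
        self.vars.remove(key);
    }
    fn env_get(&self, key: &str) -> Option<String> {
        self.vars.get(key).cloned()
    }
}

fn main() {
    let mut host = FakeHost::default();
    host.env_set("FOO", "bar");
    assert_eq!(host.env_get("FOO"), Some("bar".to_string()));
    host.env_rm("FOO");
    assert_eq!(host.env_get("FOO"), None);
}
```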
* `ls` will return error if no files/folders match path/pattern
* Revert changes to src/data/files.rs
* Add a name_only flag to dir_entry_dict
Add name_only flag to indicate if the caller only cares about filenames
or wants the whole path
* Update ls changes from feedback
* Little cleanup
* Resolve merge conflicts
* lints
* Add `--with-symlink-targets` option for the `ls` command that displays a new column for the target files of symlinks
* Fix clippy warning
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
* compute directory sizes from contained files and directories
* De-lint
* Revert "De-lint"
This reverts commit 9df9fc07d777014fef8f5749a84b4e52e1ee652a.
* Revert "compute directory sizes from contained files and directories"
This reverts commit d43583e9aa20438bd613f78a36e641c9fd48cae3.
* Nu du command
* Nu du for you
* Add async support
* Lints
* so much bug fixing
* Added attributes to from-xml command
* Added attributes as their own rows
* Removed unnecessary lifetime declarations
* from-xml now has children and attributes side by side
* Fixed tests and linting
* Fixed lint-problem
* Switch to using `shell`
Switch to using the shell for subprocesses to enable more natural shelling out.
* Update external.rs
* This is a test with .shell() for external
* El pollo loco's PR
* co co co
* Attempt to fix windows
* Fmt
* Less is more?
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
Restructure and streamline token expansion
The purpose of this commit is to streamline the token expansion code, by
removing aspects of the code that are no longer relevant, removing
pointless duplication, and eliminating the need to pass the same
arguments to `expand_syntax`.
The first big-picture change in this commit is that instead of a handful
of `expand_` functions, which take a TokensIterator and ExpandContext, a
smaller number of methods on the `TokensIterator` do the same job.
The second big-picture change in this commit is fully eliminating the
coloring traits, making coloring a responsibility of the base expansion
implementations. This also means that the coloring tracer is merged into
the expansion tracer, so you can follow a single expansion and see how
the expansion process produced colored tokens.
One side effect of this change is that the expander itself is marginally
more error-correcting. The error correction works by switching from
structured expansion to `BackoffColoringMode` when an unexpected token
is found, which guarantees that all spans of the source are colored, but
may not be the most optimal error recovery strategy.
That said, because `BackoffColoringMode` only extends as far as a
closing delimiter (`)`, `]`, `}`) or pipe (`|`), it does result in
a fairly granular correction strategy.
The current code still produces an `Err` (plus a complete list of
colored shapes) from the parsing process if any errors are encountered,
but this could easily be addressed now that the underlying expansion is
error-correcting.
This commit also colors any spans that are syntax errors in red, and
causes the parser to include some additional information about what
tokens were expected at any given point where an error was encountered,
so that completions and hinting could be more robust in the future.
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
Also, this commit makes `ls` a per-item command.
A command that processes things item by item may still take some time to stream
out the results from a single item. For example, `ls` on a directory with a lot
of files could be interrupted in the middle of showing all of these files.
* WIP --help works for PerItemCommands.
* De-linting
* Add more comments (#1228)
* Add some more docs
* More docs
* More docs
* More docs (#1229)
* Add some more docs
* More docs
* More docs
* Add more docs
* External commands: wrap values that contain spaces in quotes (#1214) (#1220)
* External commands: wrap values that contain spaces in quotes (#1214)
* Add fn's argument_contains_whitespace & add_quotes (#1214)
* Fix formatting with cargo fmt
* Don't wrap argument in quotes when $it is already quoted (#1214)
* Implement --help for internal commands
* Externals now spawn independently. (#1230)
This commit changes the way we shell out externals when using the `"$it"` argument. Also pipes per row to an external's stdin if no `"$it"` argument is present for external commands.
Further separation of logic (preparing the external's command arguments, getting the data for piping, emitting values, spawning processes) will give us a better idea for lower level details regarding external commands until we can find the right abstractions for making them more generic and unify within the pipeline calling logic of Nu internal's and external's.
* Poll externals quicker. (#1231)
* WIP --help works for PerItemCommands.
* De-linting
* Implement --help for internal commands
* Make having --help the default
* Update test to include new default switch
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Koenraad Verheyden <mail@koenraadverheyden.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
* Put a sample_data.ods file for testing
This is a copy of the sample_data.xlsx file but in ods format
* Add the from-ods command
Most of the work was doing `rg xlsx` and then copy/paste with light editing
* Add tests for the from-ods command
* Fix failing test
The problem was improper filename sorting in the test `prepares_and_decorates_filesystem_source_files`
* Detect built-in commands passed as args to `which`
This expands the built-in `which` command to detect nushell commands
that may have the same name as a binary in the path.
* Allow which to interpret multiple arguments
Previously, it would discard any argument besides the first. This allows
`which` to process multiple arguments. It also makes the output a stream
of rows.
* Use map to build the output
* Add boolean column for builtins
* Use macros for entry creation shortcuts
* Process command args and use async_stream
In order to use `ichwh`, I'll need to use async_stream. But in order to
avoid lifetime errors with that, I have to process the command args
before using them. I'll admit I don't fully understand what is going on
with the `args.process(...)` function, but it works.
* Use `ichwh` for path searching
This commit transitions from `which` to `ichwh`. The path search is now
done asynchronously.
* Enable the `--all` flag on `which`
* Make `which` respect external commands
Escaped commands passed to `which` (e.g., `which "^ls"`) are now searched
before builtins.
* Fix clippy warnings
This commit resolves two warnings from clippy, in light of #1142.
* Update Cargo.lock to get new `ichwh` version
`ichwh@0.2.1` has support for local paths.
* Add documentation for command
* Clippy fixes
* Finish converting to use clippy
* fix warnings in new master
* fix windows
* fix windows
Co-authored-by: Artem Vorotnikov <artem@vorotnikov.me>
* start playing with ways to use the uniq command
* WIP
* Got uniq working, but still need to figure out args issue and add tests
* Add some tests for uniq
* fmt
* remove commented out code
* Add documentation and some additional tests showing uniq values and rows. Also removed args TODO
* add changes that didn't get committed
* whoops, I didn't save the docs correctly...
* fmt
* Add a test for uniq with nested json
* Add another test
* Fix unique-ness when json keys are out of order and make the test json more complicated
* Manifests check. Ignore doctests for now.
* We continue with refactorings towards the separation of concerns between
crates. The common test-helper usage of `nu_plugin_inc` and `nu_plugin_str`
has been refactored into `nu-plugin` value test helpers.
Inc also uses the new API for integration tests.
This provides a huge performance boost for pipelines that end in an
external command. Rough testing shows an improvement from roughly 400ms
to 30ms when `cat`-ing a large file.
Add tests for ~tilde expansion:
- test that "~" is expanded (no more "~" in output)
- ensure that "1~1" is not expanded to "1/home/user1" as it was
before
Fixes #972
Note: the first test does not check the literal expansion because
the path on Windows is expanded as a Linux path, but the correct
expansion may come for free once `shellexpand` uses the `dirs`
crate too (https://github.com/netvl/shellexpand/issues/3).
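A quick sketch of the behavior the tests assert, calling the `shellexpand` crate directly:
```
// Only a leading "~" is treated as the home directory; a "~" embedded in a
// word like "1~1" is left untouched, which is what the new tests check.
fn main() {
    let home = shellexpand::tilde("~");
    println!("~ expands to {}", home);

    let untouched = shellexpand::tilde("1~1");
    assert_eq!(untouched, "1~1");
}
```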
This commit contains two improvements:
- Support for a Range syntax (and a corresponding Range value)
- Work towards a signature syntax
Implementing the Range syntax resulted in cleaning up how operators in
the core syntax work. There are now two kinds of infix operators:
- tight operators (`.` and `..`)
- loose operators
Tight operators may not be interspersed (`$it.left..$it.right` is a
syntax error). Loose operators require whitespace on both sides of the
operator, and can be arbitrarily interspersed. Precedence is left to
right in the core syntax.
Note that delimited syntax (like `( ... )` or `[ ... ]`) is a single
token node in the core syntax. A single token node can be parsed from
beginning to end in a context-free manner.
The rule for `.` is `<token node>.<member>`. The rule for `..` is
`<token node>..<token node>`.
Loose operators all have the same syntactic rule: `<token
node><space><loose op><space><token node>`.
The second aspect of this pull request is the beginning of support for a
signature syntax. Before implementing signatures, a necessary
prerequisite is for the core syntax to support multi-line programs.
That work establishes a few things:
- `;` and newlines are handled in the core grammar, and both count as
"separators"
- line comments begin with `#` and continue until the end of the line
In this commit, multi-token productions in the core grammar can use
separators interchangeably with spaces. However, I think we will
ultimately want a different rule preventing separators from occurring
before an infix operator, so that the end of a line is always
unambiguous. This would avoid gratuitous differences between modules and
repl usage.
We already effectively have this rule, because otherwise `x<newline> |
y` would be a single pipeline, but of course that wouldn't work.
Adds modules for internal, external, and dynamic commands, as well as
the pipeline functionality. These are exported as their old names from
the classified module so as to keep its "interface" the same.
`left =~ right` returns true if left contains right, using Rust's
`String::contains`. `!~` is the negated version.
A new `apply_operator` function is added which decouples evaluation from
`Value::compare`. This returns a `Value` and opens the door to
implementing `+` for example, though it wouldn't be useful immediately.
The `operator!` macro had to be changed slightly as it would choke on
`~` in arguments.
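A simplified sketch of the evaluation side (types are stripped down for illustration, not Nu's real `Value`):
```
// Stripped-down illustration of decoupling evaluation from Value::compare:
// apply_operator matches on the operator and returns a new Value, so `=~`
// and `!~` can reuse Rust's String::contains.
#[derive(Debug, PartialEq)]
enum Value {
    Boolean(bool),
}

enum Operator {
    Contains,
    NotContains,
}

fn apply_operator(op: Operator, left: &str, right: &str) -> Value {
    match op {
        Operator::Contains => Value::Boolean(left.contains(right)),
        Operator::NotContains => Value::Boolean(!left.contains(right)),
    }
}

fn main() {
    assert_eq!(
        apply_operator(Operator::Contains, "nushell", "shell"),
        Value::Boolean(true)
    );
    assert_eq!(
        apply_operator(Operator::NotContains, "nushell", "bash"),
        Value::Boolean(true)
    );
}
```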
This commit extracts five new crates:
- nu-source, which contains the core source-code handling logic in Nu,
including Text, Span, and also the pretty.rs-based debug logic
- nu-parser, which is the parser and expander logic
- nu-protocol, which is the bulk of the types and basic conveniences
used by plugins
- nu-errors, which contains ShellError, ParseError and error handling
conveniences
- nu-textview, which is the textview plugin extracted into a crate
One of the major consequences of this refactor is that it's no longer
possible to `impl X for Spanned<Y>` outside of the `nu-source` crate, so
a lot of types became more concrete (Value became a concrete type
instead of Spanned<Value>, for example).
This also turned a number of inherent methods in the main nu crate into
plain functions (impl Value {} became a bunch of functions in the
`value` namespace in `crate::data::value`).
After the previous commit, nushell uses PrettyDebug and
PrettyDebugWithSource for our pretty-printed display output.
PrettyDebug produces a structured `pretty.rs` document rather than
writing directly into a fmt::Formatter, and types that implement
`PrettyDebug` have a convenience `display` method that produces a string
(to be used in situations where `Display` is needed for compatibility
with other traits, or where simple rendering is appropriate).
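A heavily simplified sketch of the convenience described above (the real trait builds a pretty.rs document rather than a plain `String`):
```
// Simplified illustration: a structured debug trait with a default `display`
// helper, used where plain Display-style output is still needed.
trait PrettyDebug {
    fn pretty(&self) -> String;

    // Convenience method for contexts that only want a rendered string.
    fn display(&self) -> String {
        self.pretty()
    }
}

struct Span {
    start: usize,
    end: usize,
}

impl PrettyDebug for Span {
    fn pretty(&self) -> String {
        format!("span[{}..{}]", self.start, self.end)
    }
}

fn main() {
    let span = Span { start: 3, end: 10 };
    println!("{}", span.display());
}
```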
This commit extracts Tag, Span, Text, as well as source-related debug
facilities into a new crate called nu_source.
This change is much bigger than one might have expected because the
previous code relied heavily on implementing inherent methods on
`Tagged<T>` and `Spanned<T>`, which is no longer possible.
As a result, this change creates more concrete types instead of using
`Tagged<T>`. One notable example: Tagged<Value> became Value, and Value
became UntaggedValue.
This change clarifies the intent of the code in many places, but it does
make it a big change.
Fixes #969, admittedly without a --delimiter alias
Moves from_structured_data.rs to from_delimited_data.rs to better
identify its scope, and adds to_delimited_data.rs. Now csv and tsv both
use the same code: tsv passes in a fixed '\t' argument where csv passes
in the value of --separator
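A minimal sketch of the sharing described here (the real from_delimited_data.rs works on Nu values; this only shows the separator being threaded through):
```
// Minimal illustration: one delimited parser takes the separator as an
// argument; csv forwards the value of --separator, tsv hard-codes '\t'.
fn from_delimited(input: &str, separator: char) -> Vec<Vec<String>> {
    input
        .lines()
        .map(|line| line.split(separator).map(|field| field.to_string()).collect())
        .collect()
}

fn from_csv(input: &str, separator: char) -> Vec<Vec<String>> {
    from_delimited(input, separator)
}

fn from_tsv(input: &str) -> Vec<Vec<String>> {
    from_delimited(input, '\t')
}

fn main() {
    let csv_rows = from_csv("name,size\nCargo.toml,1024", ',');
    let tsv_rows = from_tsv("name\tsize\nCargo.toml\t1024");
    assert_eq!(csv_rows, tsv_rows);
}
```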
`save` attempts to convert input based on the target filename extension,
and expects a stream of text otherwise. However, the previous error message was
unclear and provided little guidance; hopefully this is less confusing
to new users.
It might be worthwhile to also add a hint about adding an extension,
though I'm not sure if it's possible to emit multiple diagnostics.
tables and able to work with them for data processing & viewing
purposes. At the moment, among the ways to process said tables, we
are able to view a histogram of a given column.
As usage matures, we may find certain core commands that could
be used ergonomically when working with tables on Nu.
The functions for retrieving, replacing, and inserting values into values all assumed they got the complete
column path as regular tagged strings. This commit changes these to accept tagged values instead. Basically
it means we can have column paths containing strings and numbers (e.g. package.authors.1).
Unfortunately, for the moment, all members, when parsed and deserialized for a command that expects column paths
of tagged values, will arrive as tagged values (encapsulating Members) that are strings only.
This makes it impossible to determine whether package.authors.1 or package.authors."1" (meaning the "number" 1) is
a string member or a number member, and thus prevents us from enforcing that paths enclosed in double
quotes mean "retrieve the column with this name from the table" and that bare numbers retrieve a particular row
from a table.
This commit sets in place the infrastructure needed when integer members land; in the meantime, the workaround
is to convert the tagged values passed from the column paths back to strings.
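A hedged sketch of the distinction being set up (names are illustrative, not the real parser types):
```
// A column-path member is either a column name or a row index. A quoted
// segment like "1" should stay a string column, while a bare 1 should become
// a row number once integer members land.
#[derive(Debug, PartialEq)]
enum Member {
    String(String),
    Int(u64),
}

fn parse_member(raw: &str) -> Member {
    if raw.starts_with('"') && raw.ends_with('"') && raw.len() >= 2 {
        Member::String(raw.trim_matches('"').to_string())
    } else if let Ok(row) = raw.parse::<u64>() {
        Member::Int(row)
    } else {
        Member::String(raw.to_string())
    }
}

fn main() {
    assert_eq!(parse_member("authors"), Member::String("authors".into()));
    assert_eq!(parse_member("1"), Member::Int(1));
    assert_eq!(parse_member("\"1\""), Member::String("1".into()));
}
```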
The original purpose of this PR was to modernize the external parser to
use the new Shape system.
This commit does include some of that change, but a more important
aspect of this change is an improvement to the expansion trace.
A previous commit (6a7c00ea) added trace infrastructure to the syntax coloring
feature. This commit adds tracing to the expander.
The bulk of that work, in addition to the tree builder logic, was an
overhaul of the formatter traits to make them more general purpose, and
more structured.
Some highlights:
- `ToDebug` was split into two traits (`ToDebug` and `DebugFormat`)
because implementations needed to become objects, but a convenience
method on `ToDebug` didn't qualify
- `DebugFormat`'s `fmt_debug` method now takes a `DebugFormatter` rather
than a standard formatter, and `DebugFormatter` has a new (but still
limited) facility for structured formatting.
- Implementations of `ExpandSyntax` need to produce output that
implements `DebugFormat`.
Unlike the highlighter changes, these changes are fairly focused in the
trace output, so these changes aren't behind a flag.
Adds a new substr function to the str plugin, with tests and documentation.
The function takes a start/end location as a string in the form "##,##"; both sides of the comma are optional, and
it behaves like Rust's own index operator [##..##].
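An illustrative sketch of how a "##,##" argument can be interpreted (not the plugin's exact code):
```
// Illustrative only: parse "start,end" where either side may be empty, then
// slice the input much like Rust's [start..end].
fn substr(input: &str, range: &str) -> String {
    let (start_raw, end_raw) = match range.split_once(',') {
        Some((start, end)) => (start, end),
        None => (range, ""),
    };
    let start = start_raw.parse::<usize>().unwrap_or(0);
    let end = end_raw.parse::<usize>().unwrap_or(input.len()).min(input.len());
    input
        .chars()
        .skip(start)
        .take(end.saturating_sub(start))
        .collect()
}

fn main() {
    assert_eq!(substr("andres", "0,3"), "and");
    assert_eq!(substr("andres", ",3"), "and");
    assert_eq!(substr("andres", "3,"), "res");
}
```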
a joy. Fundamentally we embrace functional programming principles for
transforming the dataset from any format picked up by Nu. These
table-processing "primitive" commands will build up and make pipelines
composable with data processing capabilities, allowing us to evaluate,
reduce, and map the tables, as far as even composing this declaratively.
In this regard, `split-by` expects some table with grouped data and we
can use it further in interesting ways (e.g. collecting labels for
visualizing the data in charts and/or suiting it to a particular chart
of our interest).