Commit Graph

19 Commits

Author SHA1 Message Date
Leonhard Kipp
3e6e3a207c
Feature/def signature with comments (#2905)
* Put parse_definition related funcs into own module

* Add failing lexer test

* Implement Parsing of definition signature

This commit changes how the signature of a function is parsed. Before, there
was a bit of "quick-and-dirty" string matching/parsing involved. Now a
signature is parsed a little more properly.
The grammar of a definition signature understood by these parsing functions is
as follows:
 `[ (parameter | flag | <eol>)* ]`
where
parameter is:
    `name (<:> type)? (<,> | <eol> | (#Comment <eol>))?`
flag is:
    `--name (-shortform)? (<:> type)? (<,> | <eol> | (#Comment <eol>))?`
(Note: no <,> is required after the last item.)
Note: It is now possible to pass comments to flags and parameters
Example:
[
  d:int          # The required d parameter
  --x (-x):string # The all powerful x flag
  --y (-y):int    # The accompanying y flag
]

(Sadly, there seems to be a bug in the lexer (or is this expected behaviour?)
because of which `--x(-x)` is treated as a single baseline token and is
therefore not correctly recognized as two. For now a space has to be inserted.)
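
As a rough, standalone illustration of the grammar above, here is a small Rust sketch that parses one signature item per line into either a parameter or a flag, with an optional type and comment. It is not nu-parser's actual code; `SigItem`, `parse_item`, and `split_type` are invented names for the example.

```rust
// A standalone sketch of the signature grammar above, one item per line:
// `name(:type)? (# comment)?` or `--name (-s)? (:type)? (# comment)?`.
// `SigItem` and `parse_item` are invented names, not nu-parser's own types.
#[derive(Debug)]
enum SigItem {
    Parameter { name: String, ty: Option<String>, comment: Option<String> },
    Flag { name: String, short: Option<char>, ty: Option<String>, comment: Option<String> },
}

fn parse_item(line: &str) -> Option<SigItem> {
    // Split off a trailing `# comment`, if any.
    let (head, comment) = match line.split_once('#') {
        Some((h, c)) => (h.trim(), Some(c.trim().to_string())),
        None => (line.trim(), None),
    };
    if head.is_empty() {
        return None; // blank line / <eol>
    }
    // A trailing `,` is optional after every item.
    let head = head.trim_end_matches(',');
    let mut parts = head.split_whitespace();
    let first = parts.next()?;

    if let Some(rest) = first.strip_prefix("--") {
        // Flag: `--name`, optional `(-s)` shorthand; `:type` may follow either token.
        let (name, mut ty) = split_type(rest);
        let mut short = None;
        if let Some(tok) = parts.next() {
            if let Some(s) = tok.strip_prefix("(-") {
                short = s.chars().next();
                if ty.is_none() {
                    ty = tok.split_once(':').map(|(_, t)| t.to_string());
                }
            }
        }
        Some(SigItem::Flag { name, short, ty, comment })
    } else {
        // Parameter: `name`, optional `:type`.
        let (name, ty) = split_type(first);
        Some(SigItem::Parameter { name, ty, comment })
    }
}

fn split_type(s: &str) -> (String, Option<String>) {
    match s.split_once(':') {
        Some((n, t)) => (n.to_string(), Some(t.to_string())),
        None => (s.to_string(), None),
    }
}

fn main() {
    let sig = "d:int            # The required d parameter\n\
               --x (-x):string  # The all powerful x flag\n\
               --y (-y):int     # The accompanying y flag";
    for item in sig.lines().filter_map(parse_item) {
        println!("{:?}", item);
    }
}
```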

During the implementation of the module, a question arose:
Should flag/parameter names be allowed to be type names?
Example case:
```shell
def f [ string ] { echo $string }
```
Currently, an error is thrown.

* Fix clippy lints

* Remove wrong comment

* Add spacing

* Add Cargo.lock
2021-01-12 06:53:58 +13:00
Jonathan Turner
c4daa2e40f
Add experimental new parser (#1554)
Move to an experimental new parser
2020-04-06 19:16:14 +12:00
Andrés N. Robalino
b36d21e76f
Infer types from regular delimited plain text unstructured files. (#1494)
* Infer types from regular delimited plain text unstructured files.

* Nothing resolves to an empty string.
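
As a hedged illustration of what inferring types from delimited plain-text fields can look like, here is a small, self-contained Rust sketch; it is not the actual nu implementation, and `Inferred` and `infer` are invented names. Each field is tried as an integer, then as a decimal, and otherwise kept as text.

```rust
// A simplified sketch of per-field type inference for delimited plain-text
// input; `Inferred` and `infer` are invented names, not the actual nu code.
#[derive(Debug)]
enum Inferred {
    Int(i64),
    Decimal(f64),
    Text(String),
}

fn infer(field: &str) -> Inferred {
    let trimmed = field.trim();
    if let Ok(i) = trimmed.parse::<i64>() {
        Inferred::Int(i)
    } else if let Ok(d) = trimmed.parse::<f64>() {
        Inferred::Decimal(d)
    } else {
        // Anything else (including an empty field) stays plain text.
        Inferred::Text(trimmed.to_string())
    }
}

fn main() {
    for field in "nu | 0.25 | 2020 |".split('|') {
        println!("{:?}", infer(field));
    }
}
```
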
2020-03-16 15:50:45 -05:00
Kevin DCR
b6363f3ce1
Added new flag '--all/-a' for Ls, also refactor some code (#1483)
* Utility function to detect hidden folders.
Implemented for Unix and Windows.

* Rename function argument.

* Revert "Rename function argument."

This reverts commit e7ab70f0f0.

* Add flag '--all/-a' to Ls

* Rename function argument.

* Check if flag '--all/-a' is present and path is hidden.
Replace match with map_err for glob result.
Remove redundancy in stream body.
Included comments on new stream body.
Replace async_stream::stream with async_stream::try_stream.
Minor tweaks to is_empty_dir.
Fix and refactor is_hidden_dir.

* Fix "implicit" bool coerse

* Fixed clippy errors
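
A minimal sketch of the hidden-path detection described above, with a Unix variant (leading dot) and a Windows variant (the hidden file attribute); the `is_hidden` name and the `--all`-style gating in `main` are illustrative, not the exact nu code.

```rust
// A sketch of hidden-path detection of the kind described above; `is_hidden`
// is an invented name and this is not the exact nu implementation.
use std::path::Path;

#[cfg(unix)]
fn is_hidden(path: &Path) -> bool {
    // On Unix, a leading dot marks a hidden entry.
    path.file_name()
        .and_then(|name| name.to_str())
        .map(|name| name.starts_with('.'))
        .unwrap_or(false)
}

#[cfg(windows)]
fn is_hidden(path: &Path) -> bool {
    // On Windows, check the FILE_ATTRIBUTE_HIDDEN bit (0x2) in the metadata.
    use std::os::windows::fs::MetadataExt;
    std::fs::metadata(path)
        .map(|meta| meta.file_attributes() & 0x2 != 0)
        .unwrap_or(false)
}

fn main() {
    // With an `--all/-a` style flag, hidden entries are only listed when
    // the flag is present.
    let show_all = false;
    for entry in [".git", "src", "Cargo.toml"] {
        let path = Path::new(entry);
        if show_all || !is_hidden(path) {
            println!("{}", path.display());
        }
    }
}
```
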
2020-03-14 06:27:04 +13:00
Andrés N. Robalino
a2443fbe02
Remove unused parsing logic. (#1473)
* Remove unused parsing logic.

* Run tokens iteration tests baseline.

* Pass lint.

* lifetimes can be elided without being explicit.
2020-03-10 04:31:42 -05:00
Corvus Corax
6d096206b6
Add support for compound shorthand flags (#1414)
* Break multicharacter shorthand flags into single character flags

* Remove shorthand flag test
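
A minimal sketch of the idea, assuming a multicharacter shorthand such as `-la` should expand to `-l -a` while long `--flags` pass through unchanged; the function name is invented and this is not the actual parser code.

```rust
// A sketch of breaking a multicharacter shorthand into single-character
// flags; `expand_shorthand` is an invented name for illustration.
fn expand_shorthand(arg: &str) -> Vec<String> {
    match arg.strip_prefix('-') {
        // Only plain shorthands are split; `--long` flags pass through.
        Some(body) if !body.starts_with('-') && body.len() > 1 => {
            body.chars().map(|c| format!("-{c}")).collect()
        }
        _ => vec![arg.to_string()],
    }
}

fn main() {
    assert_eq!(expand_shorthand("-la"), vec!["-l", "-a"]);
    assert_eq!(expand_shorthand("--all"), vec!["--all"]);
    println!("ok");
}
```
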
2020-03-01 13:20:42 +13:00
Andrés N. Robalino
18d988d4c8
Restrict short-hand flag detection to exact match. (#1406) 2020-02-18 01:58:30 -05:00
Andrés N. Robalino
73312b506f
Finer grained parsing and coloring command tail. (#1382) 2020-02-12 20:20:19 -05:00
Corvus Corax
c0be02a434
Short-hand flags (#1378)
* typo fixes

* Change signature to take in short-hand flags

* update help information

* Parse short-hand flags as their long counterparts

* lints

* Modified a couple tests to use shorthand flags
2020-02-11 18:24:31 -08:00
Andrés N. Robalino
3610baa227
Default plugins are independent and called from Nu. (#1322) 2020-01-31 17:45:33 -05:00
Jonathan Turner
798a24eda5
Soften restrictions for external parameters (#1277)
* Soften restrictions for external parameters

* Add test
2020-01-25 08:14:49 +13:00
Yehuda Katz
7efb31a4e4 Restructure and streamline token expansion (#1123)
Restructure and streamline token expansion

The purpose of this commit is to streamline the token expansion code, by
removing aspects of the code that are no longer relevant, removing
pointless duplication, and eliminating the need to pass the same
arguments to `expand_syntax`.

The first big-picture change in this commit is that instead of a handful
of `expand_` functions, which take a TokensIterator and ExpandContext, a
smaller number of methods on the `TokensIterator` do the same job.

The second big-picture change in this commit is fully eliminating the
coloring traits, making coloring a responsibility of the base expansion
implementations. This also means that the coloring tracer is merged into
the expansion tracer, so you can follow a single expansion and see how
the expansion process produced colored tokens.

One side effect of this change is that the expander itself is marginally
more error-correcting. The error correction works by switching from
structured expansion to `BackoffColoringMode` when an unexpected token
is found, which guarantees that all spans of the source are colored, but
may not be the optimal error-recovery strategy.

That said, because `BackoffColoringMode` only extends as far as a
closing delimiter (`)`, `]`, `}`) or pipe (`|`), it does result in
a fairly granular correction strategy.
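
As a simplified illustration of that backoff behaviour (invented types, not nu-parser's): once an unexpected token is hit, everything up to the next closing delimiter or pipe is marked as an error shape, so every span still ends up colored.

```rust
// A simplified illustration of the backoff behaviour: once an unexpected
// token is hit, everything up to the next closing delimiter or pipe is
// flagged, so every span still receives a color. `Shape` and
// `backoff_color` are invented names, not nu-parser's own types.
#[derive(Clone, Debug, PartialEq)]
enum Shape {
    Ok,
    Error,
}

fn backoff_color(tokens: &[&str], error_at: usize) -> Vec<Shape> {
    let mut shapes = vec![Shape::Ok; error_at];
    for tok in &tokens[error_at..] {
        shapes.push(Shape::Error);
        if matches!(*tok, ")" | "]" | "}" | "|") {
            break; // backoff stops at a closing delimiter or pipe
        }
    }
    // Tokens after the delimiter would be expanded normally again.
    shapes.resize(tokens.len(), Shape::Ok);
    shapes
}

fn main() {
    let tokens = ["where", "size", ">", "]", "|", "first"];
    // Pretend the token at index 2 was unexpected.
    println!("{:?}", backoff_color(&tokens, 2));
}
```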

The current code still produces an `Err` (plus a complete list of
colored shapes) from the parsing process if any errors are encountered,
but this could easily be addressed now that the underlying expansion is
error-correcting.

This commit also colors any spans that are syntax errors in red, and
causes the parser to include some additional information about what
tokens were expected at any given point where an error was encountered,
so that completions and hinting could be more robust in the future.

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2020-01-21 17:45:03 -05:00
Jonathan Turner
77d856fd53
Last unwraps (#1160)
* Work through most of the last unwraps

* Finish removing unwraps
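
For illustration, the kind of change involved in removing unwraps usually looks like the sketch below: a call that would panic on bad input is replaced with a `Result` the caller can handle. The function names are invented, not taken from the actual diff.

```rust
// Illustration of the kind of change involved: a call that panics on bad
// input becomes a propagated error. The function names are invented.
use std::num::ParseIntError;

// Before: panics if the input is not a number.
fn parse_width_unwrapping(input: &str) -> usize {
    input.trim().parse().unwrap()
}

// After: the caller decides how to handle the failure.
fn parse_width(input: &str) -> Result<usize, ParseIntError> {
    input.trim().parse()
}

fn main() {
    println!("{:?}", parse_width("80"));
    println!("{:?}", parse_width("eighty"));
    println!("{}", parse_width_unwrapping("80"));
}
```
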
2020-01-04 19:44:17 +13:00
Jonathan Turner
5919c6c433
Remove unwraps (#1153)
* Remove a batch of unwraps

* finish another batch
2020-01-04 10:11:21 +13:00
Jonathan Turner
72838cc083
Move to using clippy (#1142)
* Clippy fixes

* Finish converting to use clippy

* fix warnings in new master

* fix windows

* fix windows

Co-authored-by: Artem Vorotnikov <artem@vorotnikov.me>
2019-12-31 20:36:08 +13:00
Jonathan Turner
68a314b5cb UTF8 fix for twitter-reported issue 2019-12-27 19:03:00 +13:00
Yehuda Katz
57af9b5040 Add Range and start Signature support
This commit contains two improvements:

- Support for a Range syntax (and a corresponding Range value)
- Work towards a signature syntax

Implementing the Range syntax resulted in cleaning up how operators in
the core syntax work. There are now two kinds of infix operators:

- tight operators (`.` and `..`)
- loose operators

Tight operators may not be interspersed (`$it.left..$it.right` is a
syntax error). Loose operators require whitespace on both sides of the
operator, and can be arbitrarily interspersed. Precedence is left to
right in the core syntax.

Note that delimited syntax (like `( ... )` or `[ ... ]`) is a single
token node in the core syntax. A single token node can be parsed from
beginning to end in a context-free manner.

The rule for `.` is `<token node>.<member>`. The rule for `..` is
`<token node>..<token node>`.

Loose operators all have the same syntactic rule: `<token
node><space><loose op><space><token node>`.
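
A toy sketch of the spacing rule for loose operators (whitespace required on both sides), using an invented helper and treating a comparison operator as a representative loose operator; this is not nu-parser code.

```rust
// A toy check of the spacing rule above: a loose operator requires
// whitespace on both sides, while tight operators (`.` and `..`) bind
// their operands directly. `is_loose_use` is an invented helper.
fn is_loose_use(source: &str, op: &str) -> Option<bool> {
    let idx = source.find(op)?;
    let spaced_before = source[..idx].ends_with(' ');
    let spaced_after = source[idx + op.len()..].starts_with(' ');
    Some(spaced_before && spaced_after)
}

fn main() {
    // A comparison operator used loosely vs. `.` used tightly.
    assert_eq!(is_loose_use("size > 10", ">"), Some(true));
    assert_eq!(is_loose_use("$it.name", "."), Some(false));
    println!("ok");
}
```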

The second aspect of this pull request is the beginning of support for a
signature syntax. Before implementing signatures, a necessary
prerequisite is for the core syntax to support multi-line programs.

That work establishes a few things:

- `;` and newlines are handled in the core grammar, and both count as
  "separators"
- line comments begin with `#` and continue until the end of the line

In this commit, multi-token productions in the core grammar can use
separators interchangeably with spaces. However, I think we will
ultimately want a different rule preventing separators from occurring
before an infix operator, so that the end of a line is always
unambiguous. This would avoid gratuitous differences between modules and
repl usage.

We already effectively have this rule, because otherwise `x<newline> |
y` would be a single pipeline, but of course that wouldn't work.
2019-12-11 16:41:07 -08:00
Yehuda Katz
4115634bfc Try to re-apply #1039 2019-12-02 11:02:58 -08:00
Yehuda Katz
e4226def16 Extract core stuff into own crates
This commit extracts five new crates:

- nu-source, which contains the core source-code handling logic in Nu,
  including Text, Span, and also the pretty.rs-based debug logic
- nu-parser, which is the parser and expander logic
- nu-protocol, which is the bulk of the types and basic conveniences
  used by plugins
- nu-errors, which contains ShellError, ParseError and error handling
  conveniences
- nu-textview, which is the textview plugin extracted into a crate

One of the major consequences of this refactor is that it's no longer
possible to `impl X for Spanned<Y>` outside of the `nu-source` crate, so
a lot of types became more concrete (Value became a concrete type
instead of Spanned<Value>, for example).

This also turned a number of inherent methods in the main nu crate into
plain functions (impl Value {} became a bunch of functions in the
`value` namespace in `crate::data::value`).
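
A schematic, invented example of that inherent-method-to-function move: the same behaviour exposed first as a method on the type and then as a plain function in a `value` module. The names and the `Value` shape here are placeholders, not the real nu types.

```rust
// Before the split: a method on the type, defined in the main crate.
struct Value(String);

impl Value {
    fn format_for_display(&self) -> String {
        self.0.to_uppercase()
    }
}

// After the split: a plain function in a `value` module, so downstream
// crates don't need to add impls to a type they don't own.
mod value {
    pub fn format_for_display(v: &super::Value) -> String {
        v.0.to_uppercase()
    }
}

fn main() {
    let v = Value("nu".to_string());
    println!("{}", v.format_for_display());
    println!("{}", value::format_for_display(&v));
}
```
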
2019-12-02 10:54:12 -08:00