nushell/src/parser/hir/tokens_iterator.rs
Yehuda Katz c2c10e2bc0 Overhaul the coloring system
This commit replaces the previous naive coloring system with a coloring
system that is more aligned with the parser.

The main benefit of this change is that it allows us to use parsing
rules to decide how to color tokens.

For example, consider the following syntax:

```
$ ps | where cpu > 10
```

Ideally, we could color `cpu` like a column name and not a string,
because `cpu > 10` is a shorthand block syntax that expands to
`{ $it.cpu > 10 }`.

The way that we know that it's a shorthand block is that the `where`
command declares that its first parameter is a `SyntaxShape::Block`,
which allows the shorthand block form.

In order to accomplish this, we need to color the tokens in a way that
corresponds to their expanded semantics, which means that high-fidelity
coloring requires expansion.

This commit adds a `ColorSyntax` trait that corresponds to the
`ExpandExpression` trait. The semantics are fairly similar, with a few
differences.

First, `ExpandExpression` consumes N tokens and returns a single
`hir::Expression`. `ColorSyntax` consumes N tokens and writes M
`FlatShape` tokens to the output.

Concretely, for syntax like `[1 2 3]`

- `ExpandExpression` takes a single token node and produces a single
  `hir::Expression`
- `ColorSyntax` takes the same token node and emits 7 `FlatShape`s
  (open delimiter, int, whitespace, int, whitespace, int, close
  delimiter)
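
To make the difference concrete, here is a rough sketch of how the two traits
relate. The real signatures also carry additional context parameters, so treat
the parameter lists below as illustrative rather than exact:

```
// Illustrative only: the actual traits take additional context parameters.
trait ExpandExpression {
    fn expand_expr(
        &self,
        token_nodes: &mut TokensIterator<'_>,
    ) -> Result<hir::Expression, ShellError>;
}

trait ColorSyntax {
    fn color_syntax(
        &self,
        token_nodes: &mut TokensIterator<'_>,
        shapes: &mut Vec<Tagged<FlatShape>>,
    );
}
```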

Second, `ColorSyntax` is more willing to plow through failures than
`ExpandExpression`.

In particular, consider syntax like

```
$ ps | where cpu >
```

In this case

- `ExpandExpression` will see that the `where` command is expecting a
  block, see that it's not a literal block and try to parse it as a
  shorthand block. It will successfully find a member followed by an
  infix operator, but not a following expression. That means that the
  entire pipeline part fails to parse and is a syntax error.
- `ColorSyntax` will also try to parse it as a shorthand block and
  ultimately fail, but it will fall back to "backoff coloring mode",
  which parses any unidentified tokens in an infallible, simple way. In
  this case, `cpu` will color as a string and `>` will color as an
  operator.
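
A sketch of what that backoff step could look like, using the `TokensIterator`
from this file (the helper name is hypothetical; the real fallback logic lives
in the pipeline coloring rules):

```
// Illustrative backoff loop: color whatever remains in a simple, total way.
fn backoff_coloring(
    token_nodes: &mut TokensIterator<'_>,
    shapes: &mut Vec<Tagged<FlatShape>>,
) {
    while !token_nodes.at_end_possible_ws() {
        // `color_any_token` is a hypothetical total rule: every token kind maps
        // to some FlatShape (bare words become strings, `>` becomes an operator)
        // and at least one token is consumed, so the loop always terminates.
        color_any_token(token_nodes, shapes);
    }
}
```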

Finally, it's very important that coloring a pipeline is infallible: it must
color the entire string, never fail, and never get stuck in an infinite loop.

In order to accomplish this, this PR separates `ColorSyntax`, which is
infallible, from `FallibleColorSyntax`, which might fail. This lets the type
system tell us whether our coloring rules bottom out at an infallible rule.
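
Sketched in the same simplified style, the only difference between the two
traits is whether a rule is allowed to report failure:

```
// Illustrative: a fallible rule may fail (and the caller rolls back via a
// checkpoint); an infallible rule must always emit shapes. Composite rules can
// only implement ColorSyntax if they eventually bottom out in an infallible
// rule such as backoff coloring.
trait FallibleColorSyntax {
    fn color_syntax(
        &self,
        token_nodes: &mut TokensIterator<'_>,
        shapes: &mut Vec<Tagged<FlatShape>>,
    ) -> Result<(), ShellError>;
}
```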

It's not perfect: it's still possible for the coloring process to get
stuck or consume tokens non-atomically. I intend to reduce the
opportunity for those problems in a future commit. In the meantime, the
current system catches a number of mistakes (like trying to use a
fallible coloring rule in a loop without thinking about the possibility
that it will never terminate).
2019-10-10 19:30:04 -07:00


pub(crate) mod debug;

use crate::errors::ShellError;
use crate::parser::TokenNode;
use crate::{Tag, Tagged, TaggedItem};

/// Iterates over a slice of `TokenNode`s while tracking which positions have
/// already been consumed (`seen`), so that tokens can be taken out of order
/// without being yielded twice.
#[derive(Debug)]
pub struct TokensIterator<'content> {
    tokens: &'content [TokenNode],
    tag: Tag,
    skip_ws: bool,
    index: usize,
    seen: indexmap::IndexSet<usize>,
}

/// A saved iterator position. If the checkpoint is dropped without `commit`
/// being called, the iterator's `index` and `seen` set are rolled back to the
/// state captured when the checkpoint was created.
#[derive(Debug)]
pub struct Checkpoint<'content, 'me> {
    pub(crate) iterator: &'me mut TokensIterator<'content>,
    index: usize,
    seen: indexmap::IndexSet<usize>,
    committed: bool,
}

impl<'content, 'me> Checkpoint<'content, 'me> {
    pub(crate) fn commit(mut self) {
        self.committed = true;
    }
}

impl<'content, 'me> std::ops::Drop for Checkpoint<'content, 'me> {
    fn drop(&mut self) {
        if !self.committed {
            self.iterator.index = self.index;
            self.iterator.seen = self.seen.clone();
        }
    }
}

/// The result of peeking: the next unseen token (or `None` at EOF) together
/// with the range of positions that will be marked as seen if the peek is
/// committed.
#[derive(Debug)]
pub struct Peeked<'content, 'me> {
    pub(crate) node: Option<&'content TokenNode>,
    iterator: &'me mut TokensIterator<'content>,
    from: usize,
    to: usize,
}

impl<'content, 'me> Peeked<'content, 'me> {
    pub fn commit(&mut self) -> Option<&'content TokenNode> {
        let Peeked {
            node,
            iterator,
            from,
            to,
        } = self;

        let node = (*node)?;
        iterator.commit(*from, *to);
        Some(node)
    }

    pub fn not_eof(
        self,
        expected: impl Into<String>,
    ) -> Result<PeekedNode<'content, 'me>, ShellError> {
        match self.node {
            None => Err(ShellError::unexpected_eof(
                expected,
                self.iterator.eof_tag(),
            )),
            Some(node) => Ok(PeekedNode {
                node,
                iterator: self.iterator,
                from: self.from,
                to: self.to,
            }),
        }
    }

    pub fn type_error(&self, expected: impl Into<String>) -> ShellError {
        peek_error(&self.node, self.iterator.eof_tag(), expected)
    }
}

#[derive(Debug)]
pub struct PeekedNode<'content, 'me> {
    pub(crate) node: &'content TokenNode,
    iterator: &'me mut TokensIterator<'content>,
    from: usize,
    to: usize,
}

impl<'content, 'me> PeekedNode<'content, 'me> {
    pub fn commit(self) -> &'content TokenNode {
        let PeekedNode {
            node,
            iterator,
            from,
            to,
        } = self;

        iterator.commit(from, to);
        node
    }

    pub fn rollback(self) {}

    pub fn type_error(&self, expected: impl Into<String>) -> ShellError {
        peek_error(&Some(self.node), self.iterator.eof_tag(), expected)
    }
}

pub fn peek_error(
    node: &Option<&TokenNode>,
    eof_tag: Tag,
    expected: impl Into<String>,
) -> ShellError {
    match node {
        None => ShellError::unexpected_eof(expected, eof_tag),
        Some(node) => ShellError::type_error(expected, node.tagged_type_name()),
    }
}

impl<'content> TokensIterator<'content> {
    pub fn new(items: &'content [TokenNode], tag: Tag, skip_ws: bool) -> TokensIterator<'content> {
        TokensIterator {
            tokens: items,
            tag,
            skip_ws,
            index: 0,
            seen: indexmap::IndexSet::new(),
        }
    }

    pub fn anchor(&self) -> uuid::Uuid {
        self.tag.anchor
    }

    pub fn all(tokens: &'content [TokenNode], tag: Tag) -> TokensIterator<'content> {
        TokensIterator::new(tokens, tag, false)
    }

    pub fn len(&self) -> usize {
        self.tokens.len()
    }

    pub fn spanned<T>(
        &mut self,
        block: impl FnOnce(&mut TokensIterator<'content>) -> T,
    ) -> Tagged<T> {
        let start = self.tag_at_cursor();
        let result = block(self);
        let end = self.tag_at_cursor();

        result.tagged(start.until(end))
    }

    /// Use a checkpoint when you need to peek more than one token ahead, but can't be sure
    /// that you'll succeed.
    pub fn checkpoint<'me>(&'me mut self) -> Checkpoint<'content, 'me> {
        let index = self.index;
        let seen = self.seen.clone();

        Checkpoint {
            iterator: self,
            index,
            seen,
            committed: false,
        }
    }

    /// Run `block` against a checkpoint of the iterator. If the block fails, the
    /// checkpoint is dropped uncommitted and the iterator is rolled back; if it
    /// succeeds, the tokens it consumed are committed.
    pub fn atomic<'me, T>(
        &'me mut self,
        block: impl FnOnce(&mut TokensIterator<'content>) -> Result<T, ShellError>,
    ) -> Result<T, ShellError> {
        let index = self.index;
        let seen = self.seen.clone();

        let checkpoint = Checkpoint {
            iterator: self,
            index,
            seen,
            committed: false,
        };

        let value = block(checkpoint.iterator)?;

        checkpoint.commit();
        return Ok(value);
    }
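
    // Illustrative usage of `checkpoint`/`atomic` (the shape and context names
    // here are hypothetical, not part of this file):
    //
    //     let expr = token_nodes.atomic(|iterator| {
    //         // may consume several tokens before failing
    //         expand_expr(&AnyExpressionShape, iterator, context)
    //     })?;
    //
    // If the closure returns `Err`, the `Checkpoint` is dropped uncommitted and
    // `index`/`seen` roll back; on `Ok`, the consumed tokens stay consumed.
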
    fn eof_tag(&self) -> Tag {
        Tag::from((self.tag.span.end(), self.tag.span.end(), self.tag.anchor))
    }

    pub fn typed_tag_at_cursor(&mut self) -> Tagged<&'static str> {
        let next = self.peek_any();

        match next.node {
            None => "end".tagged(self.eof_tag()),
            Some(node) => node.tagged_type_name(),
        }
    }

    pub fn tag_at_cursor(&mut self) -> Tag {
        let next = self.peek_any();

        match next.node {
            None => self.eof_tag(),
            Some(node) => node.tag(),
        }
    }

    /// Marks the token at `position` as seen, so it will be skipped by
    /// subsequent peeking and iteration.
    pub fn remove(&mut self, position: usize) {
        self.seen.insert(position);
    }

    pub fn at_end(&self) -> bool {
        peek(self, self.skip_ws).is_none()
    }

    pub fn at_end_possible_ws(&self) -> bool {
        peek(self, true).is_none()
    }

    pub fn advance(&mut self) {
        self.seen.insert(self.index);
        self.index += 1;
    }

    /// Finds the first unseen token for which `f` returns `Some`, marks it as
    /// seen, and returns its position along with the extracted value.
    pub fn extract<T>(&mut self, f: impl Fn(&TokenNode) -> Option<T>) -> Option<(usize, T)> {
        for (i, item) in self.tokens.iter().enumerate() {
            if self.seen.contains(&i) {
                continue;
            }

            match f(item) {
                None => {
                    continue;
                }
                Some(value) => {
                    self.seen.insert(i);
                    return Some((i, value));
                }
            }
        }

        None
    }

    pub fn move_to(&mut self, pos: usize) {
        self.index = pos;
    }

    pub fn restart(&mut self) {
        self.index = 0;
    }

    pub fn clone(&self) -> TokensIterator<'content> {
        TokensIterator {
            tokens: self.tokens,
            tag: self.tag,
            index: self.index,
            seen: self.seen.clone(),
            skip_ws: self.skip_ws,
        }
    }

    // Get the next token, not including whitespace
    pub fn next_non_ws(&mut self) -> Option<&TokenNode> {
        let mut peeked = start_next(self, true);
        peeked.commit()
    }

    // Peek the next token, not including whitespace
    pub fn peek_non_ws<'me>(&'me mut self) -> Peeked<'content, 'me> {
        start_next(self, true)
    }

    // Peek the next token, including whitespace
    pub fn peek_any<'me>(&'me mut self) -> Peeked<'content, 'me> {
        start_next(self, false)
    }

    // Peek the next token, including whitespace, but not EOF
    pub fn peek_any_token<'me, T>(
        &'me mut self,
        block: impl FnOnce(&'content TokenNode) -> Result<T, ShellError>,
    ) -> Result<T, ShellError> {
        let peeked = start_next(self, false);
        let peeked = peeked.not_eof("invariant");

        match peeked {
            Err(err) => return Err(err),
            Ok(peeked) => match block(peeked.node) {
                Err(err) => return Err(err),
                Ok(val) => {
                    peeked.commit();
                    return Ok(val);
                }
            },
        }
    }

    fn commit(&mut self, from: usize, to: usize) {
        for index in from..to {
            self.seen.insert(index);
        }

        self.index = to;
    }

    pub fn pos(&self, skip_ws: bool) -> Option<usize> {
        peek_pos(self, skip_ws)
    }

    pub fn debug_remaining(&self) -> Vec<TokenNode> {
        let mut tokens = self.clone();
        tokens.restart();
        tokens.cloned().collect()
    }
}

impl<'content> Iterator for TokensIterator<'content> {
    type Item = &'content TokenNode;

    fn next(&mut self) -> Option<&'content TokenNode> {
        next(self, self.skip_ws)
    }
}

fn peek<'content, 'me>(
    iterator: &'me TokensIterator<'content>,
    skip_ws: bool,
) -> Option<&'me TokenNode> {
    let mut to = iterator.index;

    loop {
        if to >= iterator.tokens.len() {
            return None;
        }

        if iterator.seen.contains(&to) {
            to += 1;
            continue;
        }

        if to >= iterator.tokens.len() {
            return None;
        }

        let node = &iterator.tokens[to];

        match node {
            TokenNode::Whitespace(_) if skip_ws => {
                to += 1;
            }
            _ => {
                return Some(node);
            }
        }
    }
}

fn peek_pos<'content, 'me>(
    iterator: &'me TokensIterator<'content>,
    skip_ws: bool,
) -> Option<usize> {
    let mut to = iterator.index;

    loop {
        if to >= iterator.tokens.len() {
            return None;
        }

        if iterator.seen.contains(&to) {
            to += 1;
            continue;
        }

        if to >= iterator.tokens.len() {
            return None;
        }

        let node = &iterator.tokens[to];

        match node {
            TokenNode::Whitespace(_) if skip_ws => {
                to += 1;
            }
            _ => return Some(to),
        }
    }
}

fn start_next<'content, 'me>(
    iterator: &'me mut TokensIterator<'content>,
    skip_ws: bool,
) -> Peeked<'content, 'me> {
    let from = iterator.index;
    let mut to = iterator.index;

    loop {
        if to >= iterator.tokens.len() {
            return Peeked {
                node: None,
                iterator,
                from,
                to,
            };
        }

        if iterator.seen.contains(&to) {
            to += 1;
            continue;
        }

        if to >= iterator.tokens.len() {
            return Peeked {
                node: None,
                iterator,
                from,
                to,
            };
        }

        let node = &iterator.tokens[to];

        match node {
            TokenNode::Whitespace(_) if skip_ws => {
                to += 1;
            }
            _ => {
                to += 1;

                return Peeked {
                    node: Some(node),
                    iterator,
                    from,
                    to,
                };
            }
        }
    }
}

fn next<'me, 'content>(
    iterator: &'me mut TokensIterator<'content>,
    skip_ws: bool,
) -> Option<&'content TokenNode> {
    loop {
        if iterator.index >= iterator.tokens.len() {
            return None;
        }

        if iterator.seen.contains(&iterator.index) {
            iterator.advance();
            continue;
        }

        if iterator.index >= iterator.tokens.len() {
            return None;
        }

        match &iterator.tokens[iterator.index] {
            TokenNode::Whitespace(_) if skip_ws => {
                iterator.advance();
            }
            other => {
                iterator.advance();
                return Some(other);
            }
        }
    }
}