Overhaul the coloring system

This commit replaces the previous naive coloring system with one that is
more closely aligned with the parser.

The main benefit of this change is that it allows us to use parsing
rules to decide how to color tokens.

For example, consider the following syntax:

```
$ ps | where cpu > 10
```

Ideally, we could color `cpu` like a column name and not a string,
because `cpu > 10` is a shorthand block syntax that expands to
`{ $it.cpu > 10 }`.
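The expansion can be sketched as a simple textual rewrite (illustrative only, not the actual nu implementation, which works on token trees rather than strings):

```rust
// Rewrite the shorthand condition `cpu > 10` into an explicit block
// that references the implicit `$it` variable.
fn expand_shorthand_block(condition: &str) -> String {
    // `{{` and `}}` are escaped literal braces in a format string.
    format!("{{ $it.{} }}", condition)
}
```

So `expand_shorthand_block("cpu > 10")` produces `{ $it.cpu > 10 }`.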

We know that it's a shorthand block because the `where` command declares
its first parameter as a `SyntaxShape::Block`, which permits the
shorthand block form.
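A minimal sketch of what such a declaration looks like (hypothetical, simplified types; the real `Signature` and `SyntaxShape` live in nu's parser registry and have many more variants):

```rust
// Simplified stand-ins for nu's signature machinery.
#[derive(Debug, PartialEq)]
enum SyntaxShape {
    Block,
}

struct Signature {
    name: &'static str,
    positional: Vec<SyntaxShape>,
}

impl Signature {
    fn build(name: &'static str) -> Signature {
        Signature {
            name,
            positional: Vec::new(),
        }
    }

    fn required(mut self, shape: SyntaxShape) -> Signature {
        self.positional.push(shape);
        self
    }
}

// `where` declares its first positional parameter as a Block, which is
// what licenses the shorthand form `where cpu > 10`.
fn where_signature() -> Signature {
    Signature::build("where").required(SyntaxShape::Block)
}
```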

In order to accomplish this, we need to color the tokens in a way that
corresponds to their expanded semantics, which means that high-fidelity
coloring requires expansion.

This commit adds a `ColorSyntax` trait that corresponds to the
`ExpandExpression` trait. The semantics are fairly similar, with a few
differences.

First, `ExpandExpression` consumes N tokens and returns a single
`hir::Expression`, while `ColorSyntax` consumes N tokens and writes M
`FlatShape` tokens to the output.

Concretely, for syntax like `[1 2 3]`:

- `ExpandExpression` takes a single token node and produces a single
  `hir::Expression`
- `ColorSyntax` takes the same token node and emits 7 `FlatShape`s
  (open delimiter, int, whitespace, int, whitespace, int, close
  delimiter)
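The N-tokens-in, M-shapes-out contract can be illustrated with a toy colorer (not nu's real tokenizer, which walks a token tree rather than characters):

```rust
// Simplified subset of the FlatShape variants the commit introduces.
#[derive(Debug, Clone, Copy, PartialEq)]
enum FlatShape {
    OpenDelimiter,
    Int,
    Whitespace,
    CloseDelimiter,
}

// Coloring the single list node `[1 2 3]` emits one FlatShape per
// lexical atom: seven shapes for one token node.
fn color_int_list(source: &str, shapes: &mut Vec<FlatShape>) {
    for ch in source.chars() {
        shapes.push(match ch {
            '[' => FlatShape::OpenDelimiter,
            ']' => FlatShape::CloseDelimiter,
            c if c.is_whitespace() => FlatShape::Whitespace,
            // Single-digit ints only, in this sketch.
            _ => FlatShape::Int,
        });
    }
}
```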

Second, `ColorSyntax` is more willing to plow through failures than
`ExpandExpression`.

In particular, consider syntax like

```
$ ps | where cpu >
```

In this case:

- `ExpandExpression` will see that the `where` command is expecting a
  block, see that it's not a literal block and try to parse it as a
  shorthand block. It will successfully find a member followed by an
  infix operator, but not a following expression. That means that the
  entire pipeline part fails to parse and is a syntax error.
- `ColorSyntax` will also try to parse it as a shorthand block and
  ultimately fail, but it will fall back to "backoff coloring mode",
  which parses any unidentified tokens in a simple, infallible way. In
  this case, `cpu` will be colored as a string and `>` as an operator.
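The backoff behavior can be sketched as follows (hypothetical names and rules; the real implementation composes ColorSyntax rules over a tokens iterator):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum Shape {
    Member,
    Operator,
    String,
}

fn is_operator(token: &str) -> bool {
    matches!(token, ">" | "<" | "==" | "!=")
}

// Fallible rule: a shorthand block needs a member, an infix operator,
// and a trailing expression. `cpu >` alone fails this rule.
fn color_shorthand_block(tokens: &[&str]) -> Result<Vec<Shape>, ()> {
    match tokens {
        [_member, op, _expr] if is_operator(op) => {
            Ok(vec![Shape::Member, Shape::Operator, Shape::String])
        }
        _ => Err(()),
    }
}

// Backoff: when the fallible rule fails, color each leftover token in a
// simple way that cannot fail, instead of reporting a syntax error.
fn color_with_backoff(tokens: &[&str]) -> Vec<Shape> {
    color_shorthand_block(tokens).unwrap_or_else(|_| {
        tokens
            .iter()
            .map(|t| {
                if is_operator(t) {
                    Shape::Operator
                } else {
                    Shape::String
                }
            })
            .collect()
    })
}
```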

Finally, it's very important that coloring a pipeline is infallible: it
must color the entire string, never fail, and never get stuck in an
infinite loop.

In order to accomplish this, this PR separates `ColorSyntax`, which is
infallible, from `FallibleColorSyntax`, which might fail. This allows
the type system to tell us whether our coloring rules bottom out at an
infallible rule.
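A sketch of the type-level split (simplified signatures; the real traits take a `TokensIterator` and an expansion context):

```rust
// A fallible rule signals failure through Result...
trait FallibleColorSyntax {
    fn color(&self, token: &str, out: &mut Vec<&'static str>) -> Result<(), ()>;
}

// ...while an infallible rule has no failure channel at all.
trait ColorSyntax {
    fn color(&self, token: &str, out: &mut Vec<&'static str>);
}

struct IntShape;

impl FallibleColorSyntax for IntShape {
    fn color(&self, token: &str, out: &mut Vec<&'static str>) -> Result<(), ()> {
        if !token.is_empty() && token.chars().all(|c| c.is_ascii_digit()) {
            out.push("int");
            Ok(())
        } else {
            Err(())
        }
    }
}

// The fallback rule is infallible by construction, so a pipeline that
// bottoms out here colors every token and cannot spin on a failing rule.
struct FallbackShape;

impl ColorSyntax for FallbackShape {
    fn color(&self, _token: &str, out: &mut Vec<&'static str>) {
        out.push("string");
    }
}

fn color_token(token: &str, out: &mut Vec<&'static str>) {
    if IntShape.color(token, out).is_err() {
        FallbackShape.color(token, out);
    }
}
```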

It's not perfect: it's still possible for the coloring process to get
stuck or consume tokens non-atomically. I intend to reduce the
opportunity for those problems in a future commit. In the meantime, the
current system catches a number of mistakes (like trying to use a
fallible coloring rule in a loop without thinking about the possibility
that it will never terminate).
Yehuda Katz 2019-10-06 13:22:50 -07:00
parent 1ad9d6f199
commit c2c10e2bc0
50 changed files with 3527 additions and 845 deletions


@ -96,7 +96,7 @@ textview = ["syntect", "onig_sys", "crossterm"]
binaryview = ["image", "crossterm"]
sys = ["heim", "battery"]
ps = ["heim"]
trace = ["nom-tracable/trace"]
# trace = ["nom-tracable/trace"]
all = ["raw-key", "textview", "binaryview", "sys", "ps", "clipboard", "ptree"]
[dependencies.rusqlite]


@ -14,9 +14,9 @@ use crate::git::current_branch;
use crate::parser::registry::Signature;
use crate::parser::{
hir,
hir::syntax_shape::{CommandHeadShape, CommandSignature, ExpandSyntax},
hir::syntax_shape::{expand_syntax, PipelineShape},
hir::{expand_external_tokens::expand_external_tokens, tokens_iterator::TokensIterator},
parse_command_tail, Pipeline, PipelineElement, TokenNode,
TokenNode,
};
use crate::prelude::*;
@ -99,11 +99,17 @@ fn load_plugin(path: &std::path::Path, context: &mut Context) -> Result<(), Shel
},
Err(e) => {
trace!("incompatible plugin {:?}", input);
Err(ShellError::string(format!("Error: {:?}", e)))
Err(ShellError::untagged_runtime_error(format!(
"Error: {:?}",
e
)))
}
}
}
Err(e) => Err(ShellError::string(format!("Error: {:?}", e))),
Err(e) => Err(ShellError::untagged_runtime_error(format!(
"Error: {:?}",
e
))),
};
let _ = child.wait();
@ -319,6 +325,7 @@ pub async fn cli() -> Result<(), Box<dyn Error>> {
)]);
}
}
let _ = load_plugins(&mut context);
let config = Config::builder().color_mode(ColorMode::Forced).build();
@ -347,9 +354,7 @@ pub async fn cli() -> Result<(), Box<dyn Error>> {
let cwd = context.shell_manager.path();
rl.set_helper(Some(crate::shell::Helper::new(
context.shell_manager.clone(),
)));
rl.set_helper(Some(crate::shell::Helper::new(context.clone())));
let edit_mode = config::config(Tag::unknown())?
.get("edit_mode")
@ -476,7 +481,7 @@ async fn process_line(readline: Result<String, ReadlineError>, ctx: &mut Context
Ok(line) => {
let line = chomp_newline(line);
let result = match crate::parser::parse(&line, uuid::Uuid::new_v4()) {
let result = match crate::parser::parse(&line, uuid::Uuid::nil()) {
Err(err) => {
return LineResult::Error(line.to_string(), err);
}
@ -614,74 +619,14 @@ fn classify_pipeline(
context: &Context,
source: &Text,
) -> Result<ClassifiedPipeline, ShellError> {
let pipeline = pipeline.as_pipeline()?;
let mut pipeline_list = vec![pipeline.clone()];
let mut iterator = TokensIterator::all(&mut pipeline_list, pipeline.tag());
let Pipeline { parts, .. } = pipeline;
let commands: Result<Vec<_>, ShellError> = parts
.iter()
.map(|item| classify_command(&item, context, &source))
.collect();
Ok(ClassifiedPipeline {
commands: commands?,
})
}
fn classify_command(
command: &Tagged<PipelineElement>,
context: &Context,
source: &Text,
) -> Result<ClassifiedCommand, ShellError> {
let mut iterator = TokensIterator::new(&command.tokens.item, command.tag, true);
let head = CommandHeadShape
.expand_syntax(&mut iterator, &context.expand_context(source, command.tag))?;
match &head {
CommandSignature::Expression(_) => Err(ShellError::syntax_error(
"Unexpected expression in command position".tagged(command.tag),
)),
// If the command starts with `^`, treat it as an external command no matter what
CommandSignature::External(name) => {
let name_str = name.slice(source);
external_command(&mut iterator, source, name_str.tagged(name))
}
CommandSignature::LiteralExternal { outer, inner } => {
let name_str = inner.slice(source);
external_command(&mut iterator, source, name_str.tagged(outer))
}
CommandSignature::Internal(command) => {
let tail = parse_command_tail(
&command.signature(),
&context.expand_context(source, command.tag),
expand_syntax(
&PipelineShape,
&mut iterator,
command.tag,
)?;
let (positional, named) = match tail {
None => (None, None),
Some((positional, named)) => (positional, named),
};
let call = hir::Call {
head: Box::new(head.to_expression()),
positional,
named,
};
Ok(ClassifiedCommand::Internal(InternalCommand::new(
command.name().to_string(),
command.tag,
call,
)))
}
}
&context.expand_context(source, pipeline.tag()),
)
}
// Classify this command as an external command, which doesn't give special meaning


@ -58,21 +58,21 @@ pub fn autoview(
}
}
};
// } else if is_single_origined_text_value(&input) {
// let text = context.get_command("textview");
// if let Some(text) = text {
// let result = text.run(raw.with_input(input), &context.commands);
// result.collect::<Vec<_>>().await;
// } else {
// for i in input {
// match i.item {
// Value::Primitive(Primitive::String(s)) => {
// println!("{}", s);
// }
// _ => {}
// }
// }
// }
} else if is_single_anchored_text_value(&input) {
let text = context.get_command("textview");
if let Some(text) = text {
let result = text.run(raw.with_input(input), &context.commands, false);
result.collect::<Vec<_>>().await;
} else {
for i in input {
match i.item {
Value::Primitive(Primitive::String(s)) => {
println!("{}", s);
}
_ => {}
}
}
}
} else if is_single_text_value(&input) {
for i in input {
match i.item {
@ -112,7 +112,7 @@ fn is_single_text_value(input: &Vec<Tagged<Value>>) -> bool {
}
#[allow(unused)]
fn is_single_origined_text_value(input: &Vec<Tagged<Value>>) -> bool {
fn is_single_anchored_text_value(input: &Vec<Tagged<Value>>) -> bool {
if input.len() != 1 {
return false;
}


@ -72,6 +72,7 @@ impl ClassifiedInputStream {
}
}
#[derive(Debug)]
pub(crate) struct ClassifiedPipeline {
pub(crate) commands: Vec<ClassifiedCommand>,
}
@ -117,15 +118,19 @@ impl InternalCommand {
let command = context.expect_command(&self.name);
let result = context.run_command(
let result = {
let source_map = context.source_map.lock().unwrap().clone();
context.run_command(
command,
self.name_tag.clone(),
context.source_map.clone(),
source_map,
self.args,
&source,
objects,
is_first_command,
);
)
};
let result = trace_out_stream!(target: "nu::trace_stream::internal", source: &source, "output" = result);
let mut result = result.values;
@ -253,7 +258,11 @@ impl ExternalCommand {
tag,
));
} else {
return Err(ShellError::string("Error: $it needs string data"));
return Err(ShellError::labeled_error(
"Error: $it needs string data",
"given something else",
name_tag,
));
}
}
if !first {


@ -70,9 +70,9 @@ pub fn config(
if let Some(v) = get {
let key = v.to_string();
let value = result
.get(&key)
.ok_or_else(|| ShellError::string(&format!("Missing key {} in config", key)))?;
let value = result.get(&key).ok_or_else(|| {
ShellError::labeled_error(&format!("Missing key in config"), "key", v.tag())
})?;
let mut results = VecDeque::new();
@ -120,10 +120,11 @@ pub fn config(
result.swap_remove(&key);
config::write(&result, &configuration)?;
} else {
return Err(ShellError::string(&format!(
return Err(ShellError::labeled_error(
"{} does not exist in config",
key
)));
"key",
v.tag(),
));
}
let obj = VecDeque::from_iter(vec![Value::Row(result.into()).tagged(v.tag())]);


@ -44,11 +44,13 @@ fn run(
registry: &CommandRegistry,
raw_args: &RawCommandArgs,
) -> Result<OutputStream, ShellError> {
let path = match call_info
.args
.nth(0)
.ok_or_else(|| ShellError::string(&format!("No file or directory specified")))?
{
let path = match call_info.args.nth(0).ok_or_else(|| {
ShellError::labeled_error(
"No file or directory specified",
"for command",
call_info.name_tag,
)
})? {
file => file,
};
let path_buf = path.as_path()?;


@ -45,11 +45,13 @@ fn run(
let cwd = PathBuf::from(shell_manager.path());
let full_path = PathBuf::from(cwd);
let path = match call_info
.args
.nth(0)
.ok_or_else(|| ShellError::string(&format!("No file or directory specified")))?
{
let path = match call_info.args.nth(0).ok_or_else(|| {
ShellError::labeled_error(
"No file or directory specified",
"for command",
call_info.name_tag,
)
})? {
file => file,
};
let path_buf = path.as_path()?;


@ -128,7 +128,7 @@ pub fn filter_plugin(
},
Err(e) => {
let mut result = VecDeque::new();
result.push_back(Err(ShellError::string(format!(
result.push_back(Err(ShellError::untagged_runtime_error(format!(
"Error while processing begin_filter response: {:?} {}",
e, input
))));
@ -138,7 +138,7 @@ pub fn filter_plugin(
}
Err(e) => {
let mut result = VecDeque::new();
result.push_back(Err(ShellError::string(format!(
result.push_back(Err(ShellError::untagged_runtime_error(format!(
"Error while reading begin_filter response: {:?}",
e
))));
@ -189,7 +189,7 @@ pub fn filter_plugin(
},
Err(e) => {
let mut result = VecDeque::new();
result.push_back(Err(ShellError::string(format!(
result.push_back(Err(ShellError::untagged_runtime_error(format!(
"Error while processing end_filter response: {:?} {}",
e, input
))));
@ -199,7 +199,7 @@ pub fn filter_plugin(
}
Err(e) => {
let mut result = VecDeque::new();
result.push_back(Err(ShellError::string(format!(
result.push_back(Err(ShellError::untagged_runtime_error(format!(
"Error while reading end_filter: {:?}",
e
))));
@ -236,7 +236,7 @@ pub fn filter_plugin(
},
Err(e) => {
let mut result = VecDeque::new();
result.push_back(Err(ShellError::string(format!(
result.push_back(Err(ShellError::untagged_runtime_error(format!(
"Error while processing filter response: {:?} {}",
e, input
))));
@ -246,7 +246,7 @@ pub fn filter_plugin(
}
Err(e) => {
let mut result = VecDeque::new();
result.push_back(Err(ShellError::string(format!(
result.push_back(Err(ShellError::untagged_runtime_error(format!(
"Error while reading filter response: {:?}",
e
))));


@ -55,18 +55,14 @@ fn run(
raw_args: &RawCommandArgs,
) -> Result<OutputStream, ShellError> {
let call_info = call_info.clone();
let path = match call_info
.args
.nth(0)
.ok_or_else(|| ShellError::string(&format!("No url specified")))?
{
let path = match call_info.args.nth(0).ok_or_else(|| {
ShellError::labeled_error("No url specified", "for command", call_info.name_tag)
})? {
file => file.clone(),
};
let body = match call_info
.args
.nth(1)
.ok_or_else(|| ShellError::string(&format!("No body specified")))?
{
let body = match call_info.args.nth(1).ok_or_else(|| {
ShellError::labeled_error("No body specified", "for command", call_info.name_tag)
})? {
file => file.clone(),
};
let path_str = path.as_string()?;


@ -150,7 +150,6 @@ fn save(
}
},
None => {
eprintln!("{:?} {:?}", anchor, source_map);
yield Err(ShellError::labeled_error(
"Save requires a filepath (2)",
"needs path",
@ -213,9 +212,9 @@ fn save(
match content {
Ok(save_data) => match std::fs::write(full_path, save_data) {
Ok(o) => o,
Err(e) => yield Err(ShellError::string(e.to_string())),
Err(e) => yield Err(ShellError::labeled_error(e.to_string(), "for command", name)),
},
Err(e) => yield Err(ShellError::string(e.to_string())),
Err(e) => yield Err(ShellError::labeled_error(e.to_string(), "for command", name)),
}
};


@ -32,8 +32,8 @@ impl WholeStreamCommand for ToCSV {
}
}
pub fn value_to_csv_value(v: &Value) -> Value {
match v {
pub fn value_to_csv_value(v: &Tagged<Value>) -> Tagged<Value> {
match &v.item {
Value::Primitive(Primitive::String(s)) => Value::Primitive(Primitive::String(s.clone())),
Value::Primitive(Primitive::Nothing) => Value::Primitive(Primitive::Nothing),
Value::Primitive(Primitive::Boolean(b)) => Value::Primitive(Primitive::Boolean(b.clone())),
@ -47,10 +47,11 @@ pub fn value_to_csv_value(v: &Value) -> Value {
Value::Block(_) => Value::Primitive(Primitive::Nothing),
_ => Value::Primitive(Primitive::Nothing),
}
.tagged(v.tag)
}
fn to_string_helper(v: &Value) -> Result<String, ShellError> {
match v {
fn to_string_helper(v: &Tagged<Value>) -> Result<String, ShellError> {
match &v.item {
Value::Primitive(Primitive::Date(d)) => Ok(d.to_string()),
Value::Primitive(Primitive::Bytes(b)) => Ok(format!("{}", b)),
Value::Primitive(Primitive::Boolean(_)) => Ok(v.as_string()?),
@ -60,7 +61,7 @@ fn to_string_helper(v: &Value) -> Result<String, ShellError> {
Value::Table(_) => return Ok(String::from("[Table]")),
Value::Row(_) => return Ok(String::from("[Row]")),
Value::Primitive(Primitive::String(s)) => return Ok(s.to_string()),
_ => return Err(ShellError::string("Unexpected value")),
_ => return Err(ShellError::labeled_error("Unexpected value", "", v.tag)),
}
}
@ -76,7 +77,9 @@ fn merge_descriptors(values: &[Tagged<Value>]) -> Vec<String> {
ret
}
pub fn to_string(v: &Value) -> Result<String, ShellError> {
pub fn to_string(tagged_value: &Tagged<Value>) -> Result<String, ShellError> {
let v = &tagged_value.item;
match v {
Value::Row(o) => {
let mut wtr = WriterBuilder::new().from_writer(vec![]);
@ -92,11 +95,20 @@ pub fn to_string(v: &Value) -> Result<String, ShellError> {
wtr.write_record(fields).expect("can not write.");
wtr.write_record(values).expect("can not write.");
return Ok(String::from_utf8(
wtr.into_inner()
.map_err(|_| ShellError::string("Could not convert record"))?,
return Ok(String::from_utf8(wtr.into_inner().map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
.map_err(|_| ShellError::string("Could not convert record"))?);
})?)
.map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
})?);
}
Value::Table(list) => {
let mut wtr = WriterBuilder::new().from_writer(vec![]);
@ -120,13 +132,22 @@ pub fn to_string(v: &Value) -> Result<String, ShellError> {
wtr.write_record(&row).expect("can not write");
}
return Ok(String::from_utf8(
wtr.into_inner()
.map_err(|_| ShellError::string("Could not convert record"))?,
return Ok(String::from_utf8(wtr.into_inner().map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
.map_err(|_| ShellError::string("Could not convert record"))?);
})?)
.map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
})?);
}
_ => return to_string_helper(&v),
_ => return to_string_helper(tagged_value),
}
}
@ -148,7 +169,7 @@ fn to_csv(
};
for value in to_process_input {
match to_string(&value_to_csv_value(&value.item)) {
match to_string(&value_to_csv_value(&value)) {
Ok(x) => {
let converted = if headerless {
x.lines().skip(1).collect()


@ -32,7 +32,9 @@ impl WholeStreamCommand for ToTSV {
}
}
pub fn value_to_tsv_value(v: &Value) -> Value {
pub fn value_to_tsv_value(tagged_value: &Tagged<Value>) -> Tagged<Value> {
let v = &tagged_value.item;
match v {
Value::Primitive(Primitive::String(s)) => Value::Primitive(Primitive::String(s.clone())),
Value::Primitive(Primitive::Nothing) => Value::Primitive(Primitive::Nothing),
@ -47,20 +49,28 @@ pub fn value_to_tsv_value(v: &Value) -> Value {
Value::Block(_) => Value::Primitive(Primitive::Nothing),
_ => Value::Primitive(Primitive::Nothing),
}
.tagged(tagged_value.tag)
}
fn to_string_helper(v: &Value) -> Result<String, ShellError> {
fn to_string_helper(tagged_value: &Tagged<Value>) -> Result<String, ShellError> {
let v = &tagged_value.item;
match v {
Value::Primitive(Primitive::Date(d)) => Ok(d.to_string()),
Value::Primitive(Primitive::Bytes(b)) => Ok(format!("{}", b)),
Value::Primitive(Primitive::Boolean(_)) => Ok(v.as_string()?),
Value::Primitive(Primitive::Decimal(_)) => Ok(v.as_string()?),
Value::Primitive(Primitive::Int(_)) => Ok(v.as_string()?),
Value::Primitive(Primitive::Path(_)) => Ok(v.as_string()?),
Value::Primitive(Primitive::Boolean(_)) => Ok(tagged_value.as_string()?),
Value::Primitive(Primitive::Decimal(_)) => Ok(tagged_value.as_string()?),
Value::Primitive(Primitive::Int(_)) => Ok(tagged_value.as_string()?),
Value::Primitive(Primitive::Path(_)) => Ok(tagged_value.as_string()?),
Value::Table(_) => return Ok(String::from("[table]")),
Value::Row(_) => return Ok(String::from("[row]")),
Value::Primitive(Primitive::String(s)) => return Ok(s.to_string()),
_ => return Err(ShellError::string("Unexpected value")),
_ => {
return Err(ShellError::labeled_error(
"Unexpected value",
"original value",
tagged_value.tag,
))
}
}
}
@ -76,7 +86,9 @@ fn merge_descriptors(values: &[Tagged<Value>]) -> Vec<String> {
ret
}
pub fn to_string(v: &Value) -> Result<String, ShellError> {
pub fn to_string(tagged_value: &Tagged<Value>) -> Result<String, ShellError> {
let v = &tagged_value.item;
match v {
Value::Row(o) => {
let mut wtr = WriterBuilder::new().delimiter(b'\t').from_writer(vec![]);
@ -91,11 +103,20 @@ pub fn to_string(v: &Value) -> Result<String, ShellError> {
wtr.write_record(fields).expect("can not write.");
wtr.write_record(values).expect("can not write.");
return Ok(String::from_utf8(
wtr.into_inner()
.map_err(|_| ShellError::string("Could not convert record"))?,
return Ok(String::from_utf8(wtr.into_inner().map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
.map_err(|_| ShellError::string("Could not convert record"))?);
})?)
.map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
})?);
}
Value::Table(list) => {
let mut wtr = WriterBuilder::new().delimiter(b'\t').from_writer(vec![]);
@ -119,13 +140,22 @@ pub fn to_string(v: &Value) -> Result<String, ShellError> {
wtr.write_record(&row).expect("can not write");
}
return Ok(String::from_utf8(
wtr.into_inner()
.map_err(|_| ShellError::string("Could not convert record"))?,
return Ok(String::from_utf8(wtr.into_inner().map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
.map_err(|_| ShellError::string("Could not convert record"))?);
})?)
.map_err(|_| {
ShellError::labeled_error(
"Could not convert record",
"original value",
tagged_value.tag,
)
})?);
}
_ => return to_string_helper(&v),
_ => return to_string_helper(tagged_value),
}
}
@ -147,7 +177,7 @@ fn to_tsv(
};
for value in to_process_input {
match to_string(&value_to_tsv_value(&value.item)) {
match to_string(&value_to_tsv_value(&value)) {
Ok(x) => {
let converted = if headerless {
x.lines().skip(1).collect()


@ -7,7 +7,7 @@ use indexmap::IndexMap;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::error::Error;
use std::sync::Arc;
use std::sync::{Arc, Mutex};
use uuid::Uuid;
#[derive(Clone, Debug, Serialize, Deserialize)]
@ -77,7 +77,7 @@ impl CommandRegistry {
#[derive(Clone)]
pub struct Context {
registry: CommandRegistry,
pub(crate) source_map: SourceMap,
pub(crate) source_map: Arc<Mutex<SourceMap>>,
host: Arc<Mutex<dyn Host + Send>>,
pub(crate) shell_manager: ShellManager,
}
@ -99,7 +99,7 @@ impl Context {
let registry = CommandRegistry::new();
Ok(Context {
registry: registry.clone(),
source_map: SourceMap::new(),
source_map: Arc::new(Mutex::new(SourceMap::new())),
host: Arc::new(Mutex::new(crate::env::host::BasicHost)),
shell_manager: ShellManager::basic(registry)?,
})
@ -118,7 +118,9 @@ impl Context {
}
pub fn add_anchor_location(&mut self, uuid: Uuid, anchor_location: AnchorLocation) {
self.source_map.insert(uuid, anchor_location);
let mut source_map = self.source_map.lock().unwrap();
source_map.insert(uuid, anchor_location);
}
pub(crate) fn get_command(&self, name: &str) -> Option<Arc<Command>> {


@ -298,7 +298,7 @@ impl fmt::Debug for ValueDebug<'_> {
}
impl Tagged<Value> {
pub(crate) fn tagged_type_name(&self) -> Tagged<String> {
pub fn tagged_type_name(&self) -> Tagged<String> {
let name = self.type_name();
Tagged::from_item(name, self.tag())
}
@ -424,10 +424,27 @@ impl Tagged<Value> {
Ok(out.tagged(self.tag))
}
pub(crate) fn as_string(&self) -> Result<String, ShellError> {
match &self.item {
Value::Primitive(Primitive::String(s)) => Ok(s.clone()),
Value::Primitive(Primitive::Boolean(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Decimal(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Int(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Bytes(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Path(x)) => Ok(format!("{}", x.display())),
// TODO: this should definitely be more general with better errors
other => Err(ShellError::labeled_error(
"Expected string",
other.type_name(),
self.tag,
)),
}
}
}
impl Value {
pub(crate) fn type_name(&self) -> String {
pub fn type_name(&self) -> String {
match self {
Value::Primitive(p) => p.type_name(),
Value::Row(_) => format!("row"),
@ -738,22 +755,6 @@ impl Value {
}
}
pub(crate) fn as_string(&self) -> Result<String, ShellError> {
match self {
Value::Primitive(Primitive::String(s)) => Ok(s.clone()),
Value::Primitive(Primitive::Boolean(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Decimal(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Int(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Bytes(x)) => Ok(format!("{}", x)),
Value::Primitive(Primitive::Path(x)) => Ok(format!("{}", x.display())),
// TODO: this should definitely be more general with better errors
other => Err(ShellError::string(format!(
"Expected string, got {:?}",
other
))),
}
}
pub(crate) fn is_true(&self) -> bool {
match self {
Value::Primitive(Primitive::Boolean(true)) => true,
@ -806,9 +807,14 @@ impl Value {
Value::Primitive(Primitive::Date(s.into()))
}
pub fn date_from_str(s: &str) -> Result<Value, ShellError> {
let date = DateTime::parse_from_rfc3339(s)
.map_err(|err| ShellError::string(&format!("Date parse error: {}", err)))?;
pub fn date_from_str(s: Tagged<&str>) -> Result<Value, ShellError> {
let date = DateTime::parse_from_rfc3339(s.item).map_err(|err| {
ShellError::labeled_error(
&format!("Date parse error: {}", err),
"original value",
s.tag,
)
})?;
let date = date.with_timezone(&chrono::offset::Utc);


@ -51,8 +51,9 @@ pub fn user_data() -> Result<PathBuf, ShellError> {
}
pub fn app_path(app_data_type: AppDataType, display: &str) -> Result<PathBuf, ShellError> {
let path = app_root(app_data_type, &APP_INFO)
.map_err(|err| ShellError::string(&format!("Couldn't open {} path:\n{}", display, err)))?;
let path = app_root(app_data_type, &APP_INFO).map_err(|err| {
ShellError::untagged_runtime_error(&format!("Couldn't open {} path:\n{}", display, err))
})?;
Ok(path)
}
@ -75,10 +76,21 @@ pub fn read(
let tag = tag.into();
let contents = fs::read_to_string(filename)
.map(|v| v.tagged(tag))
.map_err(|err| ShellError::string(&format!("Couldn't read config file:\n{}", err)))?;
.map_err(|err| {
ShellError::labeled_error(
&format!("Couldn't read config file:\n{}", err),
"file name",
tag,
)
})?;
let parsed: toml::Value = toml::from_str(&contents)
.map_err(|err| ShellError::string(&format!("Couldn't parse config file:\n{}", err)))?;
let parsed: toml::Value = toml::from_str(&contents).map_err(|err| {
ShellError::labeled_error(
&format!("Couldn't parse config file:\n{}", err),
"file name",
tag,
)
})?;
let value = convert_toml_value_to_nu_value(&parsed, tag);
let tag = value.tag();


@ -240,6 +240,16 @@ impl Tag {
}
}
pub fn for_char(pos: usize, anchor: Uuid) -> Tag {
Tag {
anchor,
span: Span {
start: pos,
end: pos + 1,
},
}
}
pub fn unknown_span(anchor: Uuid) -> Tag {
Tag {
anchor,
@ -267,6 +277,24 @@ impl Tag {
}
}
pub fn until_option(&self, other: Option<impl Into<Tag>>) -> Tag {
match other {
Some(other) => {
let other = other.into();
debug_assert!(
self.anchor == other.anchor,
"Can only merge two tags with the same anchor"
);
Tag {
span: Span::new(self.span.start, other.span.end),
anchor: self.anchor,
}
}
None => *self,
}
}
pub fn slice<'a>(&self, source: &'a str) -> &'a str {
self.span.slice(source)
}
@ -284,6 +312,7 @@ impl Tag {
}
}
#[allow(unused)]
pub fn tag_for_tagged_list(mut iter: impl Iterator<Item = Tag>) -> Tag {
let first = iter.next();


@ -20,6 +20,14 @@ impl Description {
Description::Synthetic(s) => Err(s),
}
}
#[allow(unused)]
fn tag(&self) -> Tag {
match self {
Description::Source(tagged) => tagged.tag,
Description::Synthetic(_) => Tag::unknown(),
}
}
}
#[derive(Debug, Eq, PartialEq, Clone, Ord, PartialOrd, Serialize, Deserialize)]
@ -36,6 +44,13 @@ pub struct ShellError {
cause: Option<Box<ProximateShellError>>,
}
impl ShellError {
#[allow(unused)]
pub(crate) fn tag(&self) -> Option<Tag> {
self.error.tag()
}
}
impl ToDebug for ShellError {
fn fmt_debug(&self, f: &mut fmt::Formatter, source: &str) -> fmt::Result {
self.error.fmt_debug(f, source)
@ -47,12 +62,12 @@ impl serde::de::Error for ShellError {
where
T: std::fmt::Display,
{
ShellError::string(msg.to_string())
ShellError::untagged_runtime_error(msg.to_string())
}
}
impl ShellError {
pub(crate) fn type_error(
pub fn type_error(
expected: impl Into<String>,
actual: Tagged<impl Into<String>>,
) -> ShellError {
@ -63,6 +78,13 @@ impl ShellError {
.start()
}
pub fn untagged_runtime_error(error: impl Into<String>) -> ShellError {
ProximateShellError::UntaggedRuntimeError {
reason: error.into(),
}
.start()
}
pub(crate) fn unexpected_eof(expected: impl Into<String>, tag: Tag) -> ShellError {
ProximateShellError::UnexpectedEof {
expected: expected.into(),
@ -174,9 +196,6 @@ impl ShellError {
pub(crate) fn to_diagnostic(self) -> Diagnostic<Tag> {
match self.error {
ProximateShellError::String(StringError { title, .. }) => {
Diagnostic::new(Severity::Error, title)
}
ProximateShellError::InvalidCommand { command } => {
Diagnostic::new(Severity::Error, "Invalid command")
.with_label(Label::new_primary(command))
@ -286,7 +305,7 @@ impl ShellError {
} => Diagnostic::new(Severity::Error, "Syntax Error")
.with_label(Label::new_primary(tag).with_message(item)),
ProximateShellError::MissingProperty { subpath, expr } => {
ProximateShellError::MissingProperty { subpath, expr, .. } => {
let subpath = subpath.into_label();
let expr = expr.into_label();
@ -310,6 +329,8 @@ impl ShellError {
.with_label(Label::new_primary(left.tag()).with_message(left.item))
.with_label(Label::new_secondary(right.tag()).with_message(right.item))
}
ProximateShellError::UntaggedRuntimeError { reason } => Diagnostic::new(Severity::Error, format!("Error: {}", reason))
}
}
@ -343,20 +364,16 @@ impl ShellError {
)
}
pub fn string(title: impl Into<String>) -> ShellError {
ProximateShellError::String(StringError::new(title.into(), Value::nothing())).start()
}
pub(crate) fn unimplemented(title: impl Into<String>) -> ShellError {
ShellError::string(&format!("Unimplemented: {}", title.into()))
ShellError::untagged_runtime_error(&format!("Unimplemented: {}", title.into()))
}
pub(crate) fn unexpected(title: impl Into<String>) -> ShellError {
ShellError::string(&format!("Unexpected: {}", title.into()))
ShellError::untagged_runtime_error(&format!("Unexpected: {}", title.into()))
}
pub(crate) fn unreachable(title: impl Into<String>) -> ShellError {
ShellError::string(&format!("BUG: Unreachable: {}", title.into()))
ShellError::untagged_runtime_error(&format!("BUG: Unreachable: {}", title.into()))
}
}
@ -401,7 +418,6 @@ impl ExpectedRange {
#[derive(Debug, Eq, PartialEq, Clone, Ord, PartialOrd, Serialize, Deserialize)]
pub enum ProximateShellError {
String(StringError),
SyntaxError {
problem: Tagged<String>,
},
@ -419,6 +435,7 @@ pub enum ProximateShellError {
MissingProperty {
subpath: Description,
expr: Description,
tag: Tag,
},
MissingValue {
tag: Option<Tag>,
@ -439,6 +456,9 @@ pub enum ProximateShellError {
left: Tagged<String>,
right: Tagged<String>,
},
UntaggedRuntimeError {
reason: String,
},
}
impl ProximateShellError {
@ -448,6 +468,22 @@ impl ProximateShellError {
error: self,
}
}
pub(crate) fn tag(&self) -> Option<Tag> {
Some(match self {
ProximateShellError::SyntaxError { problem } => problem.tag(),
ProximateShellError::UnexpectedEof { tag, .. } => *tag,
ProximateShellError::InvalidCommand { command } => *command,
ProximateShellError::TypeError { actual, .. } => actual.tag,
ProximateShellError::MissingProperty { tag, .. } => *tag,
ProximateShellError::MissingValue { tag, .. } => return *tag,
ProximateShellError::ArgumentError { tag, .. } => *tag,
ProximateShellError::RangeError { actual_kind, .. } => actual_kind.tag,
ProximateShellError::Diagnostic(..) => return None,
ProximateShellError::UntaggedRuntimeError { .. } => return None,
ProximateShellError::CoerceError { left, right } => left.tag.until(right.tag),
})
}
}
impl ToDebug for ProximateShellError {
@ -491,7 +527,6 @@ pub struct StringError {
impl std::fmt::Display for ShellError {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match &self.error {
ProximateShellError::String(s) => write!(f, "{}", &s.title),
ProximateShellError::MissingValue { .. } => write!(f, "MissingValue"),
ProximateShellError::InvalidCommand { .. } => write!(f, "InvalidCommand"),
ProximateShellError::TypeError { .. } => write!(f, "TypeError"),
@ -502,6 +537,7 @@ impl std::fmt::Display for ShellError {
ProximateShellError::ArgumentError { .. } => write!(f, "ArgumentError"),
ProximateShellError::Diagnostic(_) => write!(f, "<diagnostic>"),
ProximateShellError::CoerceError { .. } => write!(f, "CoerceError"),
ProximateShellError::UntaggedRuntimeError { .. } => write!(f, "UntaggedRuntimeError"),
}
}
}
@ -510,71 +546,43 @@ impl std::error::Error for ShellError {}
impl std::convert::From<Box<dyn std::error::Error>> for ShellError {
fn from(input: Box<dyn std::error::Error>) -> ShellError {
ProximateShellError::String(StringError {
title: format!("{}", input),
error: Value::nothing(),
})
.start()
ShellError::untagged_runtime_error(format!("{}", input))
}
}
impl std::convert::From<std::io::Error> for ShellError {
fn from(input: std::io::Error) -> ShellError {
ProximateShellError::String(StringError {
title: format!("{}", input),
error: Value::nothing(),
})
.start()
ShellError::untagged_runtime_error(format!("{}", input))
}
}
impl std::convert::From<subprocess::PopenError> for ShellError {
fn from(input: subprocess::PopenError) -> ShellError {
ProximateShellError::String(StringError {
title: format!("{}", input),
error: Value::nothing(),
})
.start()
ShellError::untagged_runtime_error(format!("{}", input))
}
}
impl std::convert::From<serde_yaml::Error> for ShellError {
fn from(input: serde_yaml::Error) -> ShellError {
ProximateShellError::String(StringError {
title: format!("{:?}", input),
error: Value::nothing(),
})
.start()
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}
impl std::convert::From<toml::ser::Error> for ShellError {
fn from(input: toml::ser::Error) -> ShellError {
ProximateShellError::String(StringError {
title: format!("{:?}", input),
error: Value::nothing(),
})
.start()
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}
impl std::convert::From<serde_json::Error> for ShellError {
fn from(input: serde_json::Error) -> ShellError {
ProximateShellError::String(StringError {
title: format!("{:?}", input),
error: Value::nothing(),
})
.start()
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}
impl std::convert::From<Box<dyn std::error::Error + Send + Sync>> for ShellError {
fn from(input: Box<dyn std::error::Error + Send + Sync>) -> ShellError {
ProximateShellError::String(StringError {
title: format!("{:?}", input),
error: Value::nothing(),
})
.start()
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}


@ -7,18 +7,18 @@ pub(crate) mod registry;
use crate::errors::ShellError;
pub(crate) use deserializer::ConfigDeserializer;
pub(crate) use hir::syntax_shape::flat_shape::FlatShape;
pub(crate) use hir::TokensIterator;
pub(crate) use parse::call_node::CallNode;
pub(crate) use parse::files::Files;
pub(crate) use parse::flag::Flag;
pub(crate) use parse::flag::{Flag, FlagKind};
pub(crate) use parse::operator::Operator;
pub(crate) use parse::parser::{nom_input, pipeline};
pub(crate) use parse::pipeline::{Pipeline, PipelineElement};
pub(crate) use parse::text::Text;
pub(crate) use parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
pub(crate) use parse::tokens::{RawToken, Token};
pub(crate) use parse::tokens::{RawNumber, RawToken};
pub(crate) use parse::unit::Unit;
pub(crate) use parse_command::parse_command_tail;
pub(crate) use registry::CommandRegistry;
pub fn parse(input: &str, anchor: uuid::Uuid) -> Result<TokenNode, ShellError> {


@ -1,5 +1,11 @@
use crate::errors::ShellError;
use crate::parser::{TokenNode, TokensIterator};
use crate::parser::{
hir::syntax_shape::{
color_syntax, expand_atom, AtomicToken, ColorSyntax, ExpandContext, ExpansionRule,
MaybeSpaceShape,
},
FlatShape, TokenNode, TokensIterator,
};
use crate::{Tag, Tagged, Text};
pub fn expand_external_tokens(
@ -19,6 +25,34 @@ pub fn expand_external_tokens(
Ok(out)
}
#[derive(Debug, Copy, Clone)]
pub struct ExternalTokensShape;
impl ColorSyntax for ExternalTokensShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Self::Info {
loop {
// Allow a space
color_syntax(&MaybeSpaceShape, token_nodes, context, shapes);
// Process an external expression. External expressions are mostly words, with a
// few exceptions (like $variables and path expansion rules)
match color_syntax(&ExternalExpression, token_nodes, context, shapes).1 {
ExternalExpressionResult::Eof => break,
ExternalExpressionResult::Processed => continue,
}
}
}
}
pub fn expand_next_expression(
token_nodes: &mut TokensIterator<'_>,
) -> Result<Option<Tag>, ShellError> {
@ -48,16 +82,15 @@ pub fn expand_next_expression(
fn triage_external_head(node: &TokenNode) -> Result<Tag, ShellError> {
Ok(match node {
TokenNode::Token(token) => token.tag(),
TokenNode::Call(_call) => unimplemented!(),
TokenNode::Nodes(_nodes) => unimplemented!(),
TokenNode::Delimited(_delimited) => unimplemented!(),
TokenNode::Pipeline(_pipeline) => unimplemented!(),
TokenNode::Call(_call) => unimplemented!("TODO: OMG"),
TokenNode::Nodes(_nodes) => unimplemented!("TODO: OMG"),
TokenNode::Delimited(_delimited) => unimplemented!("TODO: OMG"),
TokenNode::Pipeline(_pipeline) => unimplemented!("TODO: OMG"),
TokenNode::Flag(flag) => flag.tag(),
TokenNode::Member(member) => *member,
TokenNode::Whitespace(_whitespace) => {
unreachable!("This function should be called after next_non_ws()")
}
TokenNode::Error(_error) => unimplemented!(),
TokenNode::Error(_error) => unimplemented!("TODO: OMG"),
})
}
@ -73,7 +106,7 @@ fn triage_continuation<'a, 'b>(
match &node {
node if node.is_whitespace() => return Ok(None),
TokenNode::Token(..) | TokenNode::Flag(..) | TokenNode::Member(..) => {}
TokenNode::Token(..) | TokenNode::Flag(..) => {}
TokenNode::Call(..) => unimplemented!("call"),
TokenNode::Nodes(..) => unimplemented!("nodes"),
TokenNode::Delimited(..) => unimplemented!("delimited"),
@ -85,3 +118,42 @@ fn triage_continuation<'a, 'b>(
peeked.commit();
Ok(Some(node.tag()))
}
#[must_use]
enum ExternalExpressionResult {
Eof,
Processed,
}
#[derive(Debug, Copy, Clone)]
struct ExternalExpression;
impl ColorSyntax for ExternalExpression {
type Info = ExternalExpressionResult;
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> ExternalExpressionResult {
let atom = match expand_atom(
token_nodes,
"external word",
context,
ExpansionRule::permissive(),
) {
Err(_) => unreachable!("TODO: separate infallible expand_atom"),
Ok(Tagged {
item: AtomicToken::Eof { .. },
..
}) => return ExternalExpressionResult::Eof,
Ok(atom) => atom,
};
atom.color_tokens(shapes);
return ExternalExpressionResult::Processed;
}
}
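The `ExternalTokensShape` loop above alternates an optional space with one external expression until EOF. A toy model of that control flow, using invented stand-ins (a plain token slice instead of `TokensIterator`, strings instead of `FlatShape`s):

```rust
#[derive(Debug, PartialEq)]
enum ExternalExpressionResult {
    Eof,
    Processed,
}

// Stand-in for coloring one external expression: consume one token if present.
fn color_one(tokens: &mut std::slice::Iter<'_, &str>, out: &mut Vec<String>) -> ExternalExpressionResult {
    match tokens.next() {
        None => ExternalExpressionResult::Eof,
        Some(tok) => {
            out.push(format!("word:{}", tok));
            ExternalExpressionResult::Processed
        }
    }
}

fn color_external_tokens(tokens: &[&str]) -> Vec<String> {
    let mut out = Vec::new();
    let mut iter = tokens.iter();
    loop {
        // (a MaybeSpaceShape step would run here before each expression)
        match color_one(&mut iter, &mut out) {
            ExternalExpressionResult::Eof => break,
            ExternalExpressionResult::Processed => continue,
        }
    }
    out
}

fn main() {
    let shapes = color_external_tokens(&["ls", "-la", "/tmp"]);
    assert_eq!(shapes.len(), 3);
    println!("{:?}", shapes);
}
```

The `#[must_use]` result enum mirrors the real code: the caller must distinguish "made progress" from "hit EOF" or the loop would never terminate.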


@ -1,34 +1,45 @@
mod block;
mod expression;
pub(crate) mod flat_shape;
use crate::cli::external_command;
use crate::commands::{classified::InternalCommand, ClassifiedCommand, Command};
use crate::commands::{
classified::{ClassifiedPipeline, InternalCommand},
ClassifiedCommand, Command,
};
use crate::parser::hir::expand_external_tokens::ExternalTokensShape;
use crate::parser::hir::syntax_shape::block::AnyBlockShape;
use crate::parser::hir::tokens_iterator::Peeked;
use crate::parser::parse_command::parse_command_tail;
use crate::parser::parse_command::{parse_command_tail, CommandTailShape};
use crate::parser::PipelineElement;
use crate::parser::{
hir,
hir::{debug_tokens, TokensIterator},
Operator, RawToken, TokenNode,
Operator, Pipeline, RawToken, TokenNode,
};
use crate::prelude::*;
use derive_new::new;
use getset::Getters;
use log::trace;
use log::{self, log_enabled, trace};
use serde::{Deserialize, Serialize};
use std::path::{Path, PathBuf};
pub(crate) use self::expression::atom::{expand_atom, AtomicToken, ExpansionRule};
pub(crate) use self::expression::delimited::{
color_delimited_square, expand_delimited_square, DelimitedShape,
};
pub(crate) use self::expression::file_path::FilePathShape;
pub(crate) use self::expression::list::ExpressionListShape;
pub(crate) use self::expression::list::{BackoffColoringMode, ExpressionListShape};
pub(crate) use self::expression::number::{IntShape, NumberShape};
pub(crate) use self::expression::pattern::PatternShape;
pub(crate) use self::expression::pattern::{BarePatternShape, PatternShape};
pub(crate) use self::expression::string::StringShape;
pub(crate) use self::expression::unit::UnitShape;
pub(crate) use self::expression::variable_path::{
ColumnPathShape, DotShape, ExpressionContinuation, ExpressionContinuationShape, MemberShape,
PathTailShape, VariablePathShape,
ColorableDotShape, ColumnPathShape, DotShape, ExpressionContinuation,
ExpressionContinuationShape, MemberShape, PathTailShape, VariablePathShape,
};
pub(crate) use self::expression::{continue_expression, AnyExpressionShape};
pub(crate) use self::flat_shape::FlatShape;
#[derive(Debug, Copy, Clone, Serialize, Deserialize)]
pub enum SyntaxShape {
@ -41,9 +52,56 @@ pub enum SyntaxShape {
Int,
Path,
Pattern,
Binary,
Block,
Boolean,
}
impl FallibleColorSyntax for SyntaxShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
match self {
SyntaxShape::Any => {
color_fallible_syntax(&AnyExpressionShape, token_nodes, context, shapes)
}
SyntaxShape::List => {
color_syntax(&ExpressionListShape, token_nodes, context, shapes);
Ok(())
}
SyntaxShape::Int => color_fallible_syntax(&IntShape, token_nodes, context, shapes),
SyntaxShape::String => color_fallible_syntax_with(
&StringShape,
&FlatShape::String,
token_nodes,
context,
shapes,
),
SyntaxShape::Member => {
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)
}
SyntaxShape::ColumnPath => {
color_fallible_syntax(&ColumnPathShape, token_nodes, context, shapes)
}
SyntaxShape::Number => {
color_fallible_syntax(&NumberShape, token_nodes, context, shapes)
}
SyntaxShape::Path => {
color_fallible_syntax(&FilePathShape, token_nodes, context, shapes)
}
SyntaxShape::Pattern => {
color_fallible_syntax(&PatternShape, token_nodes, context, shapes)
}
SyntaxShape::Block => {
color_fallible_syntax(&AnyBlockShape, token_nodes, context, shapes)
}
}
}
}
impl ExpandExpression for SyntaxShape {
@ -73,9 +131,7 @@ impl ExpandExpression for SyntaxShape {
SyntaxShape::Number => expand_expr(&NumberShape, token_nodes, context),
SyntaxShape::Path => expand_expr(&FilePathShape, token_nodes, context),
SyntaxShape::Pattern => expand_expr(&PatternShape, token_nodes, context),
SyntaxShape::Binary => Err(ShellError::unimplemented("SyntaxShape:Binary")),
SyntaxShape::Block => expand_expr(&AnyBlockShape, token_nodes, context),
SyntaxShape::Boolean => Err(ShellError::unimplemented("SyntaxShape:Boolean")),
}
}
}
@ -92,9 +148,7 @@ impl std::fmt::Display for SyntaxShape {
SyntaxShape::Number => write!(f, "Number"),
SyntaxShape::Path => write!(f, "Path"),
SyntaxShape::Pattern => write!(f, "Pattern"),
SyntaxShape::Binary => write!(f, "Binary"),
SyntaxShape::Block => write!(f, "Block"),
SyntaxShape::Boolean => write!(f, "Boolean"),
}
}
}
@ -148,6 +202,50 @@ pub trait ExpandExpression: std::fmt::Debug + Copy {
) -> Result<hir::Expression, ShellError>;
}
pub trait FallibleColorSyntax: std::fmt::Debug + Copy {
type Info;
type Input;
fn color_syntax<'a, 'b>(
&self,
input: &Self::Input,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<Self::Info, ShellError>;
}
pub trait ColorSyntax: std::fmt::Debug + Copy {
type Info;
type Input;
fn color_syntax<'a, 'b>(
&self,
input: &Self::Input,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Self::Info;
}
// impl<T> ColorSyntax for T
// where
// T: FallibleColorSyntax,
// {
// type Info = Result<T::Info, ShellError>;
// type Input = T::Input;
// fn color_syntax<'a, 'b>(
// &self,
// input: &Self::Input,
// token_nodes: &'b mut TokensIterator<'a>,
// context: &ExpandContext,
// shapes: &mut Vec<Tagged<FlatShape>>,
// ) -> Result<T::Info, ShellError> {
// FallibleColorSyntax::color_syntax(self, input, token_nodes, context, shapes)
// }
// }
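The split between the two traits above can be shown in miniature. This sketch uses invented stand-in types (`MiniShape`, a token slice in place of `TokensIterator`) purely to illustrate the N-tokens-in, M-shapes-out contract of infallible coloring; none of these names come from the real codebase:

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum MiniShape {
    OpenSquare,
    CloseSquare,
    Int,
    Whitespace,
    Error,
}

// Infallible coloring: always consumes the tokens it looks at and always
// emits a shape for each one, falling back to `Error` instead of returning
// a failure the way `ExpandExpression` would.
fn color_tokens(tokens: &[&str], shapes: &mut Vec<MiniShape>) {
    for tok in tokens {
        let shape = match *tok {
            "[" => MiniShape::OpenSquare,
            "]" => MiniShape::CloseSquare,
            " " => MiniShape::Whitespace,
            t if !t.is_empty() && t.chars().all(|c| c.is_ascii_digit()) => MiniShape::Int,
            _ => MiniShape::Error,
        };
        shapes.push(shape);
    }
}

fn main() {
    let mut shapes = Vec::new();
    // `[1 2 3]` is one token node to `ExpandExpression`, but seven flat
    // shapes to the colorer.
    color_tokens(&["[", "1", " ", "2", " ", "3", "]"], &mut shapes);
    assert_eq!(shapes.len(), 7);
    assert_eq!(shapes[0], MiniShape::OpenSquare);
    assert_eq!(shapes[6], MiniShape::CloseSquare);
    println!("{:?}", shapes);
}
```

The commented-out blanket impl above would have expressed the same relationship generically (every `FallibleColorSyntax` is a `ColorSyntax` whose `Info` is wrapped in `Result`), at the cost of making the two traits overlap for downstream impls.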
pub(crate) trait ExpandSyntax: std::fmt::Debug + Copy {
type Output: std::fmt::Debug;
@ -180,6 +278,130 @@ pub(crate) fn expand_syntax<'a, 'b, T: ExpandSyntax>(
}
}
pub fn color_syntax<'a, 'b, T: ColorSyntax<Info = U, Input = ()>, U>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> ((), U) {
trace!(target: "nu::color_syntax", "before {} :: {:?}", std::any::type_name::<T>(), debug_tokens(token_nodes, context.source));
let len = shapes.len();
let result = shape.color_syntax(&(), token_nodes, context, shapes);
trace!(target: "nu::color_syntax", "ok :: {:?}", debug_tokens(token_nodes, context.source));
if log_enabled!(target: "nu::color_syntax", log::Level::Trace) {
trace!(target: "nu::color_syntax", "after {}", std::any::type_name::<T>());
if len < shapes.len() {
for i in len..(shapes.len()) {
trace!(target: "nu::color_syntax", "new shape :: {:?}", shapes[i]);
}
} else {
trace!(target: "nu::color_syntax", "no new shapes");
}
}
((), result)
}
pub fn color_fallible_syntax<'a, 'b, T: FallibleColorSyntax<Info = U, Input = ()>, U>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<U, ShellError> {
trace!(target: "nu::color_syntax", "before {} :: {:?}", std::any::type_name::<T>(), debug_tokens(token_nodes, context.source));
if token_nodes.at_end() {
trace!(target: "nu::color_syntax", "at eof");
return Err(ShellError::unexpected_eof("coloring", Tag::unknown()));
}
let len = shapes.len();
let result = shape.color_syntax(&(), token_nodes, context, shapes);
trace!(target: "nu::color_syntax", "ok :: {:?}", debug_tokens(token_nodes, context.source));
if log_enabled!(target: "nu::color_syntax", log::Level::Trace) {
trace!(target: "nu::color_syntax", "after {}", std::any::type_name::<T>());
if len < shapes.len() {
for i in len..(shapes.len()) {
trace!(target: "nu::color_syntax", "new shape :: {:?}", shapes[i]);
}
} else {
trace!(target: "nu::color_syntax", "no new shapes");
}
}
result
}
pub fn color_syntax_with<'a, 'b, T: ColorSyntax<Info = U, Input = I>, U, I>(
shape: &T,
input: &I,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> ((), U) {
trace!(target: "nu::color_syntax", "before {} :: {:?}", std::any::type_name::<T>(), debug_tokens(token_nodes, context.source));
let len = shapes.len();
let result = shape.color_syntax(input, token_nodes, context, shapes);
trace!(target: "nu::color_syntax", "ok :: {:?}", debug_tokens(token_nodes, context.source));
if log_enabled!(target: "nu::color_syntax", log::Level::Trace) {
trace!(target: "nu::color_syntax", "after {}", std::any::type_name::<T>());
if len < shapes.len() {
for i in len..(shapes.len()) {
trace!(target: "nu::color_syntax", "new shape :: {:?}", shapes[i]);
}
} else {
trace!(target: "nu::color_syntax", "no new shapes");
}
}
((), result)
}
pub fn color_fallible_syntax_with<'a, 'b, T: FallibleColorSyntax<Info = U, Input = I>, U, I>(
shape: &T,
input: &I,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<U, ShellError> {
trace!(target: "nu::color_syntax", "before {} :: {:?}", std::any::type_name::<T>(), debug_tokens(token_nodes, context.source));
if token_nodes.at_end() {
trace!(target: "nu::color_syntax", "at eof");
return Err(ShellError::unexpected_eof("coloring", Tag::unknown()));
}
let len = shapes.len();
let result = shape.color_syntax(input, token_nodes, context, shapes);
trace!(target: "nu::color_syntax", "ok :: {:?}", debug_tokens(token_nodes, context.source));
if log_enabled!(target: "nu::color_syntax", log::Level::Trace) {
trace!(target: "nu::color_syntax", "after {}", std::any::type_name::<T>());
if len < shapes.len() {
for i in len..(shapes.len()) {
trace!(target: "nu::color_syntax", "new shape :: {:?}", shapes[i]);
}
} else {
trace!(target: "nu::color_syntax", "no new shapes");
}
}
result
}
pub(crate) fn expand_expr<'a, 'b, T: ExpandExpression>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
@ -314,6 +536,33 @@ impl ExpandSyntax for BarePathShape {
#[derive(Debug, Copy, Clone)]
pub struct BareShape;
impl FallibleColorSyntax for BareShape {
type Info = ();
type Input = FlatShape;
fn color_syntax<'a, 'b>(
&self,
input: &FlatShape,
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.peek_any_token(|token| match token {
// If it's a bare token, color it
TokenNode::Token(Tagged {
item: RawToken::Bare,
tag,
}) => {
shapes.push((*input).tagged(tag));
Ok(())
}
// otherwise, fail
other => Err(ShellError::type_error("word", other.tagged_type_name())),
})
}
}
impl ExpandSyntax for BareShape {
type Output = Tagged<String>;
@ -383,9 +632,129 @@ impl CommandSignature {
}
}
#[derive(Debug, Copy, Clone)]
pub struct PipelineShape;
// The failure mode is if the head of the token stream is not a pipeline
impl FallibleColorSyntax for PipelineShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
// Make sure we're looking at a pipeline
let Pipeline { parts, .. } = token_nodes.peek_any_token(|node| node.as_pipeline())?;
// Enumerate the pipeline parts
for part in parts {
// If the pipeline part has a prefix `|`, emit a pipe to color
if let Some(pipe) = part.pipe {
shapes.push(FlatShape::Pipe.tagged(pipe));
}
// Create a new iterator containing the tokens in the pipeline part to color
let mut token_nodes = TokensIterator::new(&part.tokens.item, part.tag, false);
color_syntax(&MaybeSpaceShape, &mut token_nodes, context, shapes);
color_syntax(&CommandShape, &mut token_nodes, context, shapes);
}
Ok(())
}
}
impl ExpandSyntax for PipelineShape {
type Output = ClassifiedPipeline;
fn expand_syntax<'a, 'b>(
&self,
iterator: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ShellError> {
let source = context.source;
let peeked = iterator.peek_any().not_eof("pipeline")?;
let pipeline = peeked.node.as_pipeline()?;
peeked.commit();
let Pipeline { parts, .. } = pipeline;
let commands: Result<Vec<_>, ShellError> = parts
.iter()
.map(|item| classify_command(&item, context, &source))
.collect();
Ok(ClassifiedPipeline {
commands: commands?,
})
}
}
pub enum CommandHeadKind {
External,
Internal(Signature),
}
#[derive(Debug, Copy, Clone)]
pub struct CommandHeadShape;
impl FallibleColorSyntax for CommandHeadShape {
type Info = CommandHeadKind;
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<CommandHeadKind, ShellError> {
// If we don't ultimately find a token, roll back
token_nodes.atomic(|token_nodes| {
// First, take a look at the next token
let atom = expand_atom(
token_nodes,
"command head",
context,
ExpansionRule::permissive(),
)?;
match atom.item {
// If the head is an explicit external command (^cmd), color it as an external command
AtomicToken::ExternalCommand { command } => {
shapes.push(FlatShape::ExternalCommand.tagged(command));
Ok(CommandHeadKind::External)
}
// If the head is a word, it depends on whether it matches a registered internal command
AtomicToken::Word { text } => {
let name = text.slice(context.source);
if context.registry.has(name) {
// If the registry has the command, color it as an internal command
shapes.push(FlatShape::InternalCommand.tagged(text));
let command = context.registry.expect_command(name);
Ok(CommandHeadKind::Internal(command.signature()))
} else {
// Otherwise, color it as an external command
shapes.push(FlatShape::ExternalCommand.tagged(text));
Ok(CommandHeadKind::External)
}
}
// Otherwise, we're not actually looking at a command
_ => Err(ShellError::syntax_error(
"No command at the head".tagged(atom.tag),
)),
}
})
}
}
impl ExpandSyntax for CommandHeadShape {
type Output = CommandSignature;
@ -395,7 +764,7 @@ impl ExpandSyntax for CommandHeadShape {
context: &ExpandContext,
) -> Result<CommandSignature, ShellError> {
let node =
parse_single_node_skipping_ws(token_nodes, "command head1", |token, token_tag| {
parse_single_node_skipping_ws(token_nodes, "command head1", |token, token_tag, _| {
Ok(match token {
RawToken::ExternalCommand(tag) => CommandSignature::LiteralExternal {
outer: token_tag,
@ -488,6 +857,44 @@ impl ExpandSyntax for ClassifiedCommandShape {
#[derive(Debug, Copy, Clone)]
pub struct InternalCommandHeadShape;
impl FallibleColorSyntax for InternalCommandHeadShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let peeked_head = token_nodes.peek_non_ws().not_eof("command head4");
let peeked_head = match peeked_head {
Err(_) => return Ok(()),
Ok(peeked_head) => peeked_head,
};
let _expr = match peeked_head.node {
TokenNode::Token(Tagged {
item: RawToken::Bare,
tag,
}) => shapes.push(FlatShape::Word.tagged(tag)),
TokenNode::Token(Tagged {
item: RawToken::String(_inner_tag),
tag,
}) => shapes.push(FlatShape::String.tagged(tag)),
_node => shapes.push(FlatShape::Error.tagged(peeked_head.node.tag())),
};
peeked_head.commit();
Ok(())
}
}
impl ExpandExpression for InternalCommandHeadShape {
fn expand_expr(
&self,
@ -523,33 +930,52 @@ impl ExpandExpression for InternalCommandHeadShape {
}
}
pub(crate) struct SingleError<'token> {
expected: &'static str,
node: &'token Tagged<RawToken>,
}
impl<'token> SingleError<'token> {
pub(crate) fn error(&self) -> ShellError {
ShellError::type_error(self.expected, self.node.type_name().tagged(self.node.tag))
}
}
fn parse_single_node<'a, 'b, T>(
token_nodes: &'b mut TokensIterator<'a>,
expected: &'static str,
callback: impl FnOnce(RawToken, Tag) -> Result<T, ShellError>,
callback: impl FnOnce(RawToken, Tag, SingleError) -> Result<T, ShellError>,
) -> Result<T, ShellError> {
let peeked = token_nodes.peek_any().not_eof(expected)?;
token_nodes.peek_any_token(|node| match node {
TokenNode::Token(token) => callback(
token.item,
token.tag(),
SingleError {
expected,
node: token,
},
),
let expr = match peeked.node {
TokenNode::Token(token) => callback(token.item, token.tag())?,
other => return Err(ShellError::type_error(expected, other.tagged_type_name())),
};
peeked.commit();
Ok(expr)
other => Err(ShellError::type_error(expected, other.tagged_type_name())),
})
}
fn parse_single_node_skipping_ws<'a, 'b, T>(
token_nodes: &'b mut TokensIterator<'a>,
expected: &'static str,
callback: impl FnOnce(RawToken, Tag) -> Result<T, ShellError>,
callback: impl FnOnce(RawToken, Tag, SingleError) -> Result<T, ShellError>,
) -> Result<T, ShellError> {
let peeked = token_nodes.peek_non_ws().not_eof(expected)?;
let expr = match peeked.node {
TokenNode::Token(token) => callback(token.item, token.tag())?,
TokenNode::Token(token) => callback(
token.item,
token.tag(),
SingleError {
expected,
node: token,
},
)?,
other => return Err(ShellError::type_error(expected, other.tagged_type_name())),
};
@ -562,6 +988,36 @@ fn parse_single_node_skipping_ws<'a, 'b, T>(
#[derive(Debug, Copy, Clone)]
pub struct WhitespaceShape;
impl FallibleColorSyntax for WhitespaceShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let peeked = token_nodes.peek_any().not_eof("whitespace");
let peeked = match peeked {
Err(_) => return Ok(()),
Ok(peeked) => peeked,
};
let _tag = match peeked.node {
TokenNode::Whitespace(tag) => shapes.push(FlatShape::Whitespace.tagged(tag)),
_other => return Ok(()),
};
peeked.commit();
Ok(())
}
}
impl ExpandSyntax for WhitespaceShape {
type Output = Tag;
@ -626,6 +1082,65 @@ pub struct MaybeSpacedExpression<T: ExpandExpression> {
inner: T,
}
#[derive(Debug, Copy, Clone)]
pub struct MaybeSpaceShape;
impl ColorSyntax for MaybeSpaceShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Self::Info {
let peeked = token_nodes.peek_any().not_eof("whitespace");
let peeked = match peeked {
Err(_) => return,
Ok(peeked) => peeked,
};
if let TokenNode::Whitespace(tag) = peeked.node {
peeked.commit();
shapes.push(FlatShape::Whitespace.tagged(tag));
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct SpaceShape;
impl FallibleColorSyntax for SpaceShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let peeked = token_nodes.peek_any().not_eof("whitespace")?;
match peeked.node {
TokenNode::Whitespace(tag) => {
peeked.commit();
shapes.push(FlatShape::Whitespace.tagged(tag));
Ok(())
}
other => Err(ShellError::type_error(
"whitespace",
other.tagged_type_name(),
)),
}
}
}
impl<T: ExpandExpression> ExpandExpression for MaybeSpacedExpression<T> {
fn expand_expr<'a, 'b>(
&self,
@ -660,3 +1175,87 @@ fn expand_variable(tag: Tag, token_tag: Tag, source: &Text) -> hir::Expression {
hir::Expression::variable(tag, token_tag)
}
}
fn classify_command(
command: &Tagged<PipelineElement>,
context: &ExpandContext,
source: &Text,
) -> Result<ClassifiedCommand, ShellError> {
let mut iterator = TokensIterator::new(&command.tokens.item, command.tag, true);
let head = CommandHeadShape.expand_syntax(&mut iterator, &context)?;
match &head {
CommandSignature::Expression(_) => Err(ShellError::syntax_error(
"Unexpected expression in command position".tagged(command.tag),
)),
// If the command starts with `^`, treat it as an external command no matter what
CommandSignature::External(name) => {
let name_str = name.slice(source);
external_command(&mut iterator, source, name_str.tagged(name))
}
CommandSignature::LiteralExternal { outer, inner } => {
let name_str = inner.slice(source);
external_command(&mut iterator, source, name_str.tagged(outer))
}
CommandSignature::Internal(command) => {
let tail =
parse_command_tail(&command.signature(), &context, &mut iterator, command.tag)?;
let (positional, named) = match tail {
None => (None, None),
Some((positional, named)) => (positional, named),
};
let call = hir::Call {
head: Box::new(head.to_expression()),
positional,
named,
};
Ok(ClassifiedCommand::Internal(InternalCommand::new(
command.name().to_string(),
command.tag,
call,
)))
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct CommandShape;
impl ColorSyntax for CommandShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) {
let kind = color_fallible_syntax(&CommandHeadShape, token_nodes, context, shapes);
match kind {
Err(_) => {
// We didn't find a command, so we'll have to fall back to parsing this pipeline part
// as a blob of undifferentiated expressions
color_syntax(&ExpressionListShape, token_nodes, context, shapes);
}
Ok(CommandHeadKind::External) => {
color_syntax(&ExternalTokensShape, token_nodes, context, shapes);
}
Ok(CommandHeadKind::Internal(signature)) => {
color_syntax_with(&CommandTailShape, &signature, token_nodes, context, shapes);
}
};
}
}
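The `CommandShape` match above is where backoff coloring begins: the head decides how the rest of the pipeline part gets colored, and a failed head falls back to undifferentiated expressions instead of erroring. A rough sketch of that decision with hypothetical helper names (`classify_head`, a string-slice registry):

```rust
#[derive(Debug, PartialEq)]
enum ColoredAs {
    Internal,
    External,
    FallbackExpressions,
}

// Mirrors the three arms above: internal command, external command, or
// backoff to a blob of expressions when no command head is found.
fn classify_head(head: &str, registry: &[&str]) -> ColoredAs {
    if head.starts_with('^') {
        // Explicit external command syntax (`^cmd`)
        return ColoredAs::External;
    }
    if !head.is_empty() && head.chars().all(|c| c.is_alphanumeric() || c == '-') {
        if registry.contains(&head) {
            ColoredAs::Internal
        } else {
            ColoredAs::External
        }
    } else {
        // No recognizable command head: color as expressions instead.
        ColoredAs::FallbackExpressions
    }
}

fn main() {
    let registry = ["where", "ls", "ps"];
    assert_eq!(classify_head("where", &registry), ColoredAs::Internal);
    assert_eq!(classify_head("grep", &registry), ColoredAs::External);
    assert_eq!(classify_head("^ls", &registry), ColoredAs::External);
    assert_eq!(classify_head("3 + 4", &registry), ColoredAs::FallbackExpressions);
}
```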


@ -2,10 +2,13 @@ use crate::errors::ShellError;
use crate::parser::{
hir,
hir::syntax_shape::{
continue_expression, expand_expr, expand_syntax, ExpandContext, ExpandExpression,
ExpressionListShape, PathTailShape, VariablePathShape,
color_fallible_syntax, color_syntax_with, continue_expression, expand_expr, expand_syntax,
DelimitedShape, ExpandContext, ExpandExpression, ExpressionContinuationShape,
ExpressionListShape, FallibleColorSyntax, FlatShape, MemberShape, PathTailShape,
VariablePathShape,
},
hir::tokens_iterator::TokensIterator,
parse::token_tree::Delimiter,
RawToken, TokenNode,
};
use crate::{Tag, Tagged, TaggedItem};
@ -13,6 +16,49 @@ use crate::{Tag, Tagged, TaggedItem};
#[derive(Debug, Copy, Clone)]
pub struct AnyBlockShape;
impl FallibleColorSyntax for AnyBlockShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let block = token_nodes.peek_non_ws().not_eof("block");
let block = match block {
Err(_) => return Ok(()),
Ok(block) => block,
};
// is it just a block?
let block = block.node.as_block();
match block {
// If so, color it as a block
Some((children, tags)) => {
let mut token_nodes = TokensIterator::new(children.item, context.tag, false);
color_syntax_with(
&DelimitedShape,
&(Delimiter::Brace, tags.0, tags.1),
&mut token_nodes,
context,
shapes,
);
return Ok(());
}
_ => {}
}
// Otherwise, look for a shorthand block. If none found, fail
color_fallible_syntax(&ShorthandBlock, token_nodes, context, shapes)
}
}
impl ExpandExpression for AnyBlockShape {
fn expand_expr<'a, 'b>(
&self,
@ -25,7 +71,7 @@ impl ExpandExpression for AnyBlockShape {
let block = block.node.as_block();
match block {
Some(block) => {
Some((block, _tags)) => {
let mut iterator = TokensIterator::new(&block.item, context.tag, false);
let exprs = expand_syntax(&ExpressionListShape, &mut iterator, context)?;
@ -42,6 +88,37 @@ impl ExpandExpression for AnyBlockShape {
#[derive(Debug, Copy, Clone)]
pub struct ShorthandBlock;
impl FallibleColorSyntax for ShorthandBlock {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
// Try to find a shorthand head. If none found, fail
color_fallible_syntax(&ShorthandPath, token_nodes, context, shapes)?;
loop {
// Check to see whether there's any continuation after the head expression
let result =
color_fallible_syntax(&ExpressionContinuationShape, token_nodes, context, shapes);
match result {
// if no continuation was found, we're done
Err(_) => break,
// if a continuation was found, look for another one
Ok(_) => continue,
}
}
Ok(())
}
}
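The continuation loop above is what makes `$ ps | where cpu >` colorable: once the head succeeds, each full `<op> <operand>` continuation is consumed greedily, and a trailing partial continuation simply ends the loop rather than failing the whole shape. A toy version of that counting logic (invented function; tokens stand in for the real iterator):

```rust
// Color a shorthand block: a required head member followed by zero or more
// complete continuations. Returns how many tokens were colored, or Err if
// there was no head at all.
fn color_shorthand(tokens: &[&str]) -> Result<usize, ()> {
    if tokens.is_empty() {
        return Err(()); // no head: the whole shape fails
    }
    let mut colored = 1; // the head member, e.g. `cpu`
    let mut rest = &tokens[1..];
    // Consume full `<op> <operand>` pairs; stop at a dangling operator,
    // leaving it for backoff coloring.
    while rest.len() >= 2 {
        colored += 2;
        rest = &rest[2..];
    }
    Ok(colored)
}

fn main() {
    // `cpu > 10`: head plus one complete continuation
    assert_eq!(color_shorthand(&["cpu", ">", "10"]), Ok(3));
    // `cpu >`: the head still colors; the dangling `>` is left over
    assert_eq!(color_shorthand(&["cpu", ">"]), Ok(1));
    assert!(color_shorthand(&[]).is_err());
}
```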
impl ExpandExpression for ShorthandBlock {
fn expand_expr<'a, 'b>(
&self,
@ -62,6 +139,50 @@ impl ExpandExpression for ShorthandBlock {
#[derive(Debug, Copy, Clone)]
pub struct ShorthandPath;
impl FallibleColorSyntax for ShorthandPath {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| {
let variable = color_fallible_syntax(&VariablePathShape, token_nodes, context, shapes);
match variable {
Ok(_) => {
// if it's a variable path, that's the head part
return Ok(());
}
Err(_) => {
// otherwise, we'll try to find a member path
}
}
// look for a member (`<member>` -> `$it.<member>`)
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
// Now that we've synthesized the head of the path, proceed to color the tail
// like any other path.
let tail = color_fallible_syntax(&PathTailShape, token_nodes, context, shapes);
match tail {
Ok(_) => {}
Err(_) => {
// It's ok if there's no path tail; a single member is sufficient
}
}
Ok(())
})
}
}
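`ShorthandPath` wraps its work in `token_nodes.atomic(...)` so a failed attempt leaves no trace. The real `atomic` snapshots the token cursor; this hypothetical sketch applies the same checkpoint/rollback idea to the emitted shapes vector to show the spirit of it:

```rust
// Remember how much output existed before trying, run the block, and
// truncate back on failure so a failed sub-parse leaves no stray colors.
fn atomic<T, E>(
    shapes: &mut Vec<&'static str>,
    block: impl FnOnce(&mut Vec<&'static str>) -> Result<T, E>,
) -> Result<T, E> {
    let checkpoint = shapes.len();
    let result = block(shapes);
    if result.is_err() {
        shapes.truncate(checkpoint); // roll back partial output
    }
    result
}

fn main() {
    let mut shapes = vec!["variable"];
    // A failing attempt emits shapes along the way, then rolls them back.
    let failed: Result<(), &str> = atomic(&mut shapes, |s| {
        s.push("member");
        s.push("dot");
        Err("no path tail")
    });
    assert!(failed.is_err());
    assert_eq!(shapes, vec!["variable"]);
}
```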
impl ExpandExpression for ShorthandPath {
fn expand_expr<'a, 'b>(
&self,
@ -92,8 +213,6 @@ impl ExpandExpression for ShorthandPath {
head = hir::Expression::dot_member(head, member);
}
println!("{:?}", head);
Ok(head)
}
}
@ -104,6 +223,49 @@ impl ExpandExpression for ShorthandPath {
#[derive(Debug, Copy, Clone)]
pub struct ShorthandHeadShape;
impl FallibleColorSyntax for ShorthandHeadShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
// A shorthand path must not be at EOF
let peeked = token_nodes.peek_non_ws().not_eof("shorthand path")?;
match peeked.node {
// If the head of a shorthand path is a bare token, it expands to `$it.bare`
TokenNode::Token(Tagged {
item: RawToken::Bare,
tag,
}) => {
peeked.commit();
shapes.push(FlatShape::BareMember.tagged(tag));
Ok(())
}
// If the head of a shorthand path is a string, it expands to `$it."some string"`
TokenNode::Token(Tagged {
item: RawToken::String(_),
tag: outer,
}) => {
peeked.commit();
shapes.push(FlatShape::StringMember.tagged(outer));
Ok(())
}
other => Err(ShellError::type_error(
"shorthand head",
other.tagged_type_name(),
)),
}
}
}
impl ExpandExpression for ShorthandHeadShape {
fn expand_expr<'a, 'b>(
&self,


@@ -1,3 +1,4 @@
pub(crate) mod atom;
pub(crate) mod delimited;
pub(crate) mod file_path;
pub(crate) mod list;
@@ -8,14 +9,14 @@ pub(crate) mod unit;
pub(crate) mod variable_path;
use crate::parser::hir::syntax_shape::{
expand_expr, expand_syntax, expand_variable, expression::delimited::expand_delimited_expr,
BareShape, DotShape, ExpandContext, ExpandExpression, ExpandSyntax, ExpressionContinuation,
ExpressionContinuationShape, UnitShape,
color_delimited_square, color_fallible_syntax, color_fallible_syntax_with, expand_atom,
expand_delimited_square, expand_expr, expand_syntax, AtomicToken, BareShape, ColorableDotShape,
DotShape, ExpandContext, ExpandExpression, ExpandSyntax, ExpansionRule, ExpressionContinuation,
ExpressionContinuationShape, FallibleColorSyntax, FlatShape,
};
use crate::parser::{
hir,
hir::{Expression, Operator, TokensIterator},
RawToken, Token, TokenNode,
hir::{Expression, TokensIterator},
};
use crate::prelude::*;
use std::path::PathBuf;
@@ -36,6 +37,32 @@ impl ExpandExpression for AnyExpressionShape {
}
}
impl FallibleColorSyntax for AnyExpressionShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
// Look for an expression at the cursor
color_fallible_syntax(&AnyExpressionStartShape, token_nodes, context, shapes)?;
match continue_coloring_expression(token_nodes, context, shapes) {
Err(_) => {
// it's fine for there to be no continuation
}
Ok(()) => {}
}
Ok(())
}
}
pub(crate) fn continue_expression(
mut head: hir::Expression,
token_nodes: &mut TokensIterator<'_>,
@@ -64,6 +91,30 @@ pub(crate) fn continue_expression(
}
}
pub(crate) fn continue_coloring_expression(
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
// if there's not even one expression continuation, fail
color_fallible_syntax(&ExpressionContinuationShape, token_nodes, context, shapes)?;
loop {
// Check to see whether there's any continuation after the head expression
let result =
color_fallible_syntax(&ExpressionContinuationShape, token_nodes, context, shapes);
match result {
Err(_) => {
// We already saw one continuation, so just return
return Ok(());
}
Ok(_) => {}
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct AnyExpressionStartShape;
@@ -73,59 +124,148 @@ impl ExpandExpression for AnyExpressionStartShape {
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
let size = expand_expr(&UnitShape, token_nodes, context);
let atom = expand_atom(token_nodes, "expression", context, ExpansionRule::new())?;
match size {
Ok(expr) => return Ok(expr),
Err(_) => {}
}
let peek_next = token_nodes.peek_any().not_eof("expression")?;
let head = match peek_next.node {
TokenNode::Token(token) => match token.item {
RawToken::Bare | RawToken::Operator(Operator::Dot) => {
let start = token.tag;
peek_next.commit();
let end = expand_syntax(&BareTailShape, token_nodes, context)?;
match end {
Some(end) => return Ok(hir::Expression::bare(start.until(end))),
None => return Ok(hir::Expression::bare(start)),
}
}
_ => {
peek_next.commit();
expand_one_context_free_token(*token, context)
}
},
node @ TokenNode::Call(_)
| node @ TokenNode::Nodes(_)
| node @ TokenNode::Pipeline(_)
| node @ TokenNode::Flag(_)
| node @ TokenNode::Member(_)
| node @ TokenNode::Whitespace(_) => {
return Err(ShellError::type_error(
"expression",
node.tagged_type_name(),
match atom.item {
AtomicToken::Size { number, unit } => {
return Ok(hir::Expression::size(
number.to_number(context.source),
unit.item,
atom.tag,
))
}
TokenNode::Delimited(delimited) => {
peek_next.commit();
expand_delimited_expr(delimited, context)
AtomicToken::SquareDelimited { nodes, .. } => {
expand_delimited_square(&nodes, atom.tag, context)
}
TokenNode::Error(error) => return Err(*error.item.clone()),
}?;
AtomicToken::Word { .. } | AtomicToken::Dot { .. } => {
let end = expand_syntax(&BareTailShape, token_nodes, context)?;
Ok(hir::Expression::bare(atom.tag.until_option(end)))
}
Ok(head)
other => return other.tagged(atom.tag).into_hir(context, "expression"),
}
}
}
impl FallibleColorSyntax for AnyExpressionStartShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let atom = token_nodes.spanned(|token_nodes| {
expand_atom(
token_nodes,
"expression",
context,
ExpansionRule::permissive(),
)
});
let atom = match atom {
Tagged {
item: Err(_err),
tag,
} => {
shapes.push(FlatShape::Error.tagged(tag));
return Ok(());
}
Tagged {
item: Ok(value), ..
} => value,
};
match atom.item {
AtomicToken::Size { number, unit } => shapes.push(
FlatShape::Size {
number: number.tag,
unit: unit.tag,
}
.tagged(atom.tag),
),
AtomicToken::SquareDelimited { nodes, tags } => {
color_delimited_square(tags, &nodes, atom.tag, context, shapes)
}
AtomicToken::Word { .. } | AtomicToken::Dot { .. } => {
shapes.push(FlatShape::Word.tagged(atom.tag));
}
_ => atom.color_tokens(shapes),
}
Ok(())
}
}
#[derive(Debug, Copy, Clone)]
pub struct BareTailShape;
impl FallibleColorSyntax for BareTailShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let len = shapes.len();
loop {
let word = color_fallible_syntax_with(
&BareShape,
&FlatShape::Word,
token_nodes,
context,
shapes,
);
match word {
// if a word was found, continue
Ok(_) => continue,
// if a word wasn't found, try to find a dot
Err(_) => {}
}
// try to find a dot
let dot = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Word,
token_nodes,
context,
shapes,
);
match dot {
// if a dot was found, try to find another word
Ok(_) => continue,
// otherwise, we're done
Err(_) => break,
}
}
if shapes.len() > len {
Ok(())
} else {
Err(ShellError::syntax_error(
"No tokens matched BareTailShape".tagged_unknown(),
))
}
}
}
impl ExpandSyntax for BareTailShape {
type Output = Option<Tag>;
@@ -158,29 +298,6 @@ impl ExpandSyntax for BareTailShape {
}
}
fn expand_one_context_free_token<'a, 'b>(
token: Token,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
Ok(match token.item {
RawToken::Number(number) => {
hir::Expression::number(number.to_number(context.source), token.tag)
}
RawToken::Operator(..) => {
return Err(ShellError::syntax_error(
"unexpected operator, expected an expression".tagged(token.tag),
))
}
RawToken::Size(..) => unimplemented!("size"),
RawToken::String(tag) => hir::Expression::string(tag, token.tag),
RawToken::Variable(tag) => expand_variable(tag, token.tag, &context.source),
RawToken::ExternalCommand(_) => unimplemented!(),
RawToken::ExternalWord => unimplemented!(),
RawToken::GlobPattern => hir::Expression::pattern(token.tag),
RawToken::Bare => hir::Expression::string(token.tag, token.tag),
})
}
pub fn expand_file_path(string: &str, context: &ExpandContext) -> PathBuf {
let expanded = shellexpand::tilde_with_context(string, || context.homedir());


@@ -0,0 +1,541 @@
use crate::parser::hir::syntax_shape::{
expand_syntax, expression::expand_file_path, parse_single_node, BarePathShape,
BarePatternShape, ExpandContext, UnitShape,
};
use crate::parser::{
hir,
hir::{Expression, RawNumber, TokensIterator},
parse::flag::{Flag, FlagKind},
DelimitedNode, Delimiter, FlatShape, RawToken, TokenNode, Unit,
};
use crate::prelude::*;
#[derive(Debug)]
pub enum AtomicToken<'tokens> {
Eof {
tag: Tag,
},
Error {
error: Tagged<ShellError>,
},
Number {
number: RawNumber,
},
Size {
number: Tagged<RawNumber>,
unit: Tagged<Unit>,
},
String {
body: Tag,
},
ItVariable {
name: Tag,
},
Variable {
name: Tag,
},
ExternalCommand {
command: Tag,
},
ExternalWord {
text: Tag,
},
GlobPattern {
pattern: Tag,
},
FilePath {
path: Tag,
},
Word {
text: Tag,
},
SquareDelimited {
tags: (Tag, Tag),
nodes: &'tokens Vec<TokenNode>,
},
ParenDelimited {
tags: (Tag, Tag),
nodes: &'tokens Vec<TokenNode>,
},
BraceDelimited {
tags: (Tag, Tag),
nodes: &'tokens Vec<TokenNode>,
},
Pipeline {
pipe: Option<Tag>,
elements: Tagged<&'tokens Vec<TokenNode>>,
},
ShorthandFlag {
name: Tag,
},
LonghandFlag {
name: Tag,
},
Dot {
text: Tag,
},
Operator {
text: Tag,
},
Whitespace {
text: Tag,
},
}
pub type TaggedAtomicToken<'tokens> = Tagged<AtomicToken<'tokens>>;
impl<'tokens> TaggedAtomicToken<'tokens> {
pub fn into_hir(
&self,
context: &ExpandContext,
expected: &'static str,
) -> Result<hir::Expression, ShellError> {
Ok(match &self.item {
AtomicToken::Eof { .. } => {
return Err(ShellError::type_error(
expected,
"eof atomic token".tagged(self.tag),
))
}
AtomicToken::Error { .. } => {
return Err(ShellError::type_error(
expected,
"error atomic token".tagged(self.tag),
))
}
AtomicToken::Operator { .. } => {
return Err(ShellError::type_error(
expected,
"operator".tagged(self.tag),
))
}
AtomicToken::ShorthandFlag { .. } => {
return Err(ShellError::type_error(
expected,
"shorthand flag".tagged(self.tag),
))
}
AtomicToken::LonghandFlag { .. } => {
return Err(ShellError::type_error(expected, "flag".tagged(self.tag)))
}
AtomicToken::Whitespace { .. } => {
return Err(ShellError::unimplemented("whitespace in AtomicToken"))
}
AtomicToken::Dot { .. } => {
return Err(ShellError::type_error(expected, "dot".tagged(self.tag)))
}
AtomicToken::Number { number } => {
Expression::number(number.to_number(context.source), self.tag)
}
AtomicToken::FilePath { path } => Expression::file_path(
expand_file_path(path.slice(context.source), context),
self.tag,
),
AtomicToken::Size { number, unit } => {
Expression::size(number.to_number(context.source), **unit, self.tag)
}
AtomicToken::String { body } => Expression::string(body, self.tag),
AtomicToken::ItVariable { name } => Expression::it_variable(name, self.tag),
AtomicToken::Variable { name } => Expression::variable(name, self.tag),
AtomicToken::ExternalCommand { command } => {
Expression::external_command(command, self.tag)
}
AtomicToken::ExternalWord { text } => Expression::string(text, self.tag),
AtomicToken::GlobPattern { pattern } => Expression::pattern(pattern),
AtomicToken::Word { text } => Expression::string(text, text),
AtomicToken::SquareDelimited { .. } => unimplemented!("into_hir"),
AtomicToken::ParenDelimited { .. } => unimplemented!("into_hir"),
AtomicToken::BraceDelimited { .. } => unimplemented!("into_hir"),
AtomicToken::Pipeline { .. } => unimplemented!("into_hir"),
})
}
pub fn tagged_type_name(&self) -> Tagged<&'static str> {
match &self.item {
AtomicToken::Eof { .. } => "eof",
AtomicToken::Error { .. } => "error",
AtomicToken::Operator { .. } => "operator",
AtomicToken::ShorthandFlag { .. } => "shorthand flag",
AtomicToken::LonghandFlag { .. } => "flag",
AtomicToken::Whitespace { .. } => "whitespace",
AtomicToken::Dot { .. } => "dot",
AtomicToken::Number { .. } => "number",
AtomicToken::FilePath { .. } => "file path",
AtomicToken::Size { .. } => "size",
AtomicToken::String { .. } => "string",
AtomicToken::ItVariable { .. } => "$it",
AtomicToken::Variable { .. } => "variable",
AtomicToken::ExternalCommand { .. } => "external command",
AtomicToken::ExternalWord { .. } => "external word",
AtomicToken::GlobPattern { .. } => "file pattern",
AtomicToken::Word { .. } => "word",
AtomicToken::SquareDelimited { .. } => "array literal",
AtomicToken::ParenDelimited { .. } => "parenthesized expression",
AtomicToken::BraceDelimited { .. } => "block",
AtomicToken::Pipeline { .. } => "pipeline",
}
.tagged(self.tag)
}
pub(crate) fn color_tokens(&self, shapes: &mut Vec<Tagged<FlatShape>>) {
match &self.item {
AtomicToken::Eof { .. } => {}
AtomicToken::Error { .. } => return shapes.push(FlatShape::Error.tagged(self.tag)),
AtomicToken::Operator { .. } => {
return shapes.push(FlatShape::Operator.tagged(self.tag));
}
AtomicToken::ShorthandFlag { .. } => {
return shapes.push(FlatShape::ShorthandFlag.tagged(self.tag));
}
AtomicToken::LonghandFlag { .. } => {
return shapes.push(FlatShape::Flag.tagged(self.tag));
}
AtomicToken::Whitespace { .. } => {
return shapes.push(FlatShape::Whitespace.tagged(self.tag));
}
AtomicToken::FilePath { .. } => return shapes.push(FlatShape::Path.tagged(self.tag)),
AtomicToken::Dot { .. } => return shapes.push(FlatShape::Dot.tagged(self.tag)),
AtomicToken::Number {
number: RawNumber::Decimal(_),
} => {
return shapes.push(FlatShape::Decimal.tagged(self.tag));
}
AtomicToken::Number {
number: RawNumber::Int(_),
} => {
return shapes.push(FlatShape::Int.tagged(self.tag));
}
AtomicToken::Size { number, unit } => {
return shapes.push(
FlatShape::Size {
number: number.tag,
unit: unit.tag,
}
.tagged(self.tag),
);
}
AtomicToken::String { .. } => return shapes.push(FlatShape::String.tagged(self.tag)),
AtomicToken::ItVariable { .. } => {
return shapes.push(FlatShape::ItVariable.tagged(self.tag))
}
AtomicToken::Variable { .. } => {
return shapes.push(FlatShape::Variable.tagged(self.tag))
}
AtomicToken::ExternalCommand { .. } => {
return shapes.push(FlatShape::ExternalCommand.tagged(self.tag));
}
AtomicToken::ExternalWord { .. } => {
return shapes.push(FlatShape::ExternalWord.tagged(self.tag))
}
AtomicToken::GlobPattern { .. } => {
return shapes.push(FlatShape::GlobPattern.tagged(self.tag))
}
AtomicToken::Word { .. } => return shapes.push(FlatShape::Word.tagged(self.tag)),
_ => return shapes.push(FlatShape::Error.tagged(self.tag)),
}
}
}
#[derive(Debug)]
pub enum WhitespaceHandling {
#[allow(unused)]
AllowWhitespace,
RejectWhitespace,
}
#[derive(Debug)]
pub struct ExpansionRule {
pub(crate) allow_external_command: bool,
pub(crate) allow_external_word: bool,
pub(crate) allow_operator: bool,
pub(crate) allow_eof: bool,
pub(crate) treat_size_as_word: bool,
pub(crate) commit_errors: bool,
pub(crate) whitespace: WhitespaceHandling,
}
impl ExpansionRule {
pub fn new() -> ExpansionRule {
ExpansionRule {
allow_external_command: false,
allow_external_word: false,
allow_operator: false,
allow_eof: false,
treat_size_as_word: false,
commit_errors: false,
whitespace: WhitespaceHandling::RejectWhitespace,
}
}
/// The intent of permissive mode is to return an atomic token for every possible
/// input token. This is important for error-correcting parsing, such as the
/// syntax highlighter.
pub fn permissive() -> ExpansionRule {
ExpansionRule {
allow_external_command: true,
allow_external_word: true,
allow_operator: true,
allow_eof: true,
treat_size_as_word: false,
commit_errors: true,
whitespace: WhitespaceHandling::AllowWhitespace,
}
}
#[allow(unused)]
pub fn allow_external_command(mut self) -> ExpansionRule {
self.allow_external_command = true;
self
}
#[allow(unused)]
pub fn allow_operator(mut self) -> ExpansionRule {
self.allow_operator = true;
self
}
#[allow(unused)]
pub fn no_operator(mut self) -> ExpansionRule {
self.allow_operator = false;
self
}
#[allow(unused)]
pub fn no_external_command(mut self) -> ExpansionRule {
self.allow_external_command = false;
self
}
#[allow(unused)]
pub fn allow_external_word(mut self) -> ExpansionRule {
self.allow_external_word = true;
self
}
#[allow(unused)]
pub fn no_external_word(mut self) -> ExpansionRule {
self.allow_external_word = false;
self
}
#[allow(unused)]
pub fn treat_size_as_word(mut self) -> ExpansionRule {
self.treat_size_as_word = true;
self
}
#[allow(unused)]
pub fn commit_errors(mut self) -> ExpansionRule {
self.commit_errors = true;
self
}
#[allow(unused)]
pub fn allow_whitespace(mut self) -> ExpansionRule {
self.whitespace = WhitespaceHandling::AllowWhitespace;
self
}
#[allow(unused)]
pub fn reject_whitespace(mut self) -> ExpansionRule {
self.whitespace = WhitespaceHandling::RejectWhitespace;
self
}
}
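The `ExpansionRule` builder above (strict `new()`, error-correcting `permissive()`, chainable `allow_*`/`no_*` methods) can be sketched as a standalone toy. The types here are simplified stand-ins for illustration, not the real nushell definitions:

```rust
// Simplified sketch of the ExpansionRule builder pattern. Field names
// mirror the diff above; the struct is an illustration only.
#[derive(Debug, Clone, Copy, PartialEq)]
enum WhitespaceHandling {
    AllowWhitespace,
    RejectWhitespace,
}

#[derive(Debug)]
struct ExpansionRule {
    allow_operator: bool,
    allow_eof: bool,
    whitespace: WhitespaceHandling,
}

impl ExpansionRule {
    // Strict default: reject everything optional.
    fn new() -> ExpansionRule {
        ExpansionRule {
            allow_operator: false,
            allow_eof: false,
            whitespace: WhitespaceHandling::RejectWhitespace,
        }
    }

    // Permissive mode aims to produce an atom for every input token,
    // which is what error-correcting coloring needs.
    fn permissive() -> ExpansionRule {
        ExpansionRule {
            allow_operator: true,
            allow_eof: true,
            whitespace: WhitespaceHandling::AllowWhitespace,
        }
    }

    // Each builder method consumes and returns self, so calls chain.
    fn allow_operator(mut self) -> ExpansionRule {
        self.allow_operator = true;
        self
    }
}

fn main() {
    let rule = ExpansionRule::new().allow_operator();
    assert!(rule.allow_operator);
    assert!(!rule.allow_eof);
    let perm = ExpansionRule::permissive();
    assert_eq!(perm.whitespace, WhitespaceHandling::AllowWhitespace);
}
```

Because each method takes `mut self` by value, a rule can be customized in a single expression without a separate mutable binding.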
/// If the caller of expand_atom throws away the returned atomic token, it
/// must use a checkpoint to roll it back.
pub fn expand_atom<'me, 'content>(
token_nodes: &'me mut TokensIterator<'content>,
expected: &'static str,
context: &ExpandContext,
rule: ExpansionRule,
) -> Result<TaggedAtomicToken<'content>, ShellError> {
if token_nodes.at_end() {
match rule.allow_eof {
true => {
return Ok(AtomicToken::Eof {
tag: Tag::unknown(),
}
.tagged_unknown())
}
false => return Err(ShellError::unexpected_eof("anything", Tag::unknown())),
}
}
// First, we'll need to handle the situation where more than one token corresponds
// to a single atomic token
// If treat_size_as_word, don't try to parse the head of the token stream
// as a size.
match rule.treat_size_as_word {
true => {}
false => match expand_syntax(&UnitShape, token_nodes, context) {
// If the head of the stream isn't a valid unit, we'll try to parse
// it again next as a word
Err(_) => {}
// But if it was a valid unit, we're done here
Ok(Tagged {
item: (number, unit),
tag,
}) => return Ok(AtomicToken::Size { number, unit }.tagged(tag)),
},
}
// Try to parse the head of the stream as a bare path. A bare path includes
// words as well as `.`s, connected together without whitespace.
match expand_syntax(&BarePathShape, token_nodes, context) {
// If we didn't find a bare path
Err(_) => {}
Ok(tag) => {
let next = token_nodes.peek_any();
match next.node {
Some(token) if token.is_pattern() => {
// if the very next token is a pattern, we're looking at a glob, not a
// word, and we should try to parse it as a glob next
}
_ => return Ok(AtomicToken::Word { text: tag }.tagged(tag)),
}
}
}
// Try to parse the head of the stream as a pattern. A pattern includes
// words, words with `*` as well as `.`s, connected together without whitespace.
match expand_syntax(&BarePatternShape, token_nodes, context) {
// If we didn't find a pattern
Err(_) => {}
Ok(tag) => return Ok(AtomicToken::GlobPattern { pattern: tag }.tagged(tag)),
}
// The next token corresponds to at most one atomic token
// We need to `peek` because `parse_single_node` doesn't cover all of the
// cases that `expand_atom` covers. We should probably collapse the two
// if possible.
let peeked = token_nodes.peek_any().not_eof(expected)?;
match peeked.node {
TokenNode::Token(_) => {
// handle this next
}
TokenNode::Error(error) => {
peeked.commit();
return Ok(AtomicToken::Error {
error: error.clone(),
}
.tagged(error.tag));
}
// [ ... ]
TokenNode::Delimited(Tagged {
item:
DelimitedNode {
delimiter: Delimiter::Square,
tags,
children,
},
tag,
}) => {
peeked.commit();
return Ok(AtomicToken::SquareDelimited {
nodes: children,
tags: *tags,
}
.tagged(tag));
}
TokenNode::Flag(Tagged {
item:
Flag {
kind: FlagKind::Shorthand,
name,
},
tag,
}) => {
peeked.commit();
return Ok(AtomicToken::ShorthandFlag { name: *name }.tagged(tag));
}
TokenNode::Flag(Tagged {
item:
Flag {
kind: FlagKind::Longhand,
name,
},
tag,
}) => {
peeked.commit();
return Ok(AtomicToken::LonghandFlag { name: *name }.tagged(tag));
}
// If we see whitespace, process the whitespace according to the whitespace
// handling rules
TokenNode::Whitespace(tag) => match rule.whitespace {
// if whitespace is allowed, return a whitespace token
WhitespaceHandling::AllowWhitespace => {
peeked.commit();
return Ok(AtomicToken::Whitespace { text: *tag }.tagged(tag));
}
// if whitespace is disallowed, return an error
WhitespaceHandling::RejectWhitespace => {
return Err(ShellError::syntax_error(
"Unexpected whitespace".tagged(tag),
))
}
},
other => {
let tag = peeked.node.tag();
peeked.commit();
return Ok(AtomicToken::Error {
error: ShellError::type_error("token", other.tagged_type_name()).tagged(tag),
}
.tagged(tag));
}
}
parse_single_node(token_nodes, expected, |token, token_tag, err| {
Ok(match token {
// First, the error cases. Each error case corresponds to an expansion rule
// flag that can be used to allow the case
// rule.allow_operator
RawToken::Operator(_) if !rule.allow_operator => return Err(err.error()),
// rule.allow_external_command
RawToken::ExternalCommand(_) if !rule.allow_external_command => {
return Err(ShellError::type_error(
expected,
token.type_name().tagged(token_tag),
))
}
// rule.allow_external_word
RawToken::ExternalWord if !rule.allow_external_word => {
return Err(ShellError::invalid_external_word(token_tag))
}
RawToken::Number(number) => AtomicToken::Number { number }.tagged(token_tag),
RawToken::Operator(_) => AtomicToken::Operator { text: token_tag }.tagged(token_tag),
RawToken::String(body) => AtomicToken::String { body }.tagged(token_tag),
RawToken::Variable(name) if name.slice(context.source) == "it" => {
AtomicToken::ItVariable { name }.tagged(token_tag)
}
RawToken::Variable(name) => AtomicToken::Variable { name }.tagged(token_tag),
RawToken::ExternalCommand(command) => {
AtomicToken::ExternalCommand { command }.tagged(token_tag)
}
RawToken::ExternalWord => {
AtomicToken::ExternalWord { text: token_tag }.tagged(token_tag)
}
RawToken::GlobPattern => {
AtomicToken::GlobPattern { pattern: token_tag }.tagged(token_tag)
}
RawToken::Bare => AtomicToken::Word { text: token_tag }.tagged(token_tag),
})
})
}


@@ -1,38 +1,49 @@
use crate::parser::hir::syntax_shape::{expand_syntax, ExpandContext, ExpressionListShape};
use crate::parser::{hir, hir::TokensIterator};
use crate::parser::{DelimitedNode, Delimiter};
use crate::parser::hir::syntax_shape::{
color_syntax, expand_syntax, ColorSyntax, ExpandContext, ExpressionListShape, TokenNode,
};
use crate::parser::{hir, hir::TokensIterator, Delimiter, FlatShape};
use crate::prelude::*;
pub fn expand_delimited_expr(
delimited: &Tagged<DelimitedNode>,
pub fn expand_delimited_square(
children: &Vec<TokenNode>,
tag: Tag,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
match &delimited.item {
DelimitedNode {
delimiter: Delimiter::Square,
children,
} => {
let mut tokens = TokensIterator::new(&children, delimited.tag, false);
let mut tokens = TokensIterator::new(&children, tag, false);
let list = expand_syntax(&ExpressionListShape, &mut tokens, context);
Ok(hir::Expression::list(list?, delimited.tag))
}
Ok(hir::Expression::list(list?, tag))
}
DelimitedNode {
delimiter: Delimiter::Paren,
..
} => Err(ShellError::type_error(
"expression",
"unimplemented call expression".tagged(delimited.tag),
)),
pub fn color_delimited_square(
(open, close): (Tag, Tag),
children: &Vec<TokenNode>,
tag: Tag,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) {
shapes.push(FlatShape::OpenDelimiter(Delimiter::Square).tagged(open));
let mut tokens = TokensIterator::new(&children, tag, false);
let _list = color_syntax(&ExpressionListShape, &mut tokens, context, shapes);
shapes.push(FlatShape::CloseDelimiter(Delimiter::Square).tagged(close));
}
DelimitedNode {
delimiter: Delimiter::Brace,
..
} => Err(ShellError::type_error(
"expression",
"unimplemented block expression".tagged(delimited.tag),
)),
#[derive(Debug, Copy, Clone)]
pub struct DelimitedShape;
impl ColorSyntax for DelimitedShape {
type Info = ();
type Input = (Delimiter, Tag, Tag);
fn color_syntax<'a, 'b>(
&self,
(delimiter, open, close): &(Delimiter, Tag, Tag),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Self::Info {
shapes.push(FlatShape::OpenDelimiter(*delimiter).tagged(open));
color_syntax(&ExpressionListShape, token_nodes, context, shapes);
shapes.push(FlatShape::CloseDelimiter(*delimiter).tagged(close));
}
}


@@ -1,59 +1,71 @@
use crate::parser::hir::syntax_shape::expression::atom::{expand_atom, AtomicToken, ExpansionRule};
use crate::parser::hir::syntax_shape::{
expand_syntax, expression::expand_file_path, parse_single_node, BarePathShape, ExpandContext,
ExpandExpression,
expression::expand_file_path, ExpandContext, ExpandExpression, FallibleColorSyntax, FlatShape,
};
use crate::parser::{hir, hir::TokensIterator, RawToken};
use crate::parser::{hir, hir::TokensIterator};
use crate::prelude::*;
#[derive(Debug, Copy, Clone)]
pub struct FilePathShape;
impl FallibleColorSyntax for FilePathShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let atom = expand_atom(
token_nodes,
"file path",
context,
ExpansionRule::permissive(),
);
let atom = match atom {
Err(_) => return Ok(()),
Ok(atom) => atom,
};
match atom.item {
AtomicToken::Word { .. }
| AtomicToken::String { .. }
| AtomicToken::Number { .. }
| AtomicToken::Size { .. } => {
shapes.push(FlatShape::Path.tagged(atom.tag));
}
_ => atom.color_tokens(shapes),
}
Ok(())
}
}
impl ExpandExpression for FilePathShape {
fn expand_expr<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
let bare = expand_syntax(&BarePathShape, token_nodes, context);
let atom = expand_atom(token_nodes, "file path", context, ExpansionRule::new())?;
match bare {
Ok(tag) => {
let string = tag.slice(context.source);
let path = expand_file_path(string, context);
return Ok(hir::Expression::file_path(path, tag));
}
Err(_) => {}
match atom.item {
AtomicToken::Word { text: body } | AtomicToken::String { body } => {
let path = expand_file_path(body.slice(context.source), context);
return Ok(hir::Expression::file_path(path, atom.tag));
}
parse_single_node(token_nodes, "Path", |token, token_tag| {
Ok(match token {
RawToken::GlobPattern => {
return Err(ShellError::type_error(
"Path",
"glob pattern".tagged(token_tag),
))
AtomicToken::Number { .. } | AtomicToken::Size { .. } => {
let path = atom.tag.slice(context.source);
return Ok(hir::Expression::file_path(path, atom.tag));
}
RawToken::Operator(..) => {
return Err(ShellError::type_error("Path", "operator".tagged(token_tag)))
}
RawToken::Variable(tag) if tag.slice(context.source) == "it" => {
hir::Expression::it_variable(tag, token_tag)
}
RawToken::Variable(tag) => hir::Expression::variable(tag, token_tag),
RawToken::ExternalCommand(tag) => hir::Expression::external_command(tag, token_tag),
RawToken::ExternalWord => return Err(ShellError::invalid_external_word(token_tag)),
RawToken::Number(_) => hir::Expression::bare(token_tag),
RawToken::Size(_, _) => hir::Expression::bare(token_tag),
RawToken::Bare => hir::Expression::file_path(
expand_file_path(token_tag.slice(context.source), context),
token_tag,
),
RawToken::String(tag) => hir::Expression::file_path(
expand_file_path(tag.slice(context.source), context),
token_tag,
),
})
})
_ => return atom.into_hir(context, "file path"),
}
}
}


@@ -2,10 +2,14 @@ use crate::errors::ShellError;
use crate::parser::{
hir,
hir::syntax_shape::{
expand_expr, maybe_spaced, spaced, AnyExpressionShape, ExpandContext, ExpandSyntax,
color_fallible_syntax, color_syntax, expand_atom, expand_expr, maybe_spaced, spaced,
AnyExpressionShape, ColorSyntax, ExpandContext, ExpandSyntax, ExpansionRule,
MaybeSpaceShape, SpaceShape,
},
hir::{debug_tokens, TokensIterator},
hir::TokensIterator,
FlatShape,
};
use crate::Tagged;
#[derive(Debug, Copy, Clone)]
pub struct ExpressionListShape;
@@ -28,8 +32,6 @@ impl ExpandSyntax for ExpressionListShape {
exprs.push(expr);
println!("{:?}", debug_tokens(token_nodes, context.source));
loop {
if token_nodes.at_end_possible_ws() {
return Ok(exprs);
@@ -41,3 +43,134 @@ impl ExpandSyntax for ExpressionListShape {
}
}
}
impl ColorSyntax for ExpressionListShape {
type Info = ();
type Input = ();
/// The intent of this method is to fully color an expression list shape infallibly.
/// This means that if we can't expand a token into an expression, we fall back to
/// a simpler coloring strategy.
///
/// This would apply to something like `where x >`, which includes an incomplete
/// binary operator. Since we will fail to process it as a binary operator, we'll
/// fall back to a simpler coloring and move on.
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) {
// We encountered a parsing error and will continue with simpler coloring ("backoff
// coloring mode")
let mut backoff = false;
// Consume any leading whitespace
color_syntax(&MaybeSpaceShape, token_nodes, context, shapes);
loop {
// If we reached the very end of the token stream, we're done
if token_nodes.at_end() {
return;
}
if backoff {
let len = shapes.len();
// If we previously encountered a parsing error, use backoff coloring mode
color_syntax(&SimplestExpression, token_nodes, context, shapes);
if len == shapes.len() && !token_nodes.at_end() {
// This should never happen, but if it does, a panic is better than an infinite loop
panic!("Unexpected tokens left that couldn't be colored even with SimplestExpression")
}
} else {
// Try to color the head of the stream as an expression
match color_fallible_syntax(&AnyExpressionShape, token_nodes, context, shapes) {
// If no expression was found, switch to backoff coloring mode
Err(_) => {
backoff = true;
continue;
}
Ok(_) => {}
}
// If an expression was found, consume a space
match color_fallible_syntax(&SpaceShape, token_nodes, context, shapes) {
Err(_) => {
// If no space was found, we're either at the end or there's an error.
// Either way, switch to backoff coloring mode. If we're at the end
// it won't have any consequences.
backoff = true;
}
Ok(_) => {
// Otherwise, move on to the next expression
}
}
}
}
}
}
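The backoff loop above can be modeled with a self-contained toy: try a "real" parse for each token and, after the first failure, switch permanently to an infallible per-token fallback so the loop always makes progress. The names and the one-token fallback are invented for illustration (the real `SimplestExpression` still colors atoms accurately rather than marking everything as an error):

```rust
// Toy model of "backoff coloring mode".
#[derive(Debug, PartialEq)]
enum Shape {
    Int,
    Error,
}

// Pretend "expression parser": only accepts integer tokens.
fn color_expression(token: &str, shapes: &mut Vec<Shape>) -> Result<(), ()> {
    if token.parse::<i64>().is_ok() {
        shapes.push(Shape::Int);
        Ok(())
    } else {
        Err(())
    }
}

// Infallible fallback: consumes exactly one token, always succeeds,
// so the outer loop can never get stuck.
fn color_simplest(_token: &str, shapes: &mut Vec<Shape>) {
    shapes.push(Shape::Error);
}

fn color_list(tokens: &[&str]) -> Vec<Shape> {
    let mut shapes = Vec::new();
    let mut backoff = false;
    for &token in tokens {
        if backoff {
            color_simplest(token, &mut shapes);
        } else if color_expression(token, &mut shapes).is_err() {
            // Parse failed: switch to backoff mode for this token
            // and everything after it.
            backoff = true;
            color_simplest(token, &mut shapes);
        }
    }
    shapes
}

fn main() {
    let shapes = color_list(&["1", "2", ">", "3"]);
    assert_eq!(shapes, vec![Shape::Int, Shape::Int, Shape::Error, Shape::Error]);
}
```

The key property is the one the panic in the real code enforces: every iteration either colors at least one token or terminates, so malformed input like `where cpu >` degrades to simpler coloring instead of hanging or aborting.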
/// BackoffColoringMode consumes all of the remaining tokens in an infallible way
#[derive(Debug, Copy, Clone)]
pub struct BackoffColoringMode;
impl ColorSyntax for BackoffColoringMode {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &Self::Input,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Self::Info {
loop {
if token_nodes.at_end() {
break;
}
let len = shapes.len();
color_syntax(&SimplestExpression, token_nodes, context, shapes);
if len == shapes.len() && !token_nodes.at_end() {
// This shouldn't happen, but if it does, a panic is better than an infinite loop
panic!("SimplestExpression failed to consume any tokens, but it's not at the end. This is unexpected\n== token nodes==\n{:#?}\n\n== shapes ==\n{:#?}", token_nodes, shapes);
}
}
}
}
/// The point of `SimplestExpression` is to serve as an infallible base case for coloring.
/// As a last ditch effort, if we can't find any way to parse the head of the stream as an
/// expression, fall back to simple coloring.
#[derive(Debug, Copy, Clone)]
pub struct SimplestExpression;
impl ColorSyntax for SimplestExpression {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) {
let atom = expand_atom(
token_nodes,
"any token",
context,
ExpansionRule::permissive(),
);
match atom {
Err(_) => {}
Ok(atom) => atom.color_tokens(shapes),
}
}
}


@@ -1,4 +1,7 @@
use crate::parser::hir::syntax_shape::{parse_single_node, ExpandContext, ExpandExpression};
use crate::parser::hir::syntax_shape::{
expand_atom, parse_single_node, ExpandContext, ExpandExpression, ExpansionRule,
FallibleColorSyntax, FlatShape,
};
use crate::parser::{
hir,
hir::{RawNumber, TokensIterator},
@@ -15,20 +18,9 @@ impl ExpandExpression for NumberShape {
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
parse_single_node(token_nodes, "Number", |token, token_tag| {
parse_single_node(token_nodes, "Number", |token, token_tag, err| {
Ok(match token {
RawToken::GlobPattern => {
return Err(ShellError::type_error(
"Number",
"glob pattern".to_string().tagged(token_tag),
))
}
RawToken::Operator(..) => {
return Err(ShellError::type_error(
"Number",
"operator".to_string().tagged(token_tag),
))
}
RawToken::GlobPattern | RawToken::Operator(..) => return Err(err.error()),
RawToken::Variable(tag) if tag.slice(context.source) == "it" => {
hir::Expression::it_variable(tag, token_tag)
}
@ -38,9 +30,6 @@ impl ExpandExpression for NumberShape {
RawToken::Number(number) => {
hir::Expression::number(number.to_number(context.source), token_tag)
}
RawToken::Size(number, unit) => {
hir::Expression::size(number.to_number(context.source), unit, token_tag)
}
RawToken::Bare => hir::Expression::bare(token_tag),
RawToken::String(tag) => hir::Expression::string(tag, token_tag),
})
@ -48,6 +37,35 @@ impl ExpandExpression for NumberShape {
}
}
impl FallibleColorSyntax for NumberShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let atom = token_nodes.spanned(|token_nodes| {
expand_atom(token_nodes, "number", context, ExpansionRule::permissive())
});
let atom = match atom {
Tagged { item: Err(_), tag } => {
shapes.push(FlatShape::Error.tagged(tag));
return Ok(());
}
Tagged { item: Ok(atom), .. } => atom,
};
atom.color_tokens(shapes);
Ok(())
}
}
#[derive(Debug, Copy, Clone)]
pub struct IntShape;
@ -57,41 +75,51 @@ impl ExpandExpression for IntShape {
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
parse_single_node(token_nodes, "Integer", |token, token_tag| {
parse_single_node(token_nodes, "Integer", |token, token_tag, err| {
Ok(match token {
RawToken::GlobPattern => {
return Err(ShellError::type_error(
"Integer",
"glob pattern".to_string().tagged(token_tag),
))
}
RawToken::Operator(..) => {
return Err(ShellError::type_error(
"Integer",
"operator".to_string().tagged(token_tag),
))
}
RawToken::GlobPattern | RawToken::Operator(..) => return Err(err.error()),
RawToken::ExternalWord => return Err(ShellError::invalid_external_word(token_tag)),
RawToken::Variable(tag) if tag.slice(context.source) == "it" => {
hir::Expression::it_variable(tag, token_tag)
}
RawToken::ExternalCommand(tag) => hir::Expression::external_command(tag, token_tag),
RawToken::ExternalWord => return Err(ShellError::invalid_external_word(token_tag)),
RawToken::Variable(tag) => hir::Expression::variable(tag, token_tag),
RawToken::Number(number @ RawNumber::Int(_)) => {
hir::Expression::number(number.to_number(context.source), token_tag)
}
token @ RawToken::Number(_) => {
return Err(ShellError::type_error(
"Integer",
token.type_name().tagged(token_tag),
));
}
RawToken::Size(number, unit) => {
hir::Expression::size(number.to_number(context.source), unit, token_tag)
}
RawToken::Number(_) => return Err(err.error()),
RawToken::Bare => hir::Expression::bare(token_tag),
RawToken::String(tag) => hir::Expression::string(tag, token_tag),
})
})
}
}
impl FallibleColorSyntax for IntShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let atom = token_nodes.spanned(|token_nodes| {
expand_atom(token_nodes, "integer", context, ExpansionRule::permissive())
});
let atom = match atom {
Tagged { item: Err(_), tag } => {
shapes.push(FlatShape::Error.tagged(tag));
return Ok(());
}
Tagged { item: Ok(atom), .. } => atom,
};
atom.color_tokens(shapes);
Ok(())
}
}

View File

@ -1,6 +1,7 @@
use crate::parser::hir::syntax_shape::{
expand_bare, expand_syntax, expression::expand_file_path, parse_single_node, ExpandContext,
ExpandExpression, ExpandSyntax,
expand_atom, expand_bare, expand_syntax, expression::expand_file_path, parse_single_node,
AtomicToken, ExpandContext, ExpandExpression, ExpandSyntax, ExpansionRule, FallibleColorSyntax,
FlatShape,
};
use crate::parser::{hir, hir::TokensIterator, Operator, RawToken, TokenNode};
use crate::prelude::*;
@ -8,6 +9,32 @@ use crate::prelude::*;
#[derive(Debug, Copy, Clone)]
pub struct PatternShape;
impl FallibleColorSyntax for PatternShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| {
let atom = expand_atom(token_nodes, "pattern", context, ExpansionRule::permissive())?;
match &atom.item {
AtomicToken::GlobPattern { .. } | AtomicToken::Word { .. } => {
shapes.push(FlatShape::GlobPattern.tagged(atom.tag));
Ok(())
}
_ => Err(ShellError::type_error("pattern", atom.tagged_type_name())),
}
})
}
}
impl ExpandExpression for PatternShape {
fn expand_expr<'a, 'b>(
&self,
@ -23,7 +50,7 @@ impl ExpandExpression for PatternShape {
Err(_) => {}
}
parse_single_node(token_nodes, "Pattern", |token, token_tag| {
parse_single_node(token_nodes, "Pattern", |token, token_tag, _| {
Ok(match token {
RawToken::GlobPattern => {
return Err(ShellError::unreachable(
@ -44,7 +71,6 @@ impl ExpandExpression for PatternShape {
RawToken::ExternalCommand(tag) => hir::Expression::external_command(tag, token_tag),
RawToken::ExternalWord => return Err(ShellError::invalid_external_word(token_tag)),
RawToken::Number(_) => hir::Expression::bare(token_tag),
RawToken::Size(_, _) => hir::Expression::bare(token_tag),
RawToken::String(tag) => hir::Expression::file_path(
expand_file_path(tag.slice(context.source), context),

View File

@ -1,5 +1,6 @@
use crate::parser::hir::syntax_shape::{
expand_variable, parse_single_node, ExpandContext, ExpandExpression, TestSyntax,
expand_atom, expand_variable, parse_single_node, AtomicToken, ExpandContext, ExpandExpression,
ExpansionRule, FallibleColorSyntax, FlatShape, TestSyntax,
};
use crate::parser::hir::tokens_iterator::Peeked;
use crate::parser::{hir, hir::TokensIterator, RawToken, TokenNode};
@ -8,13 +9,43 @@ use crate::prelude::*;
#[derive(Debug, Copy, Clone)]
pub struct StringShape;
impl FallibleColorSyntax for StringShape {
type Info = ();
type Input = FlatShape;
fn color_syntax<'a, 'b>(
&self,
input: &FlatShape,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let atom = expand_atom(token_nodes, "string", context, ExpansionRule::permissive());
let atom = match atom {
Err(_) => return Ok(()),
Ok(atom) => atom,
};
match atom {
Tagged {
item: AtomicToken::String { .. },
tag,
} => shapes.push((*input).tagged(tag)),
other => other.color_tokens(shapes),
}
Ok(())
}
}
impl ExpandExpression for StringShape {
fn expand_expr<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
parse_single_node(token_nodes, "String", |token, token_tag| {
parse_single_node(token_nodes, "String", |token, token_tag, _| {
Ok(match token {
RawToken::GlobPattern => {
return Err(ShellError::type_error(
@ -32,7 +63,6 @@ impl ExpandExpression for StringShape {
RawToken::ExternalCommand(tag) => hir::Expression::external_command(tag, token_tag),
RawToken::ExternalWord => return Err(ShellError::invalid_external_word(token_tag)),
RawToken::Number(_) => hir::Expression::bare(token_tag),
RawToken::Size(_, _) => hir::Expression::bare(token_tag),
RawToken::Bare => hir::Expression::bare(token_tag),
RawToken::String(tag) => hir::Expression::string(tag, token_tag),
})

View File

@ -1,7 +1,8 @@
use crate::parser::hir::syntax_shape::{ExpandContext, ExpandExpression};
use crate::data::meta::Span;
use crate::parser::hir::syntax_shape::{ExpandContext, ExpandSyntax};
use crate::parser::parse::tokens::RawNumber;
use crate::parser::parse::unit::Unit;
use crate::parser::{hir, hir::TokensIterator, RawToken, TokenNode};
use crate::parser::{hir::TokensIterator, RawToken, TokenNode};
use crate::prelude::*;
use nom::branch::alt;
use nom::bytes::complete::tag;
@ -12,12 +13,14 @@ use nom::IResult;
#[derive(Debug, Copy, Clone)]
pub struct UnitShape;
impl ExpandExpression for UnitShape {
fn expand_expr<'a, 'b>(
impl ExpandSyntax for UnitShape {
type Output = Tagged<(Tagged<RawNumber>, Tagged<Unit>)>;
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
) -> Result<Tagged<(Tagged<RawNumber>, Tagged<Unit>)>, ShellError> {
let peeked = token_nodes.peek_any().not_eof("unit")?;
let tag = match peeked.node {
@ -40,15 +43,12 @@ impl ExpandExpression for UnitShape {
Ok((number, unit)) => (number, unit),
};
Ok(hir::Expression::size(
number.to_number(context.source),
unit,
tag,
))
peeked.commit();
Ok((number, unit).tagged(tag))
}
}
fn unit_size(input: &str, bare_tag: Tag) -> IResult<&str, (Tagged<RawNumber>, Unit)> {
fn unit_size(input: &str, bare_tag: Tag) -> IResult<&str, (Tagged<RawNumber>, Tagged<Unit>)> {
let (input, digits) = digit1(input)?;
let (input, dot) = opt(tag("."))(input)?;
@ -85,5 +85,12 @@ fn unit_size(input: &str, bare_tag: Tag) -> IResult<&str, (Tagged<RawNumber>, Un
value(Unit::PB, alt((tag("PB"), tag("pb"), tag("Pb")))),
)))(input)?;
Ok((input, (number, unit)))
let start_span = number.tag.span.end();
let unit_tag = Tag::new(
bare_tag.anchor,
Span::from((start_span, bare_tag.span.end())),
);
Ok((input, (number, unit.tagged(unit_tag))))
}
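The tail of `unit_size` above derives the unit's tag from offsets it already knows: the unit starts where the number ends and runs to the end of the bare token. A minimal sketch of that span arithmetic, using a hypothetical `Span` stand-in rather than the crate's tag types:

```rust
/// Illustrative stand-in for the parser's span: byte offsets into the source.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Span {
    start: usize,
    end: usize,
}

/// Given the span of the whole bare token (e.g. `10mb`) and the span of its
/// numeric prefix, the unit's span runs from the end of the number to the
/// end of the bare token.
fn unit_span(bare: Span, number: Span) -> Span {
    Span {
        start: number.end,
        end: bare.end,
    }
}

fn main() {
    // "10mb": the number occupies 0..2, so the unit must occupy 2..4.
    let bare = Span { start: 0, end: 4 };
    let number = Span { start: 0, end: 2 };
    assert_eq!(unit_span(bare, number), Span { start: 2, end: 4 });
}
```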

View File

@ -1,6 +1,8 @@
use crate::parser::hir::syntax_shape::{
expand_expr, expand_syntax, parse_single_node, AnyExpressionShape, BareShape, ExpandContext,
ExpandExpression, ExpandSyntax, Peeked, SkipSyntax, StringShape, TestSyntax, WhitespaceShape,
color_fallible_syntax, color_fallible_syntax_with, expand_atom, expand_expr, expand_syntax,
parse_single_node, AnyExpressionShape, AtomicToken, BareShape, ExpandContext, ExpandExpression,
ExpandSyntax, ExpansionRule, FallibleColorSyntax, FlatShape, Peeked, SkipSyntax, StringShape,
TestSyntax, WhitespaceShape,
};
use crate::parser::{hir, hir::Expression, hir::TokensIterator, Operator, RawToken};
use crate::prelude::*;
@ -42,9 +44,81 @@ impl ExpandExpression for VariablePathShape {
}
}
impl FallibleColorSyntax for VariablePathShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| {
// If the head of the token stream is not a variable, fail
color_fallible_syntax(&VariableShape, token_nodes, context, shapes)?;
loop {
// look for a dot at the head of the stream
let dot = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
token_nodes,
context,
shapes,
);
// if there's no dot, we're done
match dot {
Err(_) => break,
Ok(_) => {}
}
// otherwise, look for a member, and if you don't find one, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
}
Ok(())
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct PathTailShape;
/// The failure mode of `PathTailShape` is a dot followed by a non-member
impl FallibleColorSyntax for PathTailShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| loop {
let result = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
token_nodes,
context,
shapes,
);
match result {
Err(_) => return Ok(()),
Ok(_) => {}
}
// If we've seen a dot but not a member, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
})
}
}
impl ExpandSyntax for PathTailShape {
type Output = (Vec<Tagged<String>>, Tag);
fn expand_syntax<'a, 'b>(
@ -121,6 +195,63 @@ impl ExpandSyntax for ExpressionContinuationShape {
}
}
pub enum ContinuationInfo {
Dot,
Infix,
}
impl FallibleColorSyntax for ExpressionContinuationShape {
type Info = ContinuationInfo;
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<ContinuationInfo, ShellError> {
token_nodes.atomic(|token_nodes| {
// Try to expand a `.`
let dot = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
token_nodes,
context,
shapes,
);
match dot {
Ok(_) => {
// we found a dot, so let's keep looking for a member; if no member was found, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
Ok(ContinuationInfo::Dot)
}
Err(_) => {
let mut new_shapes = vec![];
let result = token_nodes.atomic(|token_nodes| {
// we didn't find a dot, so let's see if we're looking at an infix. If not found, fail
color_fallible_syntax(&InfixShape, token_nodes, context, &mut new_shapes)?;
// now that we've seen an infix shape, look for any expression. If not found, fail
color_fallible_syntax(
&AnyExpressionShape,
token_nodes,
context,
&mut new_shapes,
)?;
Ok(ContinuationInfo::Infix)
})?;
shapes.extend(new_shapes);
Ok(result)
}
}
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct VariableShape;
@ -130,7 +261,7 @@ impl ExpandExpression for VariableShape {
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ShellError> {
parse_single_node(token_nodes, "variable", |token, token_tag| {
parse_single_node(token_nodes, "variable", |token, token_tag, _| {
Ok(match token {
RawToken::Variable(tag) => {
if tag.slice(context.source) == "it" {
@ -150,6 +281,43 @@ impl ExpandExpression for VariableShape {
}
}
impl FallibleColorSyntax for VariableShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let atom = expand_atom(
token_nodes,
"variable",
context,
ExpansionRule::permissive(),
);
let atom = match atom {
Err(err) => return Err(err),
Ok(atom) => atom,
};
match &atom.item {
AtomicToken::Variable { .. } => {
shapes.push(FlatShape::Variable.tagged(atom.tag));
Ok(())
}
AtomicToken::ItVariable { .. } => {
shapes.push(FlatShape::ItVariable.tagged(atom.tag));
Ok(())
}
_ => Err(ShellError::type_error("variable", atom.tagged_type_name())),
}
}
}
#[derive(Debug, Clone, Copy)]
pub enum Member {
String(/* outer */ Tag, /* inner */ Tag),
@ -272,6 +440,55 @@ pub fn expand_column_path<'a, 'b>(
#[derive(Debug, Copy, Clone)]
pub struct ColumnPathShape;
impl FallibleColorSyntax for ColumnPathShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
// If there's not even one member shape, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
loop {
let checkpoint = token_nodes.checkpoint();
match color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
checkpoint.iterator,
context,
shapes,
) {
Err(_) => {
// we already saw at least one member shape, so return successfully
return Ok(());
}
Ok(_) => {
match color_fallible_syntax(&MemberShape, checkpoint.iterator, context, shapes)
{
Err(_) => {
// we saw a dot but not a member (but we saw at least one member),
// so don't commit the dot but return successfully
return Ok(());
}
Ok(_) => {
// we saw a dot and a member, so commit it and continue on
checkpoint.commit();
}
}
}
}
}
}
}
impl ExpandSyntax for ColumnPathShape {
type Output = Tagged<Vec<Member>>;
@ -287,6 +504,43 @@ impl ExpandSyntax for ColumnPathShape {
#[derive(Debug, Copy, Clone)]
pub struct MemberShape;
impl FallibleColorSyntax for MemberShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let bare = color_fallible_syntax_with(
&BareShape,
&FlatShape::BareMember,
token_nodes,
context,
shapes,
);
match bare {
Ok(_) => return Ok(()),
Err(_) => {
// If we don't have a bare word, we'll look for a string
}
}
// Look for a string token. If we don't find one, fail
color_fallible_syntax_with(
&StringShape,
&FlatShape::StringMember,
token_nodes,
context,
shapes,
)
}
}
impl ExpandSyntax for MemberShape {
type Output = Member;
@ -317,6 +571,34 @@ impl ExpandSyntax for MemberShape {
#[derive(Debug, Copy, Clone)]
pub struct DotShape;
#[derive(Debug, Copy, Clone)]
pub struct ColorableDotShape;
impl FallibleColorSyntax for ColorableDotShape {
type Info = ();
type Input = FlatShape;
fn color_syntax<'a, 'b>(
&self,
input: &FlatShape,
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let peeked = token_nodes.peek_any().not_eof("dot")?;
match peeked.node {
node if node.is_dot() => {
peeked.commit();
shapes.push((*input).tagged(node.tag()));
Ok(())
}
other => Err(ShellError::type_error("dot", other.tagged_type_name())),
}
}
}
impl SkipSyntax for DotShape {
fn skip<'a, 'b>(
&self,
@ -337,7 +619,7 @@ impl ExpandSyntax for DotShape {
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
) -> Result<Self::Output, ShellError> {
parse_single_node(token_nodes, "dot", |token, token_tag| {
parse_single_node(token_nodes, "dot", |token, token_tag, _| {
Ok(match token {
RawToken::Operator(Operator::Dot) => token_tag,
_ => {
@ -354,6 +636,53 @@ impl ExpandSyntax for DotShape {
#[derive(Debug, Copy, Clone)]
pub struct InfixShape;
impl FallibleColorSyntax for InfixShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
outer_shapes: &mut Vec<Tagged<FlatShape>>,
) -> Result<(), ShellError> {
let checkpoint = token_nodes.checkpoint();
let mut shapes = vec![];
// An infix operator must be preceded by whitespace. If no whitespace was found, fail
color_fallible_syntax(&WhitespaceShape, checkpoint.iterator, context, &mut shapes)?;
// Parse the next TokenNode after the whitespace
parse_single_node(
checkpoint.iterator,
"infix operator",
|token, token_tag, _| {
match token {
// If it's an operator (and not `.`), it's a match
RawToken::Operator(operator) if operator != Operator::Dot => {
shapes.push(FlatShape::Operator.tagged(token_tag));
Ok(())
}
// Otherwise, it's not a match
_ => Err(ShellError::type_error(
"infix operator",
token.type_name().tagged(token_tag),
)),
}
},
)?;
// An infix operator must be followed by whitespace. If no whitespace was found, fail
color_fallible_syntax(&WhitespaceShape, checkpoint.iterator, context, &mut shapes)?;
outer_shapes.extend(shapes);
checkpoint.commit();
Ok(())
}
}
impl ExpandSyntax for InfixShape {
type Output = (Tag, Tagged<Operator>, Tag);
@ -368,8 +697,10 @@ impl ExpandSyntax for InfixShape {
let start = expand_syntax(&WhitespaceShape, checkpoint.iterator, context)?;
// Parse the next TokenNode after the whitespace
let operator =
parse_single_node(checkpoint.iterator, "infix operator", |token, token_tag| {
let operator = parse_single_node(
checkpoint.iterator,
"infix operator",
|token, token_tag, _| {
Ok(match token {
// If it's an operator (and not `.`), it's a match
RawToken::Operator(operator) if operator != Operator::Dot => {
@ -384,7 +715,8 @@ impl ExpandSyntax for InfixShape {
))
}
})
})?;
},
)?;
// An infix operator must be followed by whitespace
let end = expand_syntax(&WhitespaceShape, checkpoint.iterator, context)?;

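The `ColumnPathShape` colorer above requires one member, then consumes `.member` pairs in a loop, deliberately *not* committing a trailing dot that has no member after it. A simplified standalone sketch of that checkpoint-driven loop (illustrative types, not the crate's shapes or iterator):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum Shape {
    Member,
    Dot,
}

/// Color a column path like `a.b.c`: one required member, then any number of
/// `.member` pairs. A trailing dot with nothing after it is left unconsumed
/// (mirroring the uncommitted checkpoint), so `a.b.` still succeeds as
/// member, dot, member.
fn color_column_path(tokens: &[&str]) -> Option<Vec<Shape>> {
    let mut shapes = Vec::new();
    let mut iter = tokens.iter();
    // If there's not even one member, fail.
    match iter.next() {
        Some(tok) if *tok != "." => shapes.push(Shape::Member),
        _ => return None,
    }
    loop {
        // Checkpoint: only commit the dot if a member follows it.
        let mut lookahead = iter.clone();
        match (lookahead.next(), lookahead.next()) {
            (Some(&"."), Some(tok)) if tok != &"." => {
                iter = lookahead; // commit both tokens
                shapes.push(Shape::Dot);
                shapes.push(Shape::Member);
            }
            _ => return Some(shapes),
        }
    }
}

fn main() {
    use Shape::*;
    assert_eq!(
        color_column_path(&["a", ".", "b"]),
        Some(vec![Member, Dot, Member])
    );
    // Trailing dot: not committed, but the path still colors successfully.
    assert_eq!(color_column_path(&["a", "."]), Some(vec![Member]));
}
```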
View File

@ -0,0 +1,95 @@
use crate::parser::{Delimiter, Flag, FlagKind, Operator, RawNumber, RawToken, TokenNode};
use crate::{Tag, Tagged, TaggedItem, Text};
#[derive(Debug, Copy, Clone)]
pub enum FlatShape {
OpenDelimiter(Delimiter),
CloseDelimiter(Delimiter),
ItVariable,
Variable,
Operator,
Dot,
InternalCommand,
ExternalCommand,
ExternalWord,
BareMember,
StringMember,
String,
Path,
Word,
Pipe,
GlobPattern,
Flag,
ShorthandFlag,
Int,
Decimal,
Whitespace,
Error,
Size { number: Tag, unit: Tag },
}
impl FlatShape {
pub fn from(token: &TokenNode, source: &Text, shapes: &mut Vec<Tagged<FlatShape>>) {
match token {
TokenNode::Token(token) => match token.item {
RawToken::Number(RawNumber::Int(_)) => {
shapes.push(FlatShape::Int.tagged(token.tag))
}
RawToken::Number(RawNumber::Decimal(_)) => {
shapes.push(FlatShape::Decimal.tagged(token.tag))
}
RawToken::Operator(Operator::Dot) => shapes.push(FlatShape::Dot.tagged(token.tag)),
RawToken::Operator(_) => shapes.push(FlatShape::Operator.tagged(token.tag)),
RawToken::String(_) => shapes.push(FlatShape::String.tagged(token.tag)),
RawToken::Variable(v) if v.slice(source) == "it" => {
shapes.push(FlatShape::ItVariable.tagged(token.tag))
}
RawToken::Variable(_) => shapes.push(FlatShape::Variable.tagged(token.tag)),
RawToken::ExternalCommand(_) => {
shapes.push(FlatShape::ExternalCommand.tagged(token.tag))
}
RawToken::ExternalWord => shapes.push(FlatShape::ExternalWord.tagged(token.tag)),
RawToken::GlobPattern => shapes.push(FlatShape::GlobPattern.tagged(token.tag)),
RawToken::Bare => shapes.push(FlatShape::Word.tagged(token.tag)),
},
TokenNode::Call(_) => unimplemented!(),
TokenNode::Nodes(nodes) => {
for node in &nodes.item {
FlatShape::from(node, source, shapes);
}
}
TokenNode::Delimited(v) => {
shapes.push(FlatShape::OpenDelimiter(v.item.delimiter).tagged(v.item.tags.0));
for token in &v.item.children {
FlatShape::from(token, source, shapes);
}
shapes.push(FlatShape::CloseDelimiter(v.item.delimiter).tagged(v.item.tags.1));
}
TokenNode::Pipeline(pipeline) => {
for part in &pipeline.parts {
if let Some(_) = part.pipe {
shapes.push(FlatShape::Pipe.tagged(part.tag));
}
}
}
TokenNode::Flag(Tagged {
item:
Flag {
kind: FlagKind::Longhand,
..
},
tag,
}) => shapes.push(FlatShape::Flag.tagged(tag)),
TokenNode::Flag(Tagged {
item:
Flag {
kind: FlagKind::Shorthand,
..
},
tag,
}) => shapes.push(FlatShape::ShorthandFlag.tagged(tag)),
TokenNode::Whitespace(_) => shapes.push(FlatShape::Whitespace.tagged(token.tag())),
TokenNode::Error(v) => shapes.push(FlatShape::Error.tagged(v.tag)),
}
}
}
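`FlatShape::from` above flattens a token *tree* into one shape per visible token, recursing through delimited nodes and emitting their delimiters explicitly. That is what turns a single `[1 2 3]` node into seven shapes. A minimal sketch of the same recursion over hypothetical stand-in types:

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum Shape {
    OpenDelimiter,
    Int,
    Whitespace,
    CloseDelimiter,
}

// Illustrative stand-in for the parser's token tree.
enum Node {
    Int,
    Whitespace,
    Delimited(Vec<Node>),
}

/// Flatten a token tree into shapes, recursing into delimited nodes and
/// emitting their open/close delimiters around the children's shapes.
fn flatten(node: &Node, shapes: &mut Vec<Shape>) {
    match node {
        Node::Int => shapes.push(Shape::Int),
        Node::Whitespace => shapes.push(Shape::Whitespace),
        Node::Delimited(children) => {
            shapes.push(Shape::OpenDelimiter);
            for child in children {
                flatten(child, shapes);
            }
            shapes.push(Shape::CloseDelimiter);
        }
    }
}

fn main() {
    // `[1 2 3]`: open, int, ws, int, ws, int, close => 7 shapes.
    let list = Node::Delimited(vec![
        Node::Int,
        Node::Whitespace,
        Node::Int,
        Node::Whitespace,
        Node::Int,
    ]);
    let mut shapes = Vec::new();
    flatten(&list, &mut shapes);
    assert_eq!(shapes.len(), 7);
    assert_eq!(shapes[0], Shape::OpenDelimiter);
    assert_eq!(shapes[6], Shape::CloseDelimiter);
}
```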

View File

@ -3,16 +3,13 @@ pub(crate) mod debug;
use crate::errors::ShellError;
use crate::parser::TokenNode;
use crate::{Tag, Tagged, TaggedItem};
use derive_new::new;
#[derive(Debug, new)]
pub struct TokensIterator<'a> {
tokens: &'a [TokenNode],
#[derive(Debug)]
pub struct TokensIterator<'content> {
tokens: &'content [TokenNode],
tag: Tag,
skip_ws: bool,
#[new(default)]
index: usize,
#[new(default)]
seen: indexmap::IndexSet<usize>,
}
@ -124,11 +121,41 @@ pub fn peek_error(
}
impl<'content> TokensIterator<'content> {
#[cfg(test)]
pub fn new(items: &'content [TokenNode], tag: Tag, skip_ws: bool) -> TokensIterator<'content> {
TokensIterator {
tokens: items,
tag,
skip_ws,
index: 0,
seen: indexmap::IndexSet::new(),
}
}
pub fn anchor(&self) -> uuid::Uuid {
self.tag.anchor
}
pub fn all(tokens: &'content [TokenNode], tag: Tag) -> TokensIterator<'content> {
TokensIterator::new(tokens, tag, false)
}
pub fn len(&self) -> usize {
self.tokens.len()
}
pub fn spanned<T>(
&mut self,
block: impl FnOnce(&mut TokensIterator<'content>) -> T,
) -> Tagged<T> {
let start = self.tag_at_cursor();
let result = block(self);
let end = self.tag_at_cursor();
result.tagged(start.until(end))
}
/// Use a checkpoint when you need to peek more than one token ahead, but can't be sure
/// that you'll succeed.
pub fn checkpoint<'me>(&'me mut self) -> Checkpoint<'content, 'me> {
@ -143,8 +170,26 @@ impl<'content> TokensIterator<'content> {
}
}
pub fn anchor(&self) -> uuid::Uuid {
self.tag.anchor
/// Run a block against the iterator, committing the tokens it consumed only
/// if the block succeeds. On failure, the iterator rolls back to its prior state.
pub fn atomic<'me, T>(
&'me mut self,
block: impl FnOnce(&mut TokensIterator<'content>) -> Result<T, ShellError>,
) -> Result<T, ShellError> {
let index = self.index;
let seen = self.seen.clone();
let checkpoint = Checkpoint {
iterator: self,
index,
seen,
committed: false,
};
let value = block(checkpoint.iterator)?;
checkpoint.commit();
Ok(value)
}
fn eof_tag(&self) -> Tag {
@ -160,6 +205,15 @@ impl<'content> TokensIterator<'content> {
}
}
pub fn tag_at_cursor(&mut self) -> Tag {
let next = self.peek_any();
match next.node {
None => self.eof_tag(),
Some(node) => node.tag(),
}
}
pub fn remove(&mut self, position: usize) {
self.seen.insert(position);
}
@ -231,6 +285,26 @@ impl<'content> TokensIterator<'content> {
start_next(self, false)
}
// Peek the next token, including whitespace, but not EOF
pub fn peek_any_token<'me, T>(
&'me mut self,
block: impl FnOnce(&'content TokenNode) -> Result<T, ShellError>,
) -> Result<T, ShellError> {
let peeked = start_next(self, false);
let peeked = peeked.not_eof("invariant");
match peeked {
Err(err) => return Err(err),
Ok(peeked) => match block(peeked.node) {
Err(err) => return Err(err),
Ok(val) => {
peeked.commit();
return Ok(val);
}
},
}
}
fn commit(&mut self, from: usize, to: usize) {
for index in from..to {
self.seen.insert(index);
@ -239,6 +313,10 @@ impl<'content> TokensIterator<'content> {
self.index = to;
}
pub fn pos(&self, skip_ws: bool) -> Option<usize> {
peek_pos(self, skip_ws)
}
pub fn debug_remaining(&self) -> Vec<TokenNode> {
let mut tokens = self.clone();
tokens.restart();
@ -246,18 +324,18 @@ impl<'content> TokensIterator<'content> {
}
}
impl<'a> Iterator for TokensIterator<'a> {
type Item = &'a TokenNode;
impl<'content> Iterator for TokensIterator<'content> {
type Item = &'content TokenNode;
fn next(&mut self) -> Option<&'a TokenNode> {
fn next(&mut self) -> Option<&'content TokenNode> {
next(self, self.skip_ws)
}
}
fn peek<'content, 'me>(
iterator: &TokensIterator<'content>,
iterator: &'me TokensIterator<'content>,
skip_ws: bool,
) -> Option<&'content TokenNode> {
) -> Option<&'me TokenNode> {
let mut to = iterator.index;
loop {
@ -287,6 +365,37 @@ fn peek<'content, 'me>(
}
}
fn peek_pos<'content, 'me>(
iterator: &'me TokensIterator<'content>,
skip_ws: bool,
) -> Option<usize> {
let mut to = iterator.index;
loop {
if to >= iterator.tokens.len() {
return None;
}
if iterator.seen.contains(&to) {
to += 1;
continue;
}
if to >= iterator.tokens.len() {
return None;
}
let node = &iterator.tokens[to];
match node {
TokenNode::Whitespace(_) if skip_ws => {
to += 1;
}
_ => return Some(to),
}
}
}
fn start_next<'content, 'me>(
iterator: &'me mut TokensIterator<'content>,
skip_ws: bool,
@ -337,7 +446,10 @@ fn start_next<'content, 'me>(
}
}
fn next<'a>(iterator: &mut TokensIterator<'a>, skip_ws: bool) -> Option<&'a TokenNode> {
fn next<'me, 'content>(
iterator: &'me mut TokensIterator<'content>,
skip_ws: bool,
) -> Option<&'content TokenNode> {
loop {
if iterator.index >= iterator.tokens.len() {
return None;

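The `checkpoint`/`atomic` machinery above is what lets colorers speculatively consume tokens: consumption is committed on success and undone on failure. A minimal sketch of that pattern, with a hypothetical `Cursor` in place of the real iterator (which also snapshots its `seen` set, omitted here):

```rust
// Illustrative stand-in for the iterator's position.
struct Cursor {
    index: usize,
}

/// Save the cursor, run the block, and keep the advanced cursor only if the
/// block succeeds; on error, restore the saved position.
fn atomic<T, E>(
    cursor: &mut Cursor,
    block: impl FnOnce(&mut Cursor) -> Result<T, E>,
) -> Result<T, E> {
    let saved = cursor.index;
    match block(cursor) {
        Ok(value) => Ok(value), // commit: consumption sticks
        Err(err) => {
            cursor.index = saved; // roll back on failure
            Err(err)
        }
    }
}

fn main() {
    let mut cursor = Cursor { index: 0 };
    // A failing block consumes tokens, but the failure rolls them back.
    let result: Result<(), &str> = atomic(&mut cursor, |c| {
        c.index += 3;
        Err("no match")
    });
    assert!(result.is_err());
    assert_eq!(cursor.index, 0);

    // A successful block commits its consumption.
    let ok: Result<(), &str> = atomic(&mut cursor, |c| {
        c.index += 2;
        Ok(())
    });
    assert!(ok.is_ok());
    assert_eq!(cursor.index, 2);
}
```

This is why `ExpressionContinuationShape` above can try `.member`, fail, and then try an infix continuation without leaving half-consumed tokens behind.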
View File

@ -1,4 +1,5 @@
use crate::Tag;
use crate::parser::hir::syntax_shape::flat_shape::FlatShape;
use crate::{Tag, Tagged, TaggedItem};
use derive_new::new;
use getset::Getters;
use serde::{Deserialize, Serialize};
@ -12,6 +13,15 @@ pub enum FlagKind {
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Getters, new)]
#[get = "pub(crate)"]
pub struct Flag {
kind: FlagKind,
name: Tag,
pub(crate) kind: FlagKind,
pub(crate) name: Tag,
}
impl Tagged<Flag> {
pub fn color(&self) -> Tagged<FlatShape> {
match self.item.kind {
FlagKind::Longhand => FlatShape::Flag.tagged(self.tag),
FlagKind::Shorthand => FlatShape::ShorthandFlag.tagged(self.tag),
}
}
}

View File

@ -189,7 +189,7 @@ pub fn raw_number(input: NomSpan) -> IResult<NomSpan, Tagged<RawNumber>> {
match input.fragment.chars().next() {
None => return Ok((input, RawNumber::int((start, input.offset, input.extra)))),
Some('.') => (),
Some(other) if other.is_whitespace() => {
other if is_boundary(other) => {
return Ok((input, RawNumber::int((start, input.offset, input.extra))))
}
_ => {
@ -215,16 +215,14 @@ pub fn raw_number(input: NomSpan) -> IResult<NomSpan, Tagged<RawNumber>> {
let next = input.fragment.chars().next();
if let Some(next) = next {
if !next.is_whitespace() {
return Err(nom::Err::Error(nom::error::make_error(
if is_boundary(next) {
Ok((input, RawNumber::decimal((start, end, input.extra))))
} else {
Err(nom::Err::Error(nom::error::make_error(
input,
nom::error::ErrorKind::Tag,
)));
)))
}
}
Ok((input, RawNumber::decimal((start, end, input.extra))))
}
#[tracable_parser]
@ -476,11 +474,14 @@ pub fn whitespace(input: NomSpan) -> IResult<NomSpan, TokenNode> {
))
}
pub fn delimited(input: NomSpan, delimiter: Delimiter) -> IResult<NomSpan, Tagged<Vec<TokenNode>>> {
pub fn delimited(
input: NomSpan,
delimiter: Delimiter,
) -> IResult<NomSpan, (Tag, Tag, Tagged<Vec<TokenNode>>)> {
let left = input.offset;
let (input, _) = char(delimiter.open())(input)?;
let (input, open_tag) = tag(delimiter.open())(input)?;
let (input, inner_items) = opt(spaced_token_list)(input)?;
let (input, _) = char(delimiter.close())(input)?;
let (input, close_tag) = tag(delimiter.close())(input)?;
let right = input.offset;
let mut items = vec![];
@ -489,36 +490,43 @@ pub fn delimited(input: NomSpan, delimiter: Delimiter) -> IResult<NomSpan, Tagge
items.extend(inner_items.item);
}
Ok((input, items.tagged((left, right, input.extra.origin))))
Ok((
input,
(
Tag::from(open_tag),
Tag::from(close_tag),
items.tagged((left, right, input.extra.origin)),
),
))
}
#[tracable_parser]
pub fn delimited_paren(input: NomSpan) -> IResult<NomSpan, TokenNode> {
let (input, tokens) = delimited(input, Delimiter::Paren)?;
let (input, (left, right, tokens)) = delimited(input, Delimiter::Paren)?;
Ok((
input,
TokenTreeBuilder::tagged_parens(tokens.item, tokens.tag),
TokenTreeBuilder::tagged_parens(tokens.item, (left, right), tokens.tag),
))
}
#[tracable_parser]
pub fn delimited_square(input: NomSpan) -> IResult<NomSpan, TokenNode> {
let (input, tokens) = delimited(input, Delimiter::Square)?;
let (input, (left, right, tokens)) = delimited(input, Delimiter::Square)?;
Ok((
input,
TokenTreeBuilder::tagged_square(tokens.item, tokens.tag),
TokenTreeBuilder::tagged_square(tokens.item, (left, right), tokens.tag),
))
}
#[tracable_parser]
pub fn delimited_brace(input: NomSpan) -> IResult<NomSpan, TokenNode> {
let (input, tokens) = delimited(input, Delimiter::Brace)?;
let (input, (left, right, tokens)) = delimited(input, Delimiter::Brace)?;
Ok((
input,
TokenTreeBuilder::tagged_brace(tokens.item, tokens.tag),
TokenTreeBuilder::tagged_brace(tokens.item, (left, right), tokens.tag),
))
}
@ -1246,7 +1254,10 @@ mod tests {
left: usize,
right: usize,
) -> TokenNode {
let node = DelimitedNode::new(*delimiter, children);
let start = Tag::for_char(left, delimiter.tag.anchor);
let end = Tag::for_char(right, delimiter.tag.anchor);
let node = DelimitedNode::new(delimiter.item, (start, end), children);
let spanned = node.tagged((left, right, delimiter.tag.anchor));
TokenNode::Delimited(spanned)
}

View File

@ -1,5 +1,5 @@
use crate::errors::ShellError;
use crate::parser::parse::{call_node::*, flag::*, pipeline::*, tokens::*};
use crate::parser::parse::{call_node::*, flag::*, operator::*, pipeline::*, tokens::*};
use crate::prelude::*;
use crate::traits::ToDebug;
use crate::{Tag, Tagged, Text};
@ -17,10 +17,9 @@ pub enum TokenNode {
Delimited(Tagged<DelimitedNode>),
Pipeline(Tagged<Pipeline>),
Flag(Tagged<Flag>),
Member(Tag),
Whitespace(Tag),
Error(Tagged<Box<ShellError>>),
Error(Tagged<ShellError>),
}
impl ToDebug for TokenNode {
@ -78,7 +77,7 @@ impl fmt::Debug for DebugTokenNode<'_> {
)
}
TokenNode::Pipeline(pipeline) => write!(f, "{}", pipeline.debug(self.source)),
TokenNode::Error(s) => write!(f, "<error> for {:?}", s.tag().slice(self.source)),
TokenNode::Error(_) => write!(f, "<error>"),
rest => write!(f, "{}", rest.tag().slice(self.source)),
}
}
@ -99,9 +98,8 @@ impl TokenNode {
TokenNode::Delimited(s) => s.tag(),
TokenNode::Pipeline(s) => s.tag(),
TokenNode::Flag(s) => s.tag(),
TokenNode::Member(s) => *s,
TokenNode::Whitespace(s) => *s,
TokenNode::Error(s) => s.tag(),
TokenNode::Error(s) => return s.tag,
}
}
@ -113,7 +111,6 @@ impl TokenNode {
TokenNode::Delimited(d) => d.type_name(),
TokenNode::Pipeline(_) => "pipeline",
TokenNode::Flag(_) => "flag",
TokenNode::Member(_) => "member",
TokenNode::Whitespace(_) => "whitespace",
TokenNode::Error(_) => "error",
}
@@ -155,16 +152,37 @@ impl TokenNode {
}
}
pub fn as_block(&self) -> Option<Tagged<&[TokenNode]>> {
pub fn is_pattern(&self) -> bool {
match self {
TokenNode::Token(Tagged {
item: RawToken::GlobPattern,
..
}) => true,
_ => false,
}
}
pub fn is_dot(&self) -> bool {
match self {
TokenNode::Token(Tagged {
item: RawToken::Operator(Operator::Dot),
..
}) => true,
_ => false,
}
}
pub fn as_block(&self) -> Option<(Tagged<&[TokenNode]>, (Tag, Tag))> {
match self {
TokenNode::Delimited(Tagged {
item:
DelimitedNode {
delimiter,
children,
tags,
},
tag,
}) if *delimiter == Delimiter::Brace => Some((&children[..]).tagged(tag)),
}) if *delimiter == Delimiter::Brace => Some(((&children[..]).tagged(tag), *tags)),
_ => None,
}
}
@@ -203,7 +221,7 @@ impl TokenNode {
pub fn as_pipeline(&self) -> Result<Pipeline, ShellError> {
match self {
TokenNode::Pipeline(Tagged { item, .. }) => Ok(item.clone()),
_ => Err(ShellError::string("unimplemented")),
_ => Err(ShellError::unimplemented("unimplemented")),
}
}
@@ -259,6 +277,7 @@ impl TokenNode {
#[get = "pub(crate)"]
pub struct DelimitedNode {
pub(crate) delimiter: Delimiter,
pub(crate) tags: (Tag, Tag),
pub(crate) children: Vec<TokenNode>,
}
@@ -280,19 +299,19 @@ pub enum Delimiter {
}
impl Delimiter {
pub(crate) fn open(&self) -> char {
pub(crate) fn open(&self) -> &'static str {
match self {
Delimiter::Paren => '(',
Delimiter::Brace => '{',
Delimiter::Square => '[',
Delimiter::Paren => "(",
Delimiter::Brace => "{",
Delimiter::Square => "[",
}
}
pub(crate) fn close(&self) -> char {
pub(crate) fn close(&self) -> &'static str {
match self {
Delimiter::Paren => ')',
Delimiter::Brace => '}',
Delimiter::Square => ']',
Delimiter::Paren => ")",
Delimiter::Brace => "}",
Delimiter::Square => "]",
}
}
}
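The hunk above changes `Delimiter::open()`/`close()` from `char` to `&'static str`. A minimal, self-contained sketch of the updated API (assumption: the enum is shown here without the surrounding parser machinery); returning string slices lets callers splice delimiters into output without per-call `char`-to-string conversion:

```rust
// Simplified stand-in for the parser's Delimiter type after this commit.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Delimiter {
    Paren,
    Brace,
    Square,
}

impl Delimiter {
    // Opening delimiter as a &'static str (was `char` before the change).
    pub fn open(&self) -> &'static str {
        match self {
            Delimiter::Paren => "(",
            Delimiter::Brace => "{",
            Delimiter::Square => "[",
        }
    }

    // Closing delimiter as a &'static str.
    pub fn close(&self) -> &'static str {
        match self {
            Delimiter::Paren => ")",
            Delimiter::Brace => "}",
            Delimiter::Square => "]",
        }
    }
}

fn main() {
    assert_eq!(Delimiter::Square.open(), "[");
    assert_eq!(Delimiter::Square.close(), "]");
    // String slices concatenate directly, e.g. when rebuilding source text.
    let wrapped = format!("{}1 2 3{}", Delimiter::Square.open(), Delimiter::Square.close());
    assert_eq!(wrapped, "[1 2 3]");
}
```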

@@ -5,7 +5,6 @@ use crate::parser::parse::operator::Operator;
use crate::parser::parse::pipeline::{Pipeline, PipelineElement};
use crate::parser::parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
use crate::parser::parse::tokens::{RawNumber, RawToken};
use crate::parser::parse::unit::Unit;
use crate::parser::CallNode;
use derive_new::new;
use uuid::Uuid;
@@ -227,31 +226,6 @@ impl TokenTreeBuilder {
TokenNode::Token(RawToken::Number(input.into()).tagged(tag.into()))
}
pub fn size(int: impl Into<i64>, unit: impl Into<Unit>) -> CurriedToken {
let int = int.into();
let unit = unit.into();
Box::new(move |b| {
let (start_int, end_int) = b.consume(&int.to_string());
let (_, end_unit) = b.consume(unit.as_str());
b.pos = end_unit;
TokenTreeBuilder::tagged_size(
(RawNumber::Int((start_int, end_int, b.anchor).into()), unit),
(start_int, end_unit, b.anchor),
)
})
}
pub fn tagged_size(
input: (impl Into<RawNumber>, impl Into<Unit>),
tag: impl Into<Tag>,
) -> TokenNode {
let (int, unit) = (input.0.into(), input.1.into());
TokenNode::Token(RawToken::Size(int, unit).tagged(tag.into()))
}
pub fn var(input: impl Into<String>) -> CurriedToken {
let input = input.into();
@@ -297,19 +271,6 @@ impl TokenTreeBuilder {
TokenNode::Flag(Flag::new(FlagKind::Shorthand, input.into()).tagged(tag.into()))
}
pub fn member(input: impl Into<String>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
let (start, end) = b.consume(&input);
TokenTreeBuilder::tagged_member((start, end, b.anchor))
})
}
pub fn tagged_member(tag: impl Into<Tag>) -> TokenNode {
TokenNode::Member(tag.into())
}
pub fn call(head: CurriedToken, input: Vec<CurriedToken>) -> CurriedCall {
Box::new(move |b| {
let start = b.pos;
@@ -340,58 +301,79 @@ impl TokenTreeBuilder {
CallNode::new(Box::new(head), tail).tagged(tag.into())
}
pub fn parens(input: Vec<CurriedToken>) -> CurriedToken {
Box::new(move |b| {
let (start, _) = b.consume("(");
fn consume_delimiter(
&mut self,
input: Vec<CurriedToken>,
_open: &str,
_close: &str,
) -> (Tag, Tag, Tag, Vec<TokenNode>) {
let (start_open_paren, end_open_paren) = self.consume("(");
let mut output = vec![];
for item in input {
output.push(item(b));
output.push(item(self));
}
let (_, end) = b.consume(")");
let (start_close_paren, end_close_paren) = self.consume(")");
TokenTreeBuilder::tagged_parens(output, (start, end, b.anchor))
let open = Tag::from((start_open_paren, end_open_paren, self.anchor));
let close = Tag::from((start_close_paren, end_close_paren, self.anchor));
let whole = Tag::from((start_open_paren, end_close_paren, self.anchor));
(open, close, whole, output)
}
pub fn parens(input: Vec<CurriedToken>) -> CurriedToken {
Box::new(move |b| {
let (open, close, whole, output) = b.consume_delimiter(input, "(", ")");
TokenTreeBuilder::tagged_parens(output, (open, close), whole)
})
}
pub fn tagged_parens(input: impl Into<Vec<TokenNode>>, tag: impl Into<Tag>) -> TokenNode {
TokenNode::Delimited(DelimitedNode::new(Delimiter::Paren, input.into()).tagged(tag.into()))
pub fn tagged_parens(
input: impl Into<Vec<TokenNode>>,
tags: (Tag, Tag),
tag: impl Into<Tag>,
) -> TokenNode {
TokenNode::Delimited(
DelimitedNode::new(Delimiter::Paren, tags, input.into()).tagged(tag.into()),
)
}
pub fn square(input: Vec<CurriedToken>) -> CurriedToken {
Box::new(move |b| {
let (start, _) = b.consume("[");
let mut output = vec![];
for item in input {
output.push(item(b));
}
let (open, close, whole, tokens) = b.consume_delimiter(input, "[", "]");
let (_, end) = b.consume("]");
TokenTreeBuilder::tagged_square(output, (start, end, b.anchor))
TokenTreeBuilder::tagged_square(tokens, (open, close), whole)
})
}
pub fn tagged_square(input: impl Into<Vec<TokenNode>>, tag: impl Into<Tag>) -> TokenNode {
TokenNode::Delimited(DelimitedNode::new(Delimiter::Square, input.into()).tagged(tag.into()))
pub fn tagged_square(
input: impl Into<Vec<TokenNode>>,
tags: (Tag, Tag),
tag: impl Into<Tag>,
) -> TokenNode {
TokenNode::Delimited(
DelimitedNode::new(Delimiter::Square, tags, input.into()).tagged(tag.into()),
)
}
pub fn braced(input: Vec<CurriedToken>) -> CurriedToken {
Box::new(move |b| {
let (start, _) = b.consume("{ ");
let mut output = vec![];
for item in input {
output.push(item(b));
}
let (open, close, whole, tokens) = b.consume_delimiter(input, "{", "}");
let (_, end) = b.consume(" }");
TokenTreeBuilder::tagged_brace(output, (start, end, b.anchor))
TokenTreeBuilder::tagged_brace(tokens, (open, close), whole)
})
}
pub fn tagged_brace(input: impl Into<Vec<TokenNode>>, tag: impl Into<Tag>) -> TokenNode {
TokenNode::Delimited(DelimitedNode::new(Delimiter::Brace, input.into()).tagged(tag.into()))
pub fn tagged_brace(
input: impl Into<Vec<TokenNode>>,
tags: (Tag, Tag),
tag: impl Into<Tag>,
) -> TokenNode {
TokenNode::Delimited(
DelimitedNode::new(Delimiter::Brace, tags, input.into()).tagged(tag.into()),
)
}
pub fn sp() -> CurriedToken {

@@ -1,4 +1,3 @@
use crate::parser::parse::unit::*;
use crate::parser::Operator;
use crate::prelude::*;
use crate::{Tagged, Text};
@@ -9,7 +8,6 @@ use std::str::FromStr;
pub enum RawToken {
Number(RawNumber),
Operator(Operator),
Size(RawNumber, Unit),
String(Tag),
Variable(Tag),
ExternalCommand(Tag),
@@ -18,6 +16,21 @@ pub enum RawToken {
Bare,
}
impl RawToken {
pub fn type_name(&self) -> &'static str {
match self {
RawToken::Number(_) => "Number",
RawToken::Operator(..) => "operator",
RawToken::String(_) => "String",
RawToken::Variable(_) => "variable",
RawToken::ExternalCommand(_) => "external command",
RawToken::ExternalWord => "external word",
RawToken::GlobPattern => "glob pattern",
RawToken::Bare => "String",
}
}
}
#[derive(Debug, Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Hash)]
pub enum RawNumber {
Int(Tag),
@@ -47,22 +60,6 @@ impl RawNumber {
}
}
impl RawToken {
pub fn type_name(&self) -> &'static str {
match self {
RawToken::Number(_) => "Number",
RawToken::Operator(..) => "operator",
RawToken::Size(..) => "Size",
RawToken::String(_) => "String",
RawToken::Variable(_) => "variable",
RawToken::ExternalCommand(_) => "external command",
RawToken::ExternalWord => "external word",
RawToken::GlobPattern => "glob pattern",
RawToken::Bare => "String",
}
}
}
pub type Token = Tagged<RawToken>;
impl Token {
@@ -72,6 +69,76 @@ impl Token {
source,
}
}
pub fn extract_number(&self) -> Option<Tagged<RawNumber>> {
match self.item {
RawToken::Number(number) => Some((number).tagged(self.tag)),
_ => None,
}
}
pub fn extract_int(&self) -> Option<(Tag, Tag)> {
match self.item {
RawToken::Number(RawNumber::Int(int)) => Some((int, self.tag)),
_ => None,
}
}
pub fn extract_decimal(&self) -> Option<(Tag, Tag)> {
match self.item {
RawToken::Number(RawNumber::Decimal(decimal)) => Some((decimal, self.tag)),
_ => None,
}
}
pub fn extract_operator(&self) -> Option<Tagged<Operator>> {
match self.item {
RawToken::Operator(operator) => Some(operator.tagged(self.tag)),
_ => None,
}
}
pub fn extract_string(&self) -> Option<(Tag, Tag)> {
match self.item {
RawToken::String(tag) => Some((tag, self.tag)),
_ => None,
}
}
pub fn extract_variable(&self) -> Option<(Tag, Tag)> {
match self.item {
RawToken::Variable(tag) => Some((tag, self.tag)),
_ => None,
}
}
pub fn extract_external_command(&self) -> Option<(Tag, Tag)> {
match self.item {
RawToken::ExternalCommand(tag) => Some((tag, self.tag)),
_ => None,
}
}
pub fn extract_external_word(&self) -> Option<Tag> {
match self.item {
RawToken::ExternalWord => Some(self.tag),
_ => None,
}
}
pub fn extract_glob_pattern(&self) -> Option<Tag> {
match self.item {
RawToken::GlobPattern => Some(self.tag),
_ => None,
}
}
pub fn extract_bare(&self) -> Option<Tag> {
match self.item {
RawToken::Bare => Some(self.tag),
_ => None,
}
}
}
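The `extract_*` accessors added above all follow one pattern: match a single `RawToken` variant and return `Some(..)` (carrying the relevant tags) only for that variant, so callers can probe a token without writing a full `match`. A pared-down sketch of the pattern (assumption: `RawToken`, `Token`, and the `u32` "tags" here are simplified stand-ins for the real parser types):

```rust
// Stand-in for the parser's RawToken; the u32 stands in for an inner `Tag`.
#[derive(Debug, Clone, Copy)]
enum RawToken {
    Bare,
    Variable(u32),
}

// Stand-in for Tagged<RawToken>; `tag` stands in for the outer `Tag`.
#[derive(Debug, Clone, Copy)]
struct Token {
    item: RawToken,
    tag: u32,
}

impl Token {
    // Mirrors `extract_bare`: yields the token's own tag only for Bare.
    fn extract_bare(&self) -> Option<u32> {
        match self.item {
            RawToken::Bare => Some(self.tag),
            _ => None,
        }
    }

    // Mirrors `extract_variable`: yields the (inner, outer) tag pair.
    fn extract_variable(&self) -> Option<(u32, u32)> {
        match self.item {
            RawToken::Variable(inner) => Some((inner, self.tag)),
            _ => None,
        }
    }
}

fn main() {
    let bare = Token { item: RawToken::Bare, tag: 3 };
    let var = Token { item: RawToken::Variable(7), tag: 9 };
    assert_eq!(bare.extract_bare(), Some(3));
    assert_eq!(var.extract_variable(), Some((7, 9)));
    assert_eq!(var.extract_bare(), None);
}
```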
pub struct DebugToken<'a> {

@@ -1,5 +1,8 @@
use crate::errors::{ArgumentError, ShellError};
use crate::parser::hir::syntax_shape::{expand_expr, spaced};
use crate::parser::hir::syntax_shape::{
color_fallible_syntax, color_syntax, expand_expr, flat_shape::FlatShape, spaced,
BackoffColoringMode, ColorSyntax, MaybeSpaceShape,
};
use crate::parser::registry::{NamedType, PositionalType, Signature};
use crate::parser::TokensIterator;
use crate::parser::{
@@ -153,6 +156,232 @@ pub fn parse_command_tail(
Ok(Some((positional, named)))
}
#[derive(Debug)]
struct ColoringArgs {
vec: Vec<Option<Vec<Tagged<FlatShape>>>>,
}
impl ColoringArgs {
fn new(len: usize) -> ColoringArgs {
let vec = vec![None; len];
ColoringArgs { vec }
}
fn insert(&mut self, pos: usize, shapes: Vec<Tagged<FlatShape>>) {
self.vec[pos] = Some(shapes);
}
fn spread_shapes(self, shapes: &mut Vec<Tagged<FlatShape>>) {
for item in self.vec {
match item {
None => {}
Some(vec) => {
shapes.extend(vec);
}
}
}
}
}
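`ColoringArgs` exists because named and positional arguments are discovered out of source order: each match records its shapes at the token position where it was found, and `spread_shapes` replays them in position order at the end. A minimal sketch of that bookkeeping (assumption: `&'static str` stands in for `Tagged<FlatShape>`):

```rust
// Stand-in for Tagged<FlatShape>.
type Shape = &'static str;

struct ColoringArgs {
    // One slot per token position; None means no argument matched there.
    vec: Vec<Option<Vec<Shape>>>,
}

impl ColoringArgs {
    fn new(len: usize) -> ColoringArgs {
        ColoringArgs { vec: vec![None; len] }
    }

    // Record the shapes produced by the argument found at token `pos`.
    fn insert(&mut self, pos: usize, shapes: Vec<Shape>) {
        self.vec[pos] = Some(shapes);
    }

    // Flatten the recorded shapes back out in token-position order.
    fn spread_shapes(self, shapes: &mut Vec<Shape>) {
        for item in self.vec {
            if let Some(vec) = item {
                shapes.extend(vec);
            }
        }
    }
}

fn main() {
    let mut args = ColoringArgs::new(3);
    // A flag matched late in the scan, an int matched earlier in source order.
    args.insert(2, vec!["flag"]);
    args.insert(0, vec!["int", "whitespace"]);
    let mut out = vec![];
    args.spread_shapes(&mut out);
    assert_eq!(out, vec!["int", "whitespace", "flag"]);
}
```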
#[derive(Debug, Copy, Clone)]
pub struct CommandTailShape;
impl ColorSyntax for CommandTailShape {
type Info = ();
type Input = Signature;
fn color_syntax<'a, 'b>(
&self,
signature: &Signature,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Tagged<FlatShape>>,
) -> Self::Info {
let mut args = ColoringArgs::new(token_nodes.len());
trace_remaining("nodes", token_nodes.clone(), context.source());
for (name, kind) in &signature.named {
trace!(target: "nu::color_syntax", "looking for {} : {:?}", name, kind);
match kind {
NamedType::Switch => {
match token_nodes.extract(|t| t.as_flag(name, context.source())) {
Some((pos, flag)) => args.insert(pos, vec![flag.color()]),
None => {}
}
}
NamedType::Mandatory(syntax_type) => {
match extract_mandatory(
signature,
name,
token_nodes,
context.source(),
Tag::unknown(),
) {
Err(_) => {
// The mandatory flag didn't exist at all, so there's nothing to color
}
Ok((pos, flag)) => {
let mut shapes = vec![flag.color()];
token_nodes.move_to(pos);
if token_nodes.at_end() {
args.insert(pos, shapes);
token_nodes.restart();
continue;
}
// We can live with unmatched syntax after a mandatory flag
let _ = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If the part after a mandatory flag isn't present, that's ok, but we
// should roll back any whitespace we chomped
color_fallible_syntax(
syntax_type,
token_nodes,
context,
&mut shapes,
)
});
args.insert(pos, shapes);
token_nodes.restart();
}
}
}
NamedType::Optional(syntax_type) => {
match extract_optional(name, token_nodes, context.source()) {
Err(_) => {
// The optional flag didn't exist at all, so there's nothing to color
}
Ok(Some((pos, flag))) => {
let mut shapes = vec![flag.color()];
token_nodes.move_to(pos);
if token_nodes.at_end() {
args.insert(pos, shapes);
token_nodes.restart();
continue;
}
// We can live with unmatched syntax after an optional flag
let _ = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If the part after an optional flag isn't present, that's ok, but we
// should roll back any whitespace we chomped
color_fallible_syntax(
syntax_type,
token_nodes,
context,
&mut shapes,
)
});
args.insert(pos, shapes);
token_nodes.restart();
}
Ok(None) => {
token_nodes.restart();
}
}
}
};
}
trace_remaining("after named", token_nodes.clone(), context.source());
for arg in &signature.positional {
trace!("Processing positional {:?}", arg);
match arg {
PositionalType::Mandatory(..) => {
if token_nodes.at_end() {
break;
}
}
PositionalType::Optional(..) => {
if token_nodes.at_end() {
break;
}
}
}
let mut shapes = vec![];
let pos = token_nodes.pos(false);
match pos {
None => break,
Some(pos) => {
// We can live with an unmatched positional argument. Hopefully it will be
// matched by a future token
let _ = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If no match, we should roll back any whitespace we chomped
color_fallible_syntax(
&arg.syntax_type(),
token_nodes,
context,
&mut shapes,
)?;
args.insert(pos, shapes);
Ok(())
});
}
}
}
trace_remaining("after positional", token_nodes.clone(), context.source());
if let Some(syntax_type) = signature.rest_positional {
loop {
if token_nodes.at_end_possible_ws() {
break;
}
let pos = token_nodes.pos(false);
match pos {
None => break,
Some(pos) => {
let mut shapes = vec![];
// If any arguments don't match, we'll fall back to backoff coloring mode
let result = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If no match, we should roll back any whitespace we chomped
color_fallible_syntax(&syntax_type, token_nodes, context, &mut shapes)?;
args.insert(pos, shapes);
Ok(())
});
match result {
Err(_) => break,
Ok(_) => continue,
}
}
}
}
}
args.spread_shapes(shapes);
// Consume any remaining tokens with backoff coloring mode
color_syntax(&BackoffColoringMode, token_nodes, context, shapes);
shapes.sort_by(|a, b| a.tag.span.start().cmp(&b.tag.span.start()));
}
}
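The shape of the `ColorSyntax` contract implemented above: unlike an expansion trait that consumes N tokens and returns one expression, a color shape consumes N tokens and appends M output shapes. A hedged, self-contained sketch of that fan-out (assumption: all types here are simplified stand-ins for the real `TokensIterator`, `ExpandContext`, and `FlatShape`):

```rust
// Stand-in for the parser's FlatShape.
#[derive(Debug, PartialEq)]
enum FlatShape {
    Int,
    Whitespace,
}

// Stripped-down analogue of the ColorSyntax trait: consume tokens, emit shapes.
trait ColorSyntax {
    type Info;
    fn color_syntax(&self, tokens: &mut Vec<i64>, shapes: &mut Vec<FlatShape>) -> Self::Info;
}

struct IntsShape;

impl ColorSyntax for IntsShape {
    type Info = ();
    fn color_syntax(&self, tokens: &mut Vec<i64>, shapes: &mut Vec<FlatShape>) -> Self::Info {
        // One run of input tokens fans out into more output shapes:
        // every int is followed by whitespace, except the last.
        while tokens.pop().is_some() {
            shapes.push(FlatShape::Int);
            if !tokens.is_empty() {
                shapes.push(FlatShape::Whitespace);
            }
        }
    }
}

fn main() {
    let mut tokens = vec![1, 2, 3];
    let mut shapes = vec![];
    IntsShape.color_syntax(&mut tokens, &mut shapes);
    // Three input tokens produced five shapes: int, ws, int, ws, int.
    assert_eq!(shapes.len(), 5);
    assert_eq!(shapes[0], FlatShape::Int);
    assert_eq!(shapes[1], FlatShape::Whitespace);
}
```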
fn extract_switch(name: &str, tokens: &mut hir::TokensIterator<'_>, source: &Text) -> Option<Flag> {
tokens
.extract(|t| t.as_flag(name, source))
@@ -200,6 +429,7 @@ fn extract_optional(
pub fn trace_remaining(desc: &'static str, tail: hir::TokensIterator<'_>, source: &Text) {
trace!(
target: "nu::expand_args",
"{} = {:?}",
desc,
itertools::join(

@@ -32,7 +32,7 @@ pub fn serve_plugin(plugin: &mut dyn Plugin) {
let input = match input {
Some(arg) => std::fs::read_to_string(arg),
None => {
send_response(ShellError::string(format!("No input given.")));
send_response(ShellError::untagged_runtime_error("No input given."));
return;
}
};
@@ -64,7 +64,7 @@ pub fn serve_plugin(plugin: &mut dyn Plugin) {
return;
}
e => {
send_response(ShellError::string(format!(
send_response(ShellError::untagged_runtime_error(format!(
"Could not handle plugin message: {} {:?}",
input, e
)));
@@ -102,7 +102,7 @@ pub fn serve_plugin(plugin: &mut dyn Plugin) {
break;
}
e => {
send_response(ShellError::string(format!(
send_response(ShellError::untagged_runtime_error(format!(
"Could not handle plugin message: {} {:?}",
input, e
)));
@@ -111,7 +111,7 @@ pub fn serve_plugin(plugin: &mut dyn Plugin) {
}
}
e => {
send_response(ShellError::string(format!(
send_response(ShellError::untagged_runtime_error(format!(
"Could not handle plugin message: {:?}",
e,
)));

@@ -1,7 +1,7 @@
use itertools::Itertools;
use nu::{
serve_plugin, CallInfo, Plugin, ReturnSuccess, ReturnValue, ShellError, Signature, SyntaxShape,
Tagged, Value,
Tagged, TaggedItem, Value,
};
pub type ColumnPath = Vec<Tagged<String>>;
@@ -25,21 +25,27 @@ impl Add {
Some(f) => match obj.insert_data_at_column_path(value_tag, &f, v) {
Some(v) => return Ok(v),
None => {
return Err(ShellError::string(format!(
return Err(ShellError::labeled_error(
format!(
"add could not find place to insert field {:?} {}",
obj,
f.iter().map(|i| &i.item).join(".")
)))
),
"column name",
value_tag,
))
}
},
None => Err(ShellError::string(
None => Err(ShellError::labeled_error(
"add needs a column name when adding a value to a table",
"column name",
value_tag,
)),
},
x => Err(ShellError::string(format!(
"Unrecognized type in stream: {:?}",
x
))),
(value, _) => Err(ShellError::type_error(
"row",
value.type_name().tagged(value_tag),
)),
}
}
}
@@ -64,12 +70,7 @@ impl Plugin for Add {
self.field = Some(table.as_column_path()?.item);
}
_ => {
return Err(ShellError::string(format!(
"Unrecognized type in params: {:?}",
args[0]
)))
}
value => return Err(ShellError::type_error("table", value.tagged_type_name())),
}
match &args[1] {
Tagged { item: v, .. } => {

@@ -3,7 +3,7 @@ use nu::{
Tagged, Value,
};
pub type ColumnPath = Vec<Tagged<String>>;
pub type ColumnPath = Tagged<Vec<Tagged<String>>>;
struct Edit {
field: Option<ColumnPath>,
@@ -24,19 +24,22 @@ impl Edit {
Some(f) => match obj.replace_data_at_column_path(value_tag, &f, v) {
Some(v) => return Ok(v),
None => {
return Err(ShellError::string(
return Err(ShellError::labeled_error(
"edit could not find place to insert column",
"column name",
f.tag,
))
}
},
None => Err(ShellError::string(
None => Err(ShellError::untagged_runtime_error(
"edit needs a column when changing a value in a table",
)),
},
x => Err(ShellError::string(format!(
"Unrecognized type in stream: {:?}",
x
))),
_ => Err(ShellError::labeled_error(
"Unrecognized type in stream",
"original value",
value_tag,
)),
}
}
}
@@ -57,14 +60,9 @@ impl Plugin for Edit {
item: Value::Table(_),
..
} => {
self.field = Some(table.as_column_path()?.item);
}
_ => {
return Err(ShellError::string(format!(
"Unrecognized type in params: {:?}",
args[0]
)))
self.field = Some(table.as_column_path()?);
}
value => return Err(ShellError::type_error("table", value.tagged_type_name())),
}
match &args[1] {
Tagged { item: v, .. } => {

@@ -25,8 +25,10 @@ impl Embed {
});
Ok(())
}
None => Err(ShellError::string(
None => Err(ShellError::labeled_error(
"embed needs a field when embedding a value",
"original value",
value.tag,
)),
},
}
@@ -52,12 +54,7 @@ impl Plugin for Embed {
self.field = Some(s.clone());
self.values = Vec::new();
}
_ => {
return Err(ShellError::string(format!(
"Unrecognized type in params: {:?}",
args[0]
)))
}
value => return Err(ShellError::type_error("string", value.tagged_type_name())),
}
}

@@ -14,7 +14,7 @@ pub enum SemVerAction {
Patch,
}
pub type ColumnPath = Vec<Tagged<String>>;
pub type ColumnPath = Tagged<Vec<Tagged<String>>>;
struct Inc {
field: Option<ColumnPath>,
@@ -90,7 +90,11 @@ impl Inc {
let replacement = match value.item.get_data_by_column_path(value.tag(), f) {
Some(result) => self.inc(result.map(|x| x.clone()))?,
None => {
return Err(ShellError::string("inc could not find field to replace"))
return Err(ShellError::labeled_error(
"inc could not find field to replace",
"column name",
f.tag,
))
}
};
match value.item.replace_data_at_column_path(
@@ -100,18 +104,22 @@
) {
Some(v) => return Ok(v),
None => {
return Err(ShellError::string("inc could not find field to replace"))
return Err(ShellError::labeled_error(
"inc could not find field to replace",
"column name",
f.tag,
))
}
}
}
None => Err(ShellError::string(
None => Err(ShellError::untagged_runtime_error(
"inc needs a field when incrementing a column in a table",
)),
},
x => Err(ShellError::string(format!(
"Unrecognized type in stream: {:?}",
x
))),
_ => Err(ShellError::type_error(
"incrementable value",
value.tagged_type_name(),
)),
}
}
}
@@ -145,14 +153,9 @@ impl Plugin for Inc {
item: Value::Table(_),
..
} => {
self.field = Some(table.as_column_path()?.item);
}
_ => {
return Err(ShellError::string(format!(
"Unrecognized type in params: {:?}",
arg
)))
self.field = Some(table.as_column_path()?);
}
value => return Err(ShellError::type_error("table", value.tagged_type_name())),
}
}
}
@@ -163,7 +166,11 @@ impl Plugin for Inc {
match &self.error {
Some(reason) => {
return Err(ShellError::string(format!("{}: {}", reason, Inc::usage())))
return Err(ShellError::untagged_runtime_error(format!(
"{}: {}",
reason,
Inc::usage()
)))
}
None => Ok(vec![]),
}
@@ -308,7 +315,7 @@ mod tests {
assert_eq!(
plugin
.field
.map(|f| f.into_iter().map(|f| f.item).collect()),
.map(|f| f.iter().map(|f| f.item.clone()).collect()),
Some(vec!["package".to_string(), "version".to_string()])
);
}

@@ -35,11 +35,12 @@ impl Plugin for Match {
} => {
self.column = s.clone();
}
_ => {
return Err(ShellError::string(format!(
"Unrecognized type in params: {:?}",
args[0]
)));
Tagged { tag, .. } => {
return Err(ShellError::labeled_error(
"Unrecognized type in params",
"value",
tag,
));
}
}
match &args[1] {
@@ -49,11 +50,12 @@
} => {
self.regex = Regex::new(s).unwrap();
}
_ => {
return Err(ShellError::string(format!(
"Unrecognized type in params: {:?}",
args[1]
)));
Tagged { tag, .. } => {
return Err(ShellError::labeled_error(
"Unrecognized type in params",
"value",
tag,
));
}
}
}
@@ -65,7 +67,7 @@ impl Plugin for Match {
match &input {
Tagged {
item: Value::Row(dict),
..
tag,
} => {
if let Some(val) = dict.entries.get(&self.column) {
match val {
@@ -75,22 +77,20 @@
} => {
flag = self.regex.is_match(s);
}
_ => {
return Err(ShellError::string(format!(
"value is not a string! {:?}",
&val
)));
Tagged { tag, .. } => {
return Err(ShellError::labeled_error("expected string", "value", tag));
}
}
} else {
return Err(ShellError::string(format!(
"column not in row! {:?} {:?}",
&self.column, dict
)));
return Err(ShellError::labeled_error(
format!("column not in row! {:?} {:?}", &self.column, dict),
"row",
tag,
));
}
}
_ => {
return Err(ShellError::string(format!("Not a row! {:?}", &input)));
Tagged { tag, .. } => {
return Err(ShellError::labeled_error("Expected row", "value", tag));
}
}
if flag {

@@ -105,20 +105,24 @@ impl Str {
) {
Some(v) => return Ok(v),
None => {
return Err(ShellError::string("str could not find field to replace"))
return Err(ShellError::type_error(
"column name",
value.tagged_type_name(),
))
}
}
}
None => Err(ShellError::string(format!(
None => Err(ShellError::untagged_runtime_error(format!(
"{}: {}",
"str needs a column when applied to a value in a row",
Str::usage()
))),
},
x => Err(ShellError::string(format!(
"Unrecognized type in stream: {:?}",
x
))),
_ => Err(ShellError::labeled_error(
"Unrecognized type in stream",
value.type_name(),
value.tag,
)),
}
}
}
@@ -167,10 +171,11 @@ impl Plugin for Str {
self.field = Some(table.as_column_path()?.item);
}
_ => {
return Err(ShellError::string(format!(
"Unrecognized type in params: {:?}",
possible_field
)))
return Err(ShellError::labeled_error(
"Unrecognized type in params",
possible_field.type_name(),
possible_field.tag,
))
}
}
}
@@ -187,7 +192,11 @@ impl Plugin for Str {
match &self.error {
Some(reason) => {
return Err(ShellError::string(format!("{}: {}", reason, Str::usage())))
return Err(ShellError::untagged_runtime_error(format!(
"{}: {}",
reason,
Str::usage()
)))
}
None => Ok(vec![]),
}

@@ -28,9 +28,11 @@ impl Sum {
self.total = Some(value.clone());
Ok(())
}
_ => Err(ShellError::string(format!(
"Could not sum non-integer or unrelated types"
))),
_ => Err(ShellError::labeled_error(
"Could not sum non-integer or unrelated types",
"source",
value.tag,
)),
}
}
Value::Primitive(Primitive::Bytes(b)) => {
@@ -47,15 +49,18 @@ impl Sum {
self.total = Some(value);
Ok(())
}
_ => Err(ShellError::string(format!(
"Could not sum non-integer or unrelated types"
))),
_ => Err(ShellError::labeled_error(
"Could not sum non-integer or unrelated types",
"source",
value.tag,
)),
}
}
x => Err(ShellError::string(format!(
"Unrecognized type in stream: {:?}",
x
))),
x => Err(ShellError::labeled_error(
format!("Unrecognized type in stream: {:?}", x),
"source",
value.tag,
)),
}
}
}

@@ -1,3 +1,13 @@
#[macro_export]
macro_rules! return_err {
($expr:expr) => {
match $expr {
Err(_) => return,
Ok(expr) => expr,
};
};
}
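The new `return_err!` macro early-returns from a `()`-returning function on `Err`, and otherwise yields the `Ok` value. A small usage sketch (assumption: `run`, its `Result` input, and the output vector are hypothetical illustrations, not code from this commit):

```rust
// Copied from the commit: bail out of a ()-returning fn on Err.
macro_rules! return_err {
    ($expr:expr) => {
        match $expr {
            Err(_) => return,
            Ok(expr) => expr,
        };
    };
}

// Hypothetical caller: pushes the value on Ok, silently returns on Err.
fn run(input: Result<i32, &str>, out: &mut Vec<i32>) {
    return_err!(input.map(|v| out.push(v)));
}

fn main() {
    let mut out = vec![];
    run(Ok(1), &mut out);
    run(Err("boom"), &mut out); // early return; nothing pushed
    assert_eq!(out, vec![1]);
}
```

Note the macro only suits functions returning `()`, since the `Err` arm is a bare `return`.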
#[macro_export]
macro_rules! stream {
($($expr:expr),*) => {{

@@ -145,7 +145,7 @@ impl Shell for FilesystemShell {
source.tag(),
));
} else {
return Err(ShellError::string("Invalid pattern."));
return Err(ShellError::untagged_runtime_error("Invalid pattern."));
}
}
};

@@ -1,11 +1,11 @@
use crate::context::Context;
use crate::parser::hir::syntax_shape::{color_fallible_syntax, FlatShape, PipelineShape};
use crate::parser::hir::TokensIterator;
use crate::parser::nom_input;
use crate::parser::parse::token_tree::TokenNode;
use crate::parser::parse::tokens::RawToken;
use crate::parser::{Pipeline, PipelineElement};
use crate::shell::shell_manager::ShellManager;
use crate::Tagged;
use crate::{Tag, Tagged, TaggedItem, Text};
use ansi_term::Color;
use log::trace;
use rustyline::completion::Completer;
use rustyline::error::ReadlineError;
use rustyline::highlight::Highlighter;
@@ -13,12 +13,12 @@ use rustyline::hint::Hinter;
use std::borrow::Cow::{self, Owned};
pub(crate) struct Helper {
helper: ShellManager,
context: Context,
}
impl Helper {
pub(crate) fn new(helper: ShellManager) -> Helper {
Helper { helper }
pub(crate) fn new(context: Context) -> Helper {
Helper { context }
}
}
@@ -30,7 +30,7 @@ impl Completer for Helper {
pos: usize,
ctx: &rustyline::Context<'_>,
) -> Result<(usize, Vec<rustyline::completion::Pair>), ReadlineError> {
self.helper.complete(line, pos, ctx)
self.context.shell_manager.complete(line, pos, ctx)
}
}
@@ -53,7 +53,7 @@ impl Completer for Helper {
impl Hinter for Helper {
fn hint(&self, line: &str, pos: usize, ctx: &rustyline::Context<'_>) -> Option<String> {
self.helper.hint(line, pos, ctx)
self.context.shell_manager.hint(line, pos, ctx)
}
}
@@ -78,20 +78,42 @@ impl Highlighter for Helper {
Ok(v) => v,
};
let Pipeline { parts } = pipeline;
let mut iter = parts.into_iter();
let tokens = vec![TokenNode::Pipeline(pipeline.clone().tagged(v.tag()))];
let mut tokens = TokensIterator::all(&tokens[..], v.tag());
loop {
match iter.next() {
None => {
return Cow::Owned(out);
}
Some(token) => {
let styled = paint_pipeline_element(&token, line);
out.push_str(&styled.to_string());
}
}
let text = Text::from(line);
let expand_context = self
.context
.expand_context(&text, Tag::from((0, line.len() - 1, uuid::Uuid::nil())));
let mut shapes = vec![];
// We just constructed a token list that only contains a pipeline, so it can't fail
color_fallible_syntax(&PipelineShape, &mut tokens, &expand_context, &mut shapes)
.unwrap();
trace!(target: "nu::shapes",
"SHAPES :: {:?}",
shapes.iter().map(|shape| shape.item).collect::<Vec<_>>()
);
for shape in shapes {
let styled = paint_flat_shape(shape, line);
out.push_str(&styled);
}
Cow::Owned(out)
// loop {
// match iter.next() {
// None => {
// return Cow::Owned(out);
// }
// Some(token) => {
// let styled = paint_pipeline_element(&token, line);
// out.push_str(&styled.to_string());
// }
// }
// }
}
}
}
@@ -101,80 +123,55 @@
}
}
fn paint_token_node(token_node: &TokenNode, line: &str) -> String {
let styled = match token_node {
TokenNode::Call(..) => Color::Cyan.bold().paint(token_node.tag().slice(line)),
TokenNode::Nodes(..) => Color::Green.bold().paint(token_node.tag().slice(line)),
TokenNode::Whitespace(..) => Color::White.normal().paint(token_node.tag().slice(line)),
TokenNode::Flag(..) => Color::Black.bold().paint(token_node.tag().slice(line)),
TokenNode::Member(..) => Color::Yellow.bold().paint(token_node.tag().slice(line)),
TokenNode::Error(..) => Color::Red.bold().paint(token_node.tag().slice(line)),
TokenNode::Delimited(..) => Color::White.paint(token_node.tag().slice(line)),
TokenNode::Pipeline(..) => Color::Blue.normal().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::Number(..),
..
}) => Color::Purple.bold().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::Size(..),
..
}) => Color::Purple.bold().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::GlobPattern,
..
}) => Color::Cyan.normal().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::String(..),
..
}) => Color::Green.normal().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::Variable(..),
..
}) => Color::Yellow.bold().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::Bare,
..
}) => Color::Green.normal().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::ExternalCommand(..),
..
}) => Color::Cyan.bold().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::ExternalWord,
..
}) => Color::Black.bold().paint(token_node.tag().slice(line)),
TokenNode::Token(Tagged {
item: RawToken::Operator(..),
..
}) => Color::Black.bold().paint(token_node.tag().slice(line)),
};
#[allow(unused)]
fn vec_tag<T>(input: Vec<Tagged<T>>) -> Option<Tag> {
let mut iter = input.iter();
let first = iter.next()?.tag;
let last = iter.last();
styled.to_string()
Some(match last {
None => first,
Some(last) => first.until(last.tag),
})
}
fn paint_pipeline_element(pipeline_element: &PipelineElement, line: &str) -> String {
let mut styled = String::new();
if let Some(_) = pipeline_element.pipe {
styled.push_str(&Color::Purple.paint("|"));
fn paint_flat_shape(flat_shape: Tagged<FlatShape>, line: &str) -> String {
let style = match &flat_shape.item {
FlatShape::OpenDelimiter(_) => Color::White.normal(),
FlatShape::CloseDelimiter(_) => Color::White.normal(),
FlatShape::ItVariable => Color::Purple.bold(),
FlatShape::Variable => Color::Purple.normal(),
FlatShape::Operator => Color::Yellow.normal(),
FlatShape::Dot => Color::White.normal(),
FlatShape::InternalCommand => Color::Cyan.bold(),
FlatShape::ExternalCommand => Color::Cyan.normal(),
FlatShape::ExternalWord => Color::Black.bold(),
FlatShape::BareMember => Color::Yellow.bold(),
FlatShape::StringMember => Color::Yellow.bold(),
FlatShape::String => Color::Green.normal(),
FlatShape::Path => Color::Cyan.normal(),
FlatShape::GlobPattern => Color::Cyan.bold(),
FlatShape::Word => Color::Green.normal(),
FlatShape::Pipe => Color::Purple.bold(),
FlatShape::Flag => Color::Black.bold(),
FlatShape::ShorthandFlag => Color::Black.bold(),
FlatShape::Int => Color::Purple.bold(),
FlatShape::Decimal => Color::Purple.bold(),
FlatShape::Whitespace => Color::White.normal(),
FlatShape::Error => Color::Red.bold(),
FlatShape::Size { number, unit } => {
let number = number.slice(line);
let unit = unit.slice(line);
return format!(
"{}{}",
Color::Purple.bold().paint(number),
Color::Cyan.bold().paint(unit)
);
}
};
let mut tokens =
TokensIterator::new(&pipeline_element.tokens, pipeline_element.tokens.tag, false);
let head = tokens.next();
match head {
None => return styled,
Some(head) => {
styled.push_str(&Color::Cyan.bold().paint(head.tag().slice(line)).to_string())
}
}
for token in tokens {
styled.push_str(&paint_token_node(token, line));
}
styled.to_string()
let body = flat_shape.tag.slice(line);
style.paint(body).to_string()
}
impl rustyline::Helper for Helper {}