Parsing User Error for InvalidToken #21

Merged
77 commits merged on May 23, 2018
Commits (77)
e61ae74
Lock the Cargo manifest!
adjivas Apr 12, 2018
5c5f5c2
Merge pull request #2 from ulysseB/master
adjivas Apr 12, 2018
1d96bfe
pre-implementation of flex
adjivas Apr 17, 2018
9bd95b7
reconfigure travis as nightly
adjivas Apr 17, 2018
3bff133
naive first implementation of the C part of exh-lang's lexer
adjivas Apr 18, 2018
b0c4594
C union representation
adjivas Apr 18, 2018
7c5c3fc
Enumeration CmpOp
adjivas Apr 18, 2018
0f24f57
add yystr rules and an incomplete test
adjivas Apr 19, 2018
ea3cbdd
fix a bit of F/lex syntax with a first short test
adjivas Apr 20, 2018
2ebdcd0
add second test for code's token
adjivas Apr 20, 2018
b208619
reset test as a complete list of tokens, add a null character to the…
adjivas Apr 24, 2018
473f57c
fix test
adjivas Apr 25, 2018
86ad6fe
no need for a null character, solved with the yy_scan_bytes function
adjivas Apr 25, 2018
499cff2
implementation of C comments
adjivas Apr 25, 2018
9bfb3c6
naive implementation of DOC token
adjivas Apr 25, 2018
12d8251
Blank Token as mute (not needed by Gen)
adjivas Apr 25, 2018
a168023
mute C warnings about unused variables generated by F/lex
adjivas Apr 25, 2018
c428e28
remove unnecessary ffi prefix
adjivas Apr 25, 2018
bf84ae3
allow dead code for ffi
adjivas Apr 25, 2018
05ab085
allow unused {function,variable,parameter} for {F/lex,C}
adjivas Apr 25, 2018
8b5294b
fix dependency gcc as cc, see issue #3
adjivas Apr 26, 2018
8d2387c
rewrite DOC token with a start condition
adjivas Apr 26, 2018
e054856
fix // comment
adjivas Apr 26, 2018
2493f7f
retarget test for cc_test
adjivas Apr 26, 2018
d29bc3b
comment rule
adjivas Apr 26, 2018
c9e49e6
ast doc recursive concat
adjivas Apr 26, 2018
341c17b
enable all compilation
adjivas Apr 26, 2018
a22991c
prepare travis PR
adjivas Apr 26, 2018
7500bf9
add feature lex, need this feature to generate the lexer, or else a d…
adjivas Apr 26, 2018
86dd15e
by default, travis will regenerate the lexer too
adjivas Apr 26, 2018
989621f
Merge branch 'master' into lex
Apr 26, 2018
d806c24
Update comment linking to lex doc.
Apr 27, 2018
0fe1301
sync
adjivas May 2, 2018
9710ce2
Merge pull request #4 from ulysseB/master
adjivas May 2, 2018
4e7a84c
replace drain_filter with all/filter to build on the stable Rust channel
adjivas May 2, 2018
f217491
Merge branch 'lex' of https://github.com/adjivas/telamon into lex
adjivas May 2, 2018
478505a
retain fix is_subset_of_def as reverse, remark, this line is uncover…
adjivas May 2, 2018
bad658d
fix coding style: closure as for, not as !
adjivas May 3, 2018
a3c5784
dereferencing var
adjivas May 3, 2018
1e72a13
add benchmark for lexer
adjivas May 9, 2018
827c65f
criterion manifest
adjivas May 11, 2018
aa340e9
manifest: Travis now checks Rust on stable
adjivas May 11, 2018
672aad3
fix auto example according to issue #5330
adjivas May 16, 2018
2bb3337
add poc.l to the Cargo rebuild dependency condition
adjivas May 16, 2018
24d5c74
enable F/lex feature yylineno, implement YY_EXTRA and YY_USER, lalrp…
adjivas May 16, 2018
ab75b8e
naive reinforcement of lexer test with Position check, next check will add…
adjivas May 16, 2018
21e02d6
move enumerations from l to include/h
adjivas May 17, 2018
8d06681
documentation reinforcement
adjivas May 17, 2018
3fe2051
compile lexer with source include for C header
adjivas May 17, 2018
50ce1cc
rewrite define extra-type as option extra-type and implement line/co…
adjivas May 17, 2018
38555b3
recompile lexer
adjivas May 17, 2018
ec6da00
add case for line/column of line/doc and c_comment
adjivas May 17, 2018
2aef8fd
the destructor of lexer clears the disingenuous yylineno
adjivas May 17, 2018
99989ff
InvalidToken is now returned as an error from the lexer part
adjivas May 17, 2018
69a5d0d
move Invalid from Token to LexicalError
adjivas May 18, 2018
95c38a5
add macro pub_generated_file to create a public module for extern usa…
adjivas May 18, 2018
c79588b
parse is now a public module, process/file can now return a parser e…
adjivas May 18, 2018
2889ac0
InvalidToken checks for lexer & parser
adjivas May 18, 2018
997e3d9
unwrap returns of process/file
adjivas May 18, 2018
39a59f0
solve merge
adjivas May 18, 2018
dcf84cf
Merge pull request #5 from ulysseB/master
adjivas May 18, 2018
74f640a
unwrap for generate/file
adjivas May 18, 2018
74f6990
move common module into example
adjivas May 18, 2018
b76878c
Merge branch 'parsing' of https://github.com/adjivas/telamon into par…
adjivas May 18, 2018
f251f3d
line number is now set from the lexer constructor
adjivas May 18, 2018
a56f0b3
remove unnecessary manifest autoexample field
adjivas May 18, 2018
127f961
move exh.c into src
adjivas May 18, 2018
8015e0f
rename Span as Spanned
adjivas May 18, 2018
c7bed5d
LALRPOP documentation
adjivas May 18, 2018
d67714b
generated_file as (pub) macro; danielkeep.github.io/tlborm/book/pat-v…
May 20, 2018
038a016
keyword pub
May 20, 2018
125fe23
Merge pull request #6 from adjivas/master
adjivas May 20, 2018
9f2438b
add the Error trait for LexicalError
adjivas May 22, 2018
aaf0a4d
process function also returns the filename
adjivas May 22, 2018
22ddba3
token derives Clone and the lexer now has a Span struct
adjivas May 23, 2018
5cb8977
naive test for ProcessError
adjivas May 23, 2018
2dd6ea5
change the error output and add a Cause enumeration for ProcessError
adjivas May 23, 2018
2 changes: 1 addition & 1 deletion build.rs
@@ -27,7 +27,7 @@ fn main() {

add_dependency(exh_file);
let exh_out = Path::new(&out_dir).join("choices.rs");
telamon_gen::process_file(&Path::new(exh_file), &exh_out, cfg!(feature="format_exh"));
telamon_gen::process_file(&Path::new(exh_file), &exh_out, cfg!(feature="format_exh")).unwrap();
if cfg!(feature="cuda") { compile_link_cuda(); }

if cfg!(feature = "mppa") {
35 changes: 0 additions & 35 deletions examples/common.rs

This file was deleted.

31 changes: 30 additions & 1 deletion examples/sgemm_low.rs
@@ -5,7 +5,36 @@ extern crate itertools;
extern crate log;
extern crate rayon;

mod common;
mod common {
/// Generates the code for the best candidate in the search space.
pub fn gen_best<'a>(search_space: Vec<SearchSpace>,
context: &'a Context,
out: &str) {
let conf = explorer::Config::read();
let begin_time = std::time::Instant::now();
let best_opt = explorer::find_best(&conf, context, search_space);
let duration = std::time::Instant::now() - begin_time;
warn!("Search completed in {}s", duration.as_secs());
match best_opt {
Some(best) => {
let mut file = std::fs::File::create(out).unwrap();
context.device().gen_code(&best, &mut file)
}
None => println!("Did not find any well suited candidate before timeout"),
}
}

/// Generate a name for the output file.
pub fn file_name(name: &str,
_: ir::Type,
sizes: &[i32],
instantiated: bool) -> String {
const PATH: &str = "examples/out/";
std::fs::create_dir_all(PATH).unwrap();
let sizes = sizes.iter().format_with("", |i, f| f(&format_args!("_{}", i)));
format!("{}{}_{}{}.c", PATH, name, instantiated, sizes)
}
}

#[allow(unused_imports)]
use telamon::{explorer, helper, ir};
7 changes: 7 additions & 0 deletions telamon-gen/Cargo.toml
@@ -11,6 +11,13 @@ lex = []
doc = false
name = "cli_gen"

[[bench]]
name = "lexer"
harness = false

[dev-dependencies]
criterion = "0.2"

[build-dependencies]
lalrpop = "0.14"
cc = "1.0.12"
29 changes: 29 additions & 0 deletions telamon-gen/benches/lexer.rs
@@ -0,0 +1,29 @@
#[macro_use]
extern crate criterion;
extern crate telamon_gen;

use criterion::Criterion;

use telamon_gen::lexer;

use std::fs;
use std::ffi::OsStr;

fn criterion_benchmark(c: &mut Criterion) {
let entries = fs::read_dir("cc_tests/src/").unwrap();
for entry in entries {
if let Ok(entry) = entry {
if entry.path().extension().eq(&Some(OsStr::new("exh"))) {
let path = entry.path();
let mut input = fs::File::open(&path).unwrap();
let mut name = String::from("lexer ");
name.push_str(path.file_stem().unwrap().to_str().unwrap());

c.bench_function(&name, move |b| b.iter(|| lexer::Lexer::new(&mut input)));
}
}
}
}

criterion_group!(benches, criterion_benchmark);
criterion_main!(benches);
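
As a side note (not part of the diff), with `harness = false` and the `criterion` dev-dependency declared in the manifest above, this benchmark is driven by Criterion's own entry point and run with `cargo bench` from `telamon-gen/`. A minimal sketch of exercising the same constructor outside of Criterion, assuming only the `lexer::Lexer::new(&mut File)` call visible in the benchmark and a placeholder input path:

extern crate telamon_gen;

use std::fs;
use telamon_gen::lexer;

fn main() {
    // Placeholder path; the benchmark scans every *.exh file under cc_tests/src/.
    let mut input = fs::File::open("cc_tests/src/example.exh")
        .expect("failed to open the .exh source");
    // Construct the lexer exactly as the benchmark's closure does.
    let _lexer = lexer::Lexer::new(&mut input);
}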
10 changes: 7 additions & 3 deletions telamon-gen/build.rs
@@ -6,22 +6,26 @@ extern crate lalrpop;
fn add_dependency(dep: &str) { println!("cargo:rerun-if-changed={}", dep); }

fn main() {
// Compile the lexer.(`LEX="flex" cargo build --features "lex"`)
// Regenerate the lexer.(`LEX="flex" cargo build --features "lex"`)
#[cfg(feature = "lex")]
{
use std::{env,process::Command};

// Generate the lexer .
add_dependency("src/poc.l");
let bin = env::var("LEX").unwrap_or(String::from("flex"));

Command::new(bin)
.arg("-oexh.c")
.arg("-osrc/exh.c")
.arg("src/exh.l")
.status()
.expect("failed to execute Flex's process");
}

// Compile the lexer .
cc::Build::new()
.file("exh.c")
.file("src/exh.c")
.include("src")
.flag("-Wno-unused-parameter")
.flag("-Wno-unused-variable")
.flag_if_supported("-Wno-unused-function")
2 changes: 1 addition & 1 deletion telamon-gen/cc_tests/build.rs
@@ -14,6 +14,6 @@ fn main() {
let file_name = src_path.file_name().unwrap();
println!("cargo:rerun-if-changed={}", file_name.to_str().unwrap());
let dst_path = Path::new(&out_dir).join(&file_name).with_extension("rs");
telamon_gen::process_file(&src_path, &dst_path, !cfg!(feature="noformat_exh"));
telamon_gen::process_file(&src_path, &dst_path, !cfg!(feature="noformat_exh")).unwrap();
}
}
13 changes: 12 additions & 1 deletion telamon-gen/src/bin/cli_gen.rs
@@ -2,7 +2,18 @@
extern crate telamon_gen;
extern crate env_logger;

use std::process;
use std::path::Path;

fn main() {
env_logger::init();
telamon_gen::process(&mut std::io::stdin(), &mut std::io::stdout(), true);
if let Err(process_error) = telamon_gen::process(
&mut std::io::stdin(),
&mut std::io::stdout(),
true,
&Path::new("exh")
) {
eprintln!("error: {}", process_error);
process::exit(-1);
}
}
103 changes: 103 additions & 0 deletions telamon-gen/src/error.rs
@@ -0,0 +1,103 @@
use super::lexer;
use super::lalrpop_util::*;

use std::{path, fmt};
use std::error::Error;

#[derive(Debug)]
pub enum Cause {
/// Lalrpop
Parse(ParseError<lexer::Position,
lexer::Token,
lexer::LexicalError>),
/// Will be remplaced by field for Ast [...]
Other,
}

#[derive(Debug)]
pub struct ProcessError<'a> {
/// Display of filename.
pub path: path::Display<'a>,
/// Position of lexeme.
pub span: Option<lexer::Span>,
cause: Cause,
}

impl <'a>From<(path::Display<'a>,
ParseError<lexer::Position,
lexer::Token,
lexer::LexicalError>
)> for ProcessError<'a> {
fn from((path, parse): (path::Display<'a>,
ParseError<lexer::Position,
lexer::Token,
lexer::LexicalError>
)) -> Self {
match parse {
ParseError::InvalidToken { location }
=> ProcessError {
path: path,
span: Some(lexer::Span { leg: location, ..Default::default() }),
cause: Cause::Parse(parse),
},
ParseError::UnrecognizedToken { token: None, .. }
=> ProcessError {
path: path,
span: None,
cause: Cause::Parse(parse),
},
ParseError::UnrecognizedToken { token: Some((l, .., e)), .. } |
ParseError::ExtraToken { token: (l, .., e) } |
ParseError::User { error: lexer::LexicalError::UnexpectedToken(l, .., e) } |
ParseError::User { error: lexer::LexicalError::InvalidToken(l, .., e) }
=> ProcessError {
path: path,
span: Some(lexer::Span { leg: l, end: Some(e) }),
cause: Cause::Parse(parse),
},
}
}
}

impl <'a> fmt::Display for ProcessError<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match self {
ProcessError { path, span, cause: Cause::Parse(
ParseError::UnrecognizedToken {
token: Some((_, ref token, _)), ..
}), ..} |
ProcessError { path, span, cause: Cause::Parse(ParseError::ExtraToken {
token: (_, ref token, _)
}), ..} |
ProcessError { path, span, cause: Cause::Parse(
ParseError::User {
error: lexer::LexicalError::UnexpectedToken(_, ref token, _)
}), ..} |
ProcessError { path, span, cause: Cause::Parse(
ParseError::User {
error: lexer::LexicalError::InvalidToken(_, ref token, _)
}), ..} => {
if let Some(span) = span {
write!(f, "{}, {} -> {}", token, span, path)
} else {
write!(f, "{} -> {}", token, path)
}
},
_ => Ok(()),
}
}
}

impl <'a>Error for ProcessError<'a> {
fn description(&self) -> &str {
"Process error"
}

fn cause(&self) -> Option<&Error> {
if let Cause::Parse(ref parse) = self.cause {
parse.cause()
} else {
None
}
}
}
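
As a usage illustration (not part of this diff), the `Result`-returning `process_file` lets a caller report this `ProcessError` instead of unwrapping, as the updated build scripts do. A minimal sketch, assuming the three-argument `process_file(&Path, &Path, bool)` call shown in the build scripts, that the returned error is the `ProcessError` defined above, and placeholder paths:

extern crate telamon_gen;

use std::error::Error;
use std::path::Path;
use std::process;

fn main() {
    // Placeholder paths, for illustration only.
    let src = Path::new("src/choices.exh");
    let dst = Path::new("out/choices.rs");
    if let Err(err) = telamon_gen::process_file(src, dst, true) {
        // `Display` prints the offending token, its span and the file path
        // for parse errors; `Error::cause` delegates to the LALRPOP error.
        eprintln!("error: {}", err);
        if let Some(cause) = err.cause() {
            eprintln!("caused by: {}", cause);
        }
        process::exit(-1);
    }
}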