November 6th, 2024

Why I love Rust for tokenising and parsing

The author is building sqleibniz, a Rust-based static analysis tool for SQL that performs syntax checks and validation for the SQLite dialect, with an emphasis on high-quality error messages and table-driven tests.

The blog post discusses the author's experience using Rust to develop sqleibniz, a static analysis tool for the SQLite dialect of SQL. The tool performs syntax checks and verifies that the tables, columns, and functions referenced in SQL input actually exist, and the author stresses the importance of high-quality error messages with context and explanations. Development involves writing a tokenizer and parser for SQL, with Rust's macros used to define structures and implement shared traits, which cuts down on repetitive code. The post also details the testing approach: table-driven tests, similar to those common in Go, exercise the lexer and parser. The author expresses enthusiasm for Rust's fit for these tasks and plans to extend the project with a Language Server Protocol (LSP) server for SQL.
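To illustrate the macro technique the post describes, a declarative macro can stamp out an AST node struct and its trait implementation in one place, so each new node type becomes a one-line addition. The names below (`Token`, `Node`, `Literal`, `Explain`, and the `node!` macro itself) are hypothetical, a minimal sketch of the pattern rather than sqleibniz's actual code:

```rust
/// A token produced by the tokenizer; the fields here are illustrative.
#[derive(Debug, Clone)]
pub struct Token {
    pub content: String,
}

/// Behaviour shared by all AST nodes.
pub trait Node {
    fn token(&self) -> &Token;
}

/// Defines an AST node struct and implements `Node` for it,
/// avoiding a hand-written struct + impl pair per node type.
macro_rules! node {
    ($name:ident, $doc:literal) => {
        #[doc = $doc]
        #[derive(Debug)]
        pub struct $name {
            pub t: Token,
        }

        impl Node for $name {
            fn token(&self) -> &Token {
                &self.t
            }
        }
    };
}

node!(Literal, "A literal value, such as a number or a string.");
node!(Explain, "The EXPLAIN statement.");
```

Each `node!(...)` invocation expands to a full struct definition plus a trait impl, which is the deduplication the post credits Rust's macros with.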

- The author is developing a static analysis tool for SQL called sqleibniz using Rust.

- The tool aims to perform syntax checks and validate SQL input against SQLite standards.

- Rust's macros are utilized for code deduplication and defining structures efficiently.

- The author emphasizes the importance of high-quality error messages in the tool's output.

- Testing is conducted using table-driven tests to ensure the functionality of the lexer and parser.
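As a sketch of that testing style, the snippet below shows a Go-style table-driven test for a lexer: each table entry pairs an input with the token kinds it should produce, and a single loop asserts them all. The `lex` function and `TokenKind` enum are assumptions for the example, not sqleibniz's actual API:

```rust
#[derive(Debug, PartialEq)]
enum TokenKind {
    Keyword,
    Number,
    Semicolon,
}

/// Placeholder lexer: a real implementation would walk the input
/// character by character; here we only classify whitespace-split words.
fn lex(input: &str) -> Vec<TokenKind> {
    input
        .split_whitespace()
        .map(|word| match word {
            "SELECT" | "EXPLAIN" => TokenKind::Keyword,
            ";" => TokenKind::Semicolon,
            _ => TokenKind::Number,
        })
        .collect()
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn lexer_table_driven() {
        // Each case pairs an input with its expected token kinds,
        // mirroring Go's table-driven test idiom.
        let cases: &[(&str, Vec<TokenKind>)] = &[
            (
                "SELECT 1 ;",
                vec![TokenKind::Keyword, TokenKind::Number, TokenKind::Semicolon],
            ),
            (
                "EXPLAIN SELECT 25 ;",
                vec![
                    TokenKind::Keyword,
                    TokenKind::Keyword,
                    TokenKind::Number,
                    TokenKind::Semicolon,
                ],
            ),
        ];

        for (input, expected) in cases {
            assert_eq!(&lex(input), expected, "input: {input:?}");
        }
    }
}
```

Adding a new test case is a single entry in the table, which is what makes this idiom attractive for lexers and parsers with many small input/output pairs.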

2 comments
By @Jtsummers - 6 months
https://news.ycombinator.com/item?id=42083547 - Big discussion from a couple days ago
By @TechWhiz42 - 6 months
Rust is great for tokenising