September 6th, 2024

Build a quick Local code intelligence using Ollama with Rust

Bosun developed Swiftide, a Rust library for efficiently indexing and querying codebases, built on Qdrant and FastEmbed. The project uses OpenTelemetry for performance tracing and integrates several language model backends, comparing their response times.

Read original article

The article describes building a local code intelligence tool in Rust using Qdrant, FastEmbed, and OpenTelemetry, as part of Bosun's initiative to reduce technical debt. The tool is built on Swiftide, which provides efficient indexing and querying of codebases and leverages Rust's performance and compile-time type safety. Indexing breaks the code into manageable chunks, embeds them, and stores the results in Qdrant. Querying generates subquestions to improve the relevance of retrieved results, which are then summarized and answered by a large language model (LLM). Ollama is highlighted as the local LLM runtime, and its performance is compared against Groq, a hosted inference service running models such as Llama 3.1. Performance tracing with OpenTelemetry and Jaeger shows that local indexing can be slow, while optimized models and faster backends significantly improve response times. Overall, the project demonstrates Rust's suitability for building language tooling and the importance of efficient indexing and querying in code intelligence applications.
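The indexing flow described above (chunk, embed, store) can be illustrated with a small, self-contained Rust sketch. The `Embedder` trait and in-memory `VectorStore` below are hypothetical stand-ins for FastEmbed and Qdrant, and the line-based chunker is only a rough approximation of code-aware chunking; none of this is Swiftide's actual API.

```rust
/// A chunk of source code together with the file it came from.
struct CodeChunk {
    path: String,
    text: String,
}

/// Hypothetical stand-in for an embedding model such as FastEmbed.
trait Embedder {
    fn embed(&self, text: &str) -> Vec<f32>;
}

/// Hypothetical stand-in for a vector database such as Qdrant.
#[derive(Default)]
struct VectorStore {
    entries: Vec<(Vec<f32>, CodeChunk)>,
}

impl VectorStore {
    fn insert(&mut self, vector: Vec<f32>, chunk: CodeChunk) {
        self.entries.push((vector, chunk));
    }
}

/// Split a file into fixed-size, line-based chunks so each embedding
/// stays well within the embedding model's context window.
fn chunk_code(path: &str, source: &str, max_lines: usize) -> Vec<CodeChunk> {
    source
        .lines()
        .collect::<Vec<_>>()
        .chunks(max_lines)
        .map(|lines| CodeChunk {
            path: path.to_string(),
            text: lines.join("\n"),
        })
        .collect()
}

/// Index one file: chunk it, embed every chunk, store the results.
fn index_file(store: &mut VectorStore, embedder: &impl Embedder, path: &str, source: &str) {
    for chunk in chunk_code(path, source, 64) {
        let vector = embedder.embed(&chunk.text);
        store.insert(vector, chunk);
    }
}
```

In the real pipeline, chunking is language-aware and the embeddings are batched before being written to a Qdrant collection; the sketch only shows the shape of the flow.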

- Swiftide is an open-source library for indexing and querying codebases using Rust.

- The tool uses Qdrant for vector storage and FastEmbed for embedding code chunks.

- Performance tracing is conducted using OpenTelemetry and Jaeger to analyze the indexing and querying processes.

- Ollama (running models locally) and Groq (a hosted inference service) are integrated as LLM backends; the performance difference is significant, with Groq notably faster (a conceptual sketch of how the LLM is used at query time follows this list).

- The project aims to enhance the development experience by providing efficient local code intelligence tools.
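
To make the query mechanism concrete, here is a hedged, self-contained sketch of the query side: the model first splits the question into subquestions, the closest chunks are retrieved for each by cosine similarity, and the combined context is handed back to the model for the final answer. The `Llm` trait is a hypothetical stand-in for an Ollama- or Groq-backed client, the `Embedder` trait is redeclared so the sketch stands alone, and the prompts are illustrative only; this is not Swiftide's API.

```rust
/// Hypothetical stand-in for a chat/completion model (e.g. served by Ollama or Groq).
trait Llm {
    fn complete(&self, prompt: &str) -> String;
}

/// Hypothetical stand-in for an embedding model, matching the indexing sketch above.
trait Embedder {
    fn embed(&self, text: &str) -> Vec<f32>;
}

/// Cosine similarity between two embedding vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb + f32::EPSILON)
}

/// Answer a question against already-indexed chunks.
fn answer(
    llm: &impl Llm,
    embedder: &impl Embedder,
    indexed: &[(Vec<f32>, String)], // (embedding, chunk text)
    question: &str,
    top_k: usize,
) -> String {
    // 1. Ask the model to rephrase the question as focused subquestions.
    let subquestions = llm.complete(&format!(
        "Split this question about a codebase into short subquestions, one per line:\n{question}"
    ));

    // 2. For every subquestion, pull the top-k most similar chunks.
    let mut context = String::new();
    for sub in subquestions.lines().filter(|l| !l.trim().is_empty()) {
        let query_vec = embedder.embed(sub);
        let mut scored: Vec<_> = indexed
            .iter()
            .map(|(vec, text)| (cosine(&query_vec, vec), text))
            .collect();
        scored.sort_by(|a, b| b.0.total_cmp(&a.0));
        for (_, text) in scored.into_iter().take(top_k) {
            context.push_str(text);
            context.push_str("\n---\n");
        }
    }

    // 3. Answer the original question using only the retrieved context.
    llm.complete(&format!(
        "Using only this code context:\n{context}\nAnswer the question: {question}"
    ))
}
```

Swapping the `Llm` implementation between a local Ollama server and a hosted service like Groq is what produces the response-time differences the article measures with OpenTelemetry and Jaeger.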

2 comments
By @joshka - 4 months
Do you have token counts / cost breakdown for using groq models for this?

Something I'd really love to see as an open source library maintainer is something of an amalgam of:

- current source

- git commit history plus historical source

- github issues, PRs, discussions

- forum posts / discord discussions

- website docs, docs.rs docs

And to be able to use all that to work on support requests / code gen / feature implementation / spec generation etc.