July 28th, 2024

Show HN: Desktop Devin

CodeCompanion.AI is a free, open-source desktop application for developers, offering features like semantic code search, an interactive chat interface, and local data storage for privacy and security.


CodeCompanion.AI is a free, open-source, privacy-focused desktop application designed to assist developers with coding tasks. It features a one-click installation process and is compatible with both Mac and Windows operating systems. The application supports a variety of functionalities, including semantic code search, an interactive chat interface, and comprehensive file system operations. Users can choose to operate CodeCompanion autonomously or collaboratively, allowing for code review and feedback at any stage.
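To illustrate the general idea behind semantic code search (embedding-based retrieval over code snippets), here is a minimal generic sketch. It is not CodeCompanion's actual implementation; the `embed` function is a placeholder for whatever embedding model the tool plugs in.

```python
# Generic sketch of embedding-based semantic code search.
# `embed()` is a stand-in for a real embedding model (local or remote);
# it is NOT part of CodeCompanion's public API.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder: return a fixed-size embedding vector for `text`."""
    raise NotImplementedError("plug in a real embedding model here")

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query: str, snippets: list[str], top_k: int = 5) -> list[str]:
    """Rank code snippets by semantic similarity to the query."""
    q = embed(query)
    ranked = sorted(snippets, key=lambda s: cosine(q, embed(s)), reverse=True)
    return ranked[:top_k]
```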

Key features include an integrated browser with developer tools, an integrated terminal, and the ability to preview and approve code changes before they are applied. The app is designed to work with codebases of any size, offering unlimited context windows and dynamic context management to optimize token usage. All user data is stored locally for privacy and security, and no code is sent to external servers beyond the API calls needed to reach the language model.
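As a rough illustration of what "dynamic context management" usually means in practice, a tool keeps only as much recent conversation and code as fits a token budget. The sketch below is hypothetical and uses a crude whitespace-based token count, not the app's documented method.

```python
# Hypothetical sketch of trimming chat history to a token budget.
# The token counting is a crude whitespace approximation, not a real tokenizer.
def trim_to_budget(messages: list[str], max_tokens: int = 8000) -> list[str]:
    """Drop the oldest messages until the rough token count fits the budget."""
    def rough_tokens(text: str) -> int:
        return len(text.split())          # stand-in for a real tokenizer

    kept: list[str] = []
    total = 0
    for msg in reversed(messages):        # keep the most recent messages first
        cost = rough_tokens(msg)
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))
```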

CodeCompanion.AI can handle a wide range of coding tasks, such as creating projects in various frameworks (like Rails, Django, and Express), managing database migrations, deploying applications, and configuring CI/CD pipelines. It also provides tools for automated testing, vulnerability checks, and generating documentation. The application aims to enhance productivity for developers by automating repetitive tasks and providing intelligent coding assistance. Users can stay updated on new features and releases by signing up for the newsletter.

2 comments
By @printvoid - 3 months
Does it always need an API key to work with any model? Why can't you provide GPT-3.5, which is free, or any other model that works without an API key?
By @navjack27 - 3 months
Yeah, I'm not sure how this is privacy-focused if it's still using an API. I have local models that can run on my CPU with pretty big context windows; it would be pretty cool if this could just plug into those and keep everything local.
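For context on what the commenter is asking for: the usual approach is to point an OpenAI-compatible client at a locally hosted model server such as Ollama, so requests never leave the machine. The sketch below assumes Ollama is running locally with a model already pulled; it is not a feature CodeCompanion documents supporting, and the model name is illustrative.

```python
# Sketch: using a locally hosted, OpenAI-compatible endpoint (Ollama's /v1)
# instead of a remote API. Model name and URL are illustrative.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # dummy key; local servers typically ignore it
)

response = client.chat.completions.create(
    model="llama3",                        # whichever model is pulled locally
    messages=[{"role": "user", "content": "Explain: def add(a, b): return a + b"}],
)
print(response.choices[0].message.content)
```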