Show HN: Engine Core – open-source LLM chat management and tool call framework
Engine Core is an open-source framework that lets Large Language Models use dynamic prompts and tool functions. It supports multiple LLM integrations and welcomes contributions under the Apache 2.0 License.
The GitHub repository Engine Core provides a framework that lets Large Language Models (LLMs) perform tasks using dynamic prompts and sets of tool functions, packaged as chat strategies. The project ships with three example strategies: demoStrategy, a basic illustration; backendStrategy, which works against a local Fastify application to run database migrations and expose API endpoints; and shellStrategy, which lets the LLM write files and execute processes.

Engine Core supports multiple LLM integrations, including adapters for Anthropic and OpenAI models, so the same application code and strategies can be reused across different foundation models. To get started, users need Docker installed, a copy of the environment configuration file, and an API key for either OpenAI or Anthropic. After launching the command-line interface, users can select their preferred model and list the available commands.

Contributions are encouraged, with a recommendation to open an issue to discuss major changes before implementing them. The project is licensed under the Apache 2.0 License. For further details, users can visit the repository.
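As a rough illustration of the chat-strategy idea, the sketch below shows what a strategy combining a dynamic prompt with guardrailed tool functions might look like. The names and shapes here (ChatStrategy, ToolDefinition, ALLOWED_COMMANDS, and this version of shellStrategy) are hypothetical and are not taken from Engine Core's actual API.

```typescript
// Hypothetical sketch of a chat strategy: a dynamic system prompt plus a set
// of tool functions the LLM may call. Names and shapes are illustrative only,
// not Engine Core's real interfaces.
import { writeFile } from "node:fs/promises";
import { execFile } from "node:child_process";
import { promisify } from "node:util";

interface ToolDefinition {
  name: string;
  description: string;
  // JSON-schema-like description of the tool's parameters
  parameters: Record<string, unknown>;
  // Executed when the model requests this tool call
  execute: (args: Record<string, unknown>) => Promise<string>;
}

interface ChatStrategy {
  // Dynamic prompt: recomputed each time it is needed
  systemPrompt: () => string;
  tools: ToolDefinition[];
}

const run = promisify(execFile);
// Example of a guardrail: only allow-listed commands may be executed
const ALLOWED_COMMANDS = new Set(["ls", "cat", "node"]);

// A shell-style strategy in the spirit of the repo's shellStrategy:
// the model can write files and run a restricted set of processes.
const shellStrategy: ChatStrategy = {
  systemPrompt: () =>
    `You can write files and run allow-listed commands. Today is ${new Date().toDateString()}.`,
  tools: [
    {
      name: "write_file",
      description: "Write text content to a file path",
      parameters: { path: "string", content: "string" },
      execute: async ({ path, content }) => {
        await writeFile(String(path), String(content), "utf8");
        return `wrote ${String(path)}`;
      },
    },
    {
      name: "run_command",
      description: "Run an allow-listed command with arguments",
      parameters: { command: "string", args: "string[]" },
      execute: async ({ command, args }) => {
        if (!ALLOWED_COMMANDS.has(String(command))) {
          return `command ${String(command)} is not allowed`;
        }
        const { stdout } = await run(String(command), (args as string[]) ?? []);
        return stdout;
      },
    },
  ],
};
```

Because the tool definitions in a sketch like this are model-agnostic, the same strategy object could be handed to either an Anthropic or an OpenAI adapter, which is the portability across foundation models the repository emphasizes.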
Related
It allows us to guardrail and extend LLMs for different software stacks with varying degrees of restriction in a relatively clean and manageable way.
We're interested to see if this framework is useful in other applications or for custom software development configurations.