Show HN: Llm2sh – Translate plain-language requests into shell commands
The `llm2sh` utility translates plain language into shell commands using LLMs such as OpenAI's GPT models and Anthropic's Claude. It offers customizable configuration, a YOLO mode, and extensibility. Installation via `pip` is simple, and user privacy is prioritized. Contributions to the GPLv3-licensed project are welcome. Users should review generated commands before execution. Visit the GitHub repository for details.
The `llm2sh` command-line utility translates plain-language requests into shell commands using various Large Language Models (LLMs) such as OpenAI's GPT models and Anthropic's Claude. It offers customizable configuration, a YOLO mode for quick command execution, and extensibility for new LLMs and prompts. Installation is `pip install llm2sh`, and commands are run with `llm2sh [options] <request>`. The tool emphasizes user privacy by not storing data itself, although the LLM APIs it calls may retain requests. `llm2sh` is experimental, and users are advised to review generated commands before execution. Contributions to the GPLv3-licensed project are encouraged; more information is available in the GitHub repository.
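Based on the install and invocation forms above, a typical session might look like the sketch below; the request text is illustrative, and the confirmation step reflects the default review-before-execute behaviour noted in the comments.

```sh
# Install from PyPI (as documented in the README)
pip install llm2sh

# Ask for a command in plain language; by default llm2sh lists the
# proposed command(s) and asks for confirmation before running them.
llm2sh "show the five largest files in my home directory"
```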
- Some users find the tool promising but question its efficiency compared to traditional command syntax.
- There are suggestions for making the tool more secure, for example by sandboxing command execution in a Docker container (see the sketch after this list).
- Several users appreciate the GPLv3 license, easy installation via `pip`, and good documentation.
- Questions arise about the tool's ability to handle specific commands and how it compares to similar tools.
- Some users express interest in extending the tool to support custom or local APIs like llama.cpp.
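One way to act on the sandboxing suggestion is to run the tool inside a throwaway container, so generated commands can only touch the container's filesystem. This is a sketch of the general idea, not a feature of llm2sh; the image tag is an assumption, and any recent Python base image would do.

```sh
# Start a disposable container (removed on exit) and work inside it.
docker run --rm -it python:3.12-slim bash

# Inside the container:
pip install llm2sh
llm2sh "delete all .log files under /var/tmp"   # worst case, only the container is affected
```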
+ GPLv3
+ Defaults to listing commands and asking for confirmation
+ Install is just "pip install"
+ Good docs with examples
Is there a way to point it at an arbitrary API endpoint? IIRC llama.cpp can serve an OpenAI-compatible API, so it should be drop-in?
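For context on why that would be drop-in: llama.cpp's built-in HTTP server exposes OpenAI-style endpoints under `/v1`, so any client that lets you override the base URL can talk to it. Whether llm2sh exposes such an override is the open question here; the sketch below only shows the endpoint shape against a local llama.cpp server (the port and model name are assumptions).

```sh
# Query a locally running llama.cpp server via its OpenAI-compatible
# chat completions endpoint; 8080 is the server's default port.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "local-model",
        "messages": [
          {"role": "user", "content": "Translate to a shell command: list the five largest files here"}
        ]
      }'
```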
I created something similar using blade a while back, but I found that using English to express what I want was actually really inefficient. It turns out that for most commands, the command syntax is already a pretty expressive format.
So nowadays I'm back to using a chat UI (Claude) for the scenarios where I need help figuring out the right command. Being able to iterate is essential in those scenarios.