Launch HN: Promptless (YC W25) – Automatic updates for customer-facing docs
Promptless is an AI tool that automates documentation updates during software development, integrating with platforms like Jira and Slack, and is currently offering a free trial for users.
Prithvi and Frances have developed Promptless, an AI tool designed to automatically update documentation while software is being developed. The tool addresses the common issue of outdated documentation, which can hinder both developers and users. Promptless can be triggered by new pull requests (PRs), support tickets, or manual commands in Slack, and integrates with platforms like Jira, Linear, and Notion to understand the context of changes. Upon initial connection, it reviews existing documentation to create a "product ontology," which helps it draft relevant updates when triggered. Users have found various applications for Promptless beyond just PR updates, such as facilitating documentation updates from Slack channels and synchronizing content between product and content teams. The tool has also been beneficial for open-source projects, where a significant portion of commits involve documentation changes. Promptless is currently offering a free trial for users from Hacker News, encouraging feedback and suggestions for improvement.
- Promptless is an AI tool that automates documentation updates during software development.
- It integrates with tools like Jira and Slack to trigger updates based on specific events.
- The tool creates a "product ontology" to understand documentation context and relationships.
- Users have found diverse applications for Promptless, enhancing collaboration between teams.
- A free trial is available for users interested in exploring its features.
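The "triggered by new pull requests" workflow described above can be sketched as a webhook handler. This is a hypothetical illustration, not Promptless's actual implementation: the signature check follows the GitHub-style HMAC-SHA256 scheme, and the trigger condition (merged PR into `main`) is an assumed policy.

```python
import hmac
import hashlib

def verify_signature(secret: bytes, payload: bytes, signature: str) -> bool:
    """Check a GitHub-style HMAC-SHA256 webhook signature
    (the X-Hub-Signature-256 header format)."""
    expected = "sha256=" + hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

def should_trigger_doc_update(event: dict) -> bool:
    """Hypothetical policy: draft a docs update only when a PR
    is merged into the default branch."""
    return (
        event.get("action") == "closed"
        and event.get("pull_request", {}).get("merged", False)
        and event.get("pull_request", {}).get("base", {}).get("ref") == "main"
    )
```

A real integration would enqueue a doc-drafting job after this check rather than doing the work inline in the webhook handler.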
Related
Prompt Caching with Claude
Anthropic has launched prompt caching for its Claude API, enhancing performance by reducing costs and latency significantly. Currently in beta for Claude 3.5 and 3 Haiku, with Opus support upcoming.
Claude just slashed the cost of building AI applications
ClaudeAI's new Prompt Caching feature allows developers to reuse text, potentially reducing input API costs by up to 90%, benefiting applications like AI assistants and prompting competitors to consider similar innovations.
Show HN: PromptMage – Simplify and Manage Your LLM Workflows
PromptMage is an alpha-stage application for managing workflows with large language models, featuring an intuitive interface, version control, testing support, and encouraging community contributions for ongoing development.
Show HN: Relari – Auto Prompt Optimizer as Lightweight Alternative to Finetuning
Relari has introduced an Auto Prompt Optimizer to improve language model performance for specific tasks, offering transparency and ease of use, with future features planned and user feedback encouraged.
The Prompt() Function: Use the Power of LLMs with SQL
MotherDuck's prompt() function integrates small language models into SQL, enabling efficient bulk text summarization and structured data extraction, significantly reducing processing times and allowing customizable output formats.
- Many users express excitement about the potential of automating documentation updates, highlighting its usefulness in software development.
- Several commenters request clarity on pricing and express frustration with the need for a call to access the tool.
- There are suggestions for additional features, such as integration with platforms like Discord and generating documentation from customer support interactions.
- Some users share personal experiences with documentation challenges, emphasizing the importance of accurate and timely updates.
- Concerns about data privacy and the handling of sensitive information in auto-generated documents are raised.
However, you all should publish pricing before launching, and forcing me to book a call with you to use it is a nonstarter for me.
I don't want to get tied to this tool and then be charged for it in some weird way; give me your v0 pricing so that I can pay for it in a transparent way. As a fellow founder, I think you also know how little time I have for calls to check out demos for tools. So, just let me sign up and give it a spin.
They may be rare, but this is not universally true! I have a staff developer who creates beautiful documentation, paired with hand-drawn (tablet) diagrams. I never miss an opportunity to compliment and thank him for his work; he really seems to enjoy it, and it goes well with the role's mandate to level up other developers. If you find a developer (especially a senior+) who likes to create and maintain documentation, treat them like gold!
Another pain point was creating guides/examples for integrating third-party tools. Could be worth exploring.
Can you clarify the Slack integration? I see you've mentioned it, but I'm unclear on the workflow.
A use case I've thought about previously is support discussions we have in our Slack channel, which indicate that we need to update our docs.
Is Promptless able to raise a PR with suggested docs updates, based on the questions and answers in our support channels?
I am curious how you prevent private data from getting leaked to the auto-generated public docs. I imagine this problem does not exist in open source projects, but would become an issue if not everything discussed in company's private messenger should be used as context for generating docs.
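One common mitigation for the leak concern raised above is to redact obviously sensitive substrings from chat messages before they are used as generation context. A minimal sketch, assuming simple regex patterns (a real deployment would use a proper secret scanner and an allowlist policy):

```python
import re

# Hypothetical patterns for illustration only; production systems
# should use a dedicated secret-scanning tool.
SENSITIVE_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),                        # email addresses
    re.compile(r"(?i)\b(api[_-]?key|token|secret)\s*[:=]\s*\S+"),  # credential assignments
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                          # SSN-like numbers
]

def redact(text: str, placeholder: str = "[REDACTED]") -> str:
    """Replace matches of each sensitive pattern with a placeholder
    before the message is passed to a doc-generation model."""
    for pattern in SENSITIVE_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text
```

Redaction is a pre-filter, not a guarantee; a stricter policy would only admit messages from explicitly approved channels into the context.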
That could be tied into product walkthroughs, so applications would give you a demo on their own that is always up to date.
I must be weird ;)
I would be OK with AI doing it as long as it is more like dependabot and asks for a PR like everyone else.
Untruths are the biggest issue, but docs mislead anyway once they're out of date.
* This could revolutionize how we manage information.
* Think of the time saved, no more manual updates.
* Customer satisfaction would likely increase.
* Accuracy in documentation becomes a focus.
Could this be the end of outdated manuals?