The Prompt() Function: Use the Power of LLMs with SQL
MotherDuck's prompt() function integrates small language models into SQL, enabling efficient bulk text summarization and structured data extraction, significantly reducing processing times and allowing customizable output formats.
MotherDuck's new prompt() function lets users call small language models (SLMs) such as OpenAI's gpt-4o-mini directly from SQL queries, extending what data processing can be done in the database itself. It handles tasks such as text summarization and structured data extraction without requiring separate inference infrastructure. Because prompt() can be applied to every row of a table, it enables bulk operations and cuts processing time substantially: summarizing 100 rows of comments takes roughly 2.8 seconds, far faster than calling a model sequentially from Python. The function also supports structured outputs, letting users define the shape of the returned data so results slot directly into analytical workflows. Users are encouraged to test on small datasets first to gauge quality and cost. The prompt() function is currently in preview for users on the Free Trial or Standard Plan, with specific quotas on compute usage, and MotherDuck invites feedback from users to refine the feature.
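A minimal sketch of the bulk-summarization pattern described above, in DuckDB-flavored SQL. The table and column names (sample_comments, comment_text) are illustrative, and the call shape follows MotherDuck's preview documentation for prompt():

```sql
-- One model call per row, executed as ordinary SQL over the whole result set.
-- Table and column names are hypothetical placeholders.
SELECT
    id,
    prompt('Summarize this comment in one sentence: ' || comment_text) AS summary
FROM sample_comments
LIMIT 100;  -- keep a LIMIT while experimenting to cap model usage and cost
```

Starting with a small LIMIT, as the article suggests, is the cheap way to evaluate output quality before scaling up.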
- The prompt() function integrates small language models into SQL for enhanced data processing.
- It allows for bulk text summarization and structured data extraction.
- Processing times are significantly reduced compared to traditional methods.
- Users can define output structures for easier integration into workflows.
- The function is available in preview with usage quotas for different plans.
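The structured-output capability mentioned in the takeaways can be sketched as follows. This assumes the struct parameter form shown in MotherDuck's preview docs; the field names and table are illustrative, not taken from the article:

```sql
-- Request a typed STRUCT instead of free text; the fields can then be
-- unpacked with ordinary SQL. Table/column names are hypothetical.
SELECT
    id,
    prompt(
        'Extract the topic and sentiment of this comment: ' || comment_text,
        struct := {topic: 'VARCHAR', sentiment: 'VARCHAR'}
    ) AS extracted
FROM sample_comments
LIMIT 100;
```

Defining the output structure up front is what lets the results feed directly into downstream joins and aggregations without a separate parsing step.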
Related
Prompt Caching with Claude
Anthropic has launched prompt caching for its Claude API, significantly reducing costs and latency. Currently in beta for Claude 3.5 Sonnet and Claude 3 Haiku, with Opus support upcoming.
pg_duckdb: Splicing Duck and Elephant DNA
MotherDuck launched pg_duckdb, an open-source extension integrating DuckDB with Postgres to enhance analytical capabilities while maintaining transactional efficiency, supported by a consortium of companies and community contributions.
Claude just slashed the cost of building AI applications
ClaudeAI's new Prompt Caching feature allows developers to reuse text, potentially reducing input API costs by up to 90%, benefiting applications like AI assistants and prompting competitors to consider similar innovations.
Show HN: PromptMage – Simplify and Manage Your LLM Workflows
PromptMage is an alpha-stage application for managing workflows with large language models, featuring an intuitive interface, version control, testing support, and encouraging community contributions for ongoing development.
Show HN: Relari – Auto Prompt Optimizer as Lightweight Alternative to Finetuning
Relari has introduced an Auto Prompt Optimizer to improve language model performance for specific tasks, offering transparency and ease of use, with future features planned and user feedback encouraged.
FROM hn.hacker_news
LIMIT 100

"Oops I forgot the LIMIT clause and now owe MotherDuck and OpenAI $93 billion."