open_prompt

Interact with LLMs using a simple DuckDB extension

Installing and Loading

INSTALL open_prompt FROM community;
LOAD open_prompt;
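
To confirm the extension is available, you can query DuckDB's extension catalog (this uses only core DuckDB, nothing specific to open_prompt):

SELECT extension_name, installed, loaded
FROM duckdb_extensions()
WHERE extension_name = 'open_prompt';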

Example

-- Configure the required extension parameters
SET VARIABLE openprompt_api_url = 'http://localhost:11434/v1/chat/completions';
SET VARIABLE openprompt_api_token = 'optional_api_token_here';
SET VARIABLE openprompt_model_name = 'qwen2.5:0.5b';
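
-- The current values can be checked with getvariable(), a core DuckDB function
-- (shown here only as a convenience; it is not part of open_prompt)
SELECT getvariable('openprompt_api_url') AS api_url,
       getvariable('openprompt_model_name') AS model;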

-- Prompt any OpenAI-compatible Completions API from your query
D SELECT open_prompt('Write a one-line poem about ducks') AS response;
┌────────────────────────────────────────────────┐
│                    response                    │
│                    varchar                     │
├────────────────────────────────────────────────┤
│ Ducks quacking at dawn, swimming in the light. │
└────────────────────────────────────────────────┘
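
-- Since open_prompt is a scalar function, it can also be applied per row;
-- the 'reviews' table and its 'review' column below are hypothetical
SELECT review,
       open_prompt('Summarize in five words: ' || review) AS summary
FROM reviews
LIMIT 3;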

-- Prompt requesting JSON structured output for ChatGPT, Llama 3, etc.
SET VARIABLE openprompt_model_name = 'llama3.2:3b';
SELECT open_prompt('I want ice cream', json_schema := '{
   "type": "object",
   "properties": {
     "summary": { "type": "string" },
     "sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
   },
   "required": ["summary", "sentiment"],
   "additionalProperties": false
 }');
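
-- The structured response is returned as text; fields can be extracted with
-- DuckDB's bundled JSON functions (a follow-up sketch, not taken from the README)
WITH resp AS (
  SELECT open_prompt('I want ice cream', json_schema := '{
    "type": "object",
    "properties": {
      "summary": { "type": "string" },
      "sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
    },
    "required": ["summary", "sentiment"],
    "additionalProperties": false
  }') AS r
)
SELECT json_extract_string(r, '$.summary')   AS summary,
       json_extract_string(r, '$.sentiment') AS sentiment
FROM resp;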

-- Use a custom system prompt to request JSON output from smaller models
SET VARIABLE openprompt_model_name = 'qwen2.5:1.5b';
SELECT open_prompt('I want ice cream.', system_prompt:='Response MUST be JSON with the following schema: {
       "type": "object",
       "properties": {
         "summary": { "type": "string" },
         "sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
       },
       "required": ["summary", "sentiment"],
       "additionalProperties": false
     }');
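
-- Smaller models may not always return valid JSON, so it can help to check
-- the reply with json_valid() (core DuckDB) before extracting fields
WITH resp AS (
  SELECT open_prompt('I want ice cream.',
                     system_prompt := 'Response MUST be JSON with keys "summary" and "sentiment".') AS r
)
SELECT CASE WHEN json_valid(r)
            THEN json_extract_string(r, '$.sentiment')
       END AS sentiment
FROM resp;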

About open_prompt

For more examples and instructions, check out the open_prompt README.

Added Functions

function_name    function_type  description  comment  example
open_prompt      scalar
set_api_token    scalar
set_api_url      scalar
set_model_name   scalar
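
The setter functions offer an alternative to the SET VARIABLE configuration shown above. The calls below are a sketch assuming each setter takes a single VARCHAR argument; see the README for the authoritative signatures.

-- Hypothetical usage of the added scalar functions (signatures assumed)
SELECT set_api_url('http://localhost:11434/v1/chat/completions');
SELECT set_api_token('optional_api_token_here');
SELECT set_model_name('qwen2.5:0.5b');
SELECT open_prompt('Write a one-line poem about ducks');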