Interact with LLMs with a simple DuckDB Extension
Installing and Loading
INSTALL open_prompt FROM community;
LOAD open_prompt;
Example
-- Configure the required parameters to access OpenAI Completions-compatible APIs
CREATE SECRET IF NOT EXISTS open_prompt (
TYPE open_prompt,
PROVIDER config,
api_token 'your-api-token',
api_url 'http://localhost:11434/v1/chat/completions',
model_name 'qwen2.5:0.5b',
api_timeout '30'
);
-- Prompt any OpenAI Completions-compatible API from your query
SELECT open_prompt('Write a one-line poem about ducks') AS response;
┌────────────────────────────────────────────────┐
│ response │
│ varchar │
├────────────────────────────────────────────────┤
│ Ducks quacking at dawn, swimming in the light. │
└────────────────────────────────────────────────┘
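Under the hood, open_prompt() issues an HTTP POST to the configured chat completions endpoint. A minimal sketch of the request payload it would send, assuming the standard OpenAI-compatible message shape (the extension's exact body may differ):

```python
import json

# Assumed shape of the OpenAI-compatible chat request; the extension's
# actual payload may include additional fields (e.g. response_format).
def build_chat_request(prompt, model="qwen2.5:0.5b", system_prompt=None):
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    return json.dumps({"model": model, "messages": messages})

body = build_chat_request("Write a one-line poem about ducks")
```

The model name here mirrors the `model_name` secret parameter above; swapping it changes which model the endpoint serves the request with.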
-- Prompt requesting JSON structured output for ChatGPT, Llama 3, etc.
SET VARIABLE openprompt_model_name = 'llama3.2:3b';
SELECT open_prompt('I want ice cream', json_schema := '{
"type": "object",
"properties": {
"summary": { "type": "string" },
"sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
},
"required": ["summary", "sentiment"],
"additionalProperties": false
}');
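Since the structured response arrives as plain text containing JSON, it can be fed straight into DuckDB's JSON operators. A sketch, assuming the model honored the schema and returned valid JSON:

```sql
-- Cast the response to JSON and extract individual fields
-- (assumes the model returned valid JSON matching the schema)
SELECT resp->>'summary'   AS summary,
       resp->>'sentiment' AS sentiment
FROM (
    SELECT open_prompt('I want ice cream', json_schema := '{
        "type": "object",
        "properties": {
            "summary": { "type": "string" },
            "sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
        },
        "required": ["summary", "sentiment"],
        "additionalProperties": false
    }')::JSON AS resp
);
```

Smaller models occasionally emit malformed JSON, so wrapping the cast with `TRY_CAST` is a reasonable defensive choice.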
-- Use a custom system prompt to request JSON output from smaller models
SET VARIABLE openprompt_model_name = 'qwen2.5:1.5b';
SELECT open_prompt('I want ice cream.', system_prompt:='Response MUST be JSON with the following schema: {
"type": "object",
"properties": {
"summary": { "type": "string" },
"sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
},
"required": ["summary", "sentiment"],
"additionalProperties": false
}');
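Because open_prompt() is a scalar function, it can also be applied row by row. A sketch against a hypothetical `reviews` table (table and column names are illustrative; each row triggers one API call, so keep the input small):

```sql
-- Classify each row of a hypothetical "reviews" table
SELECT id,
       open_prompt(review_text,
                   system_prompt := 'Reply with a single word: pos, neg or neutral')
           AS sentiment
FROM reviews
LIMIT 10;
```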
About open_prompt
Open Prompt Extension
The open_prompt() community extension is shamelessly inspired by the MotherDuck prompt() function, but focused on self-hosted usage.
For examples and instructions, check out the open_prompt() README.
Configuration
Set up the completions API URL configuration with an optional auth token and model name:
SET VARIABLE openprompt_api_url = 'http://localhost:11434/v1/chat/completions';
SET VARIABLE openprompt_api_token = 'your_api_key_here';
SET VARIABLE openprompt_model_name = 'qwen2.5:0.5b';
Alternatively, the following environment variables can be used at runtime:
OPEN_PROMPT_API_URL='http://localhost:11434/v1/chat/completions'
OPEN_PROMPT_API_TOKEN='your_api_key_here'
OPEN_PROMPT_MODEL_NAME='qwen2.5:0.5b'
OPEN_PROMPT_API_TIMEOUT='30'
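For example, when launching the DuckDB CLI from a shell, the variables can be exported beforehand (a sketch with placeholder values pointing at a local Ollama server):

```shell
# Point the extension at a local Ollama server (placeholder values)
export OPEN_PROMPT_API_URL='http://localhost:11434/v1/chat/completions'
export OPEN_PROMPT_API_TOKEN='your_api_key_here'
export OPEN_PROMPT_MODEL_NAME='qwen2.5:0.5b'
export OPEN_PROMPT_API_TIMEOUT='30'
```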
For persistent usage across sessions, configure the parameters using a DuckDB SECRET:
CREATE PERSISTENT SECRET IF NOT EXISTS open_prompt (
TYPE open_prompt,
PROVIDER config,
api_token 'your-api-token',
api_url 'http://localhost:11434/v1/chat/completions',
model_name 'qwen2.5:0.5b',
api_timeout '30'
);
Added Functions
| function_name | function_type | description | comment | examples |
|---|---|---|---|---|
| open_prompt | scalar | Send a prompt to an LLM API using a specific model | NULL | [open_prompt('Explain SQL', 'gpt-4')] |
| open_prompt | scalar | Send a prompt to an LLM API with a system prompt and structured output | NULL | [open_prompt('Hello', 'gpt-4', '{}', 'You are helpful')] |
| open_prompt | scalar | Send a prompt to an LLM API with structured JSON output | NULL | [open_prompt('Extract name', 'gpt-4', '{"type":"object"}')] |
| open_prompt | scalar | Send a prompt to an OpenAI-compatible LLM API and return the response | NULL | [open_prompt('What is DuckDB?')] |
| set_api_timeout | scalar | Set the API timeout in seconds for LLM requests | NULL | [set_api_timeout('30')] |
| set_api_token | scalar | Set the API token for LLM authentication | NULL | [set_api_token('sk-…')] |
| set_api_url | scalar | Set the API URL for LLM endpoint | NULL | [set_api_url('https://api.openai.com/v1/chat/completions')] |
| set_model_name | scalar | Set the default model name for LLM requests | NULL | [set_model_name('gpt-4')] |
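The setter functions listed above offer a third way to configure the extension from inside a session. A sketch chaining them before a prompt, assuming a local Ollama endpoint (URL and model are placeholders):

```sql
-- Configure the endpoint at runtime with the setter functions,
-- then issue a prompt
SELECT set_api_url('http://localhost:11434/v1/chat/completions');
SELECT set_model_name('qwen2.5:0.5b');
SELECT set_api_timeout('30');
SELECT open_prompt('What is DuckDB?') AS response;
```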