open_prompt

Interact with LLMs with a simple DuckDB Extension

Maintainer(s): lmangani, akvlad

Installing and Loading

INSTALL open_prompt FROM community;
LOAD open_prompt;

Example

-- Configure the required parameters to access OpenAI Completions-compatible APIs
D CREATE SECRET IF NOT EXISTS open_prompt (
      TYPE open_prompt,
      PROVIDER config,
      api_token 'your-api-token',
      api_url 'http://localhost:11434/v1/chat/completions',
      model_name 'qwen2.5:0.5b',
      api_timeout '30'
  );

-- Prompt any OpenAI Completions API from your query
D SELECT open_prompt('Write a one-line poem about ducks') AS response;
┌────────────────────────────────────────────────┐
│                    response                    │
│                    varchar                     │
├────────────────────────────────────────────────┤
│ Ducks quacking at dawn, swimming in the light. │
└────────────────────────────────────────────────┘
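
-- open_prompt() is a scalar function, so it can also be applied per row
-- (sketch only: the reviews table and its review_text column are hypothetical)
SELECT review_text,
       open_prompt('Summarize in five words: ' || review_text) AS summary
FROM reviews
LIMIT 3;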

-- Prompt requesting JSON structured output for ChatGPT, Llama 3, etc.
SET VARIABLE openprompt_model_name = 'llama3.2:3b';
SELECT open_prompt('I want ice cream', json_schema := '{
   "type": "object",
   "properties": {
     "summary": { "type": "string" },
     "sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
   },
   "required": ["summary", "sentiment"],
   "additionalProperties": false
 }');

-- Use a custom system prompt to request JSON output from smaller models
SET VARIABLE openprompt_model_name = 'qwen2.5:1.5b';
SELECT open_prompt('I want ice cream.', system_prompt:='Response MUST be JSON with the following schema: {
       "type": "object",
       "properties": {
         "summary": { "type": "string" },
         "sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
       },
       "required": ["summary", "sentiment"],
       "additionalProperties": false
     }');
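
-- Unpack the structured output with DuckDB's built-in JSON functions
-- (sketch: assumes the model actually returned valid JSON matching the schema above)
WITH r AS (
    SELECT open_prompt('I want ice cream', json_schema := '{
       "type": "object",
       "properties": {
         "summary": { "type": "string" },
         "sentiment": { "type": "string", "enum": ["pos", "neg", "neutral"] }
       },
       "required": ["summary", "sentiment"],
       "additionalProperties": false
     }') AS raw
)
SELECT json_extract_string(raw, '$.summary')   AS summary,
       json_extract_string(raw, '$.sentiment') AS sentiment
FROM r;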

About open_prompt

Open Prompt Extension

The open_prompt() community extension is shamelessly inspired by the MotherDuck prompt() function, but focused on self-hosted usage.

For examples and instructions, check out the open_prompt() README.

Configuration

Set up the Completions API URL, with an optional auth token and model name:

SET VARIABLE openprompt_api_url = 'http://localhost:11434/v1/chat/completions';
SET VARIABLE openprompt_api_token = 'your_api_key_here';
SET VARIABLE openprompt_model_name = 'qwen2.5:0.5b';

Alternatively, the following environment variables can be set at runtime:

OPEN_PROMPT_API_URL='http://localhost:11434/v1/chat/completions'
OPEN_PROMPT_API_TOKEN='your_api_key_here'
OPEN_PROMPT_MODEL_NAME='qwen2.5:0.5b'
OPEN_PROMPT_API_TIMEOUT='30'

For persistent usage, configure the parameters using DuckDB Secrets:

CREATE PERSISTENT SECRET IF NOT EXISTS open_prompt (
      TYPE open_prompt,
      PROVIDER config,
      api_token 'your-api-token',
      api_url 'http://localhost:11434/v1/chat/completions',
      model_name 'qwen2.5:0.5b',
      api_timeout '30'
  );

Added Functions

function_name    function_type  description  comment  example
open_prompt      scalar
set_api_timeout  scalar
set_api_token    scalar
set_api_url      scalar
set_model_name   scalar
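
The set_* helpers offer a third way to configure the extension at query time,
mirroring the SET VARIABLE and environment-variable options above. A minimal
sketch, assuming each setter takes a single string argument (see the
open_prompt() README for the exact signatures):

SELECT set_api_url('http://localhost:11434/v1/chat/completions');
SELECT set_api_token('your_api_key_here');
SELECT set_model_name('qwen2.5:0.5b');
SELECT set_api_timeout('30');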