A Simple Postgres Logger for OpenAI Endpoints
Open Source Python Package
Being able to log LLM traces (i.e. prompts and responses) underpins:
Evaluating the performance of LLM products
Creating synthetic data to fine-tune and improve LLMs
There are many tools for logging LLM calls (Braintrust, Humanloop, Langfuse, etc.), but here's a really simple one:
pip install trelis-openai-logger
from trelis_openai_logger import OpenAI
Create the same OpenAI client you always use, but pass in a Postgres URL.
It’s a two line swap and all of your queries are logged to a Postgres database that you can easily run locally or run cheaply in the cloud (e.g. a Digital Ocean Droplet).
Cheers, Ronan
P.S. 🛠️ (NEW) Trelis Fine-tuning and Evals Seminars - learn more here.
Trelis Links:
🤝 Are you a talented developer? Work for Trelis
💡 Need Technical or Market Assistance? Book a Consult Here
💸 Starting a New Project/Venture? Apply for a Trelis Grant
Setting Up LLM Response Logging with Trelis OpenAI Logger
The trelis-openai-logger library provides a straightforward way to log Large Language Model (LLM) interactions to a Postgres database. This tool helps track prompts, responses, latency, and token usage for production deployments.
Core Functionality
Logs all prompts and responses to Postgres
Records model name, latency, token counts, and metadata
Works with OpenAI-style API endpoints
Supports both local and remote database deployments
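The core pattern is easy to picture in plain Python: wrap each API call, time it, and write one row per request. The sketch below is only an illustration of that idea (an in-memory list stands in for Postgres, and all names are made up), not the library's actual implementation:

```python
import time

# Stand-in for the Postgres table: each dict is one logged row.
LOG_ROWS = []

def logged_call(fn, *, model, messages, **kwargs):
    """Call an OpenAI-style completion function and log the interaction."""
    start = time.monotonic()
    response = fn(model=model, messages=messages, **kwargs)
    latency_s = time.monotonic() - start
    usage = response.get("usage", {})
    LOG_ROWS.append({
        "model": model,
        "messages": messages,
        "response": response,
        "latency_s": latency_s,
        "prompt_tokens": usage.get("prompt_tokens"),
        "completion_tokens": usage.get("completion_tokens"),
        "total_tokens": usage.get("total_tokens"),
    })
    return response

# Fake endpoint so the sketch runs without a network call.
def fake_completion(model, messages, **kwargs):
    return {
        "choices": [{"message": {"role": "assistant", "content": "hi"}}],
        "usage": {"prompt_tokens": 5, "completion_tokens": 1, "total_tokens": 6},
    }

resp = logged_call(
    fake_completion,
    model="gpt-x",
    messages=[{"role": "user", "content": "hello"}],
)
```

The real package does this transparently inside its OpenAI client, so application code does not need a wrapper at all.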
Installation & Basic Usage
from trelis_openai_logger import OpenAI

client = OpenAI(
    postgres_url="postgresql://localhost:5432/llm_logs",
    api_key="your-api-key"  # Optional
)

Database Setup Options
1. Local Postgres
Install Postgres (brew install postgresql@15 on Mac)
Create database: createdb llm_logs
Run migrations using dbmate
Connection string: postgresql://localhost:5432/llm_logs
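For reference, the parts of that connection string (scheme, host, port, database name) can be inspected with the standard library; this is purely illustrative:

```python
from urllib.parse import urlsplit

url = urlsplit("postgresql://localhost:5432/llm_logs")
print(url.scheme)            # postgresql
print(url.hostname)          # localhost
print(url.port)              # 5432
print(url.path.lstrip("/"))  # llm_logs (the database name)
```

A production string would typically also carry a username and password, e.g. postgresql://user:password@host:5432/llm_logs.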
2. DigitalOcean Droplet
Minimum 2GB RAM recommended
Uses cloud-init for automated setup
Runs Postgres directly on virtual machine
Approximate cost: $12/month
3. DigitalOcean Managed Database (~$15/month)
Fully managed Postgres instance
Built-in backups and monitoring
Higher reliability for production use
Connection string provided by DigitalOcean
Data Structure
The logger stores:
Model name
Endpoint used
Input messages
Raw response
Latency
Token counts (prompt, completion, total)
Additional metadata
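A plausible table layout for those fields is sketched below. This is an assumption for illustration only; the package's real schema may differ, so check its migrations before relying on column names:

```python
# Illustrative DDL for a log table covering the fields listed above.
SCHEMA = """
CREATE TABLE IF NOT EXISTS llm_logs (
    id BIGSERIAL PRIMARY KEY,
    model TEXT NOT NULL,
    endpoint TEXT NOT NULL,
    input_messages JSONB NOT NULL,
    raw_response JSONB NOT NULL,
    latency_ms DOUBLE PRECISION,
    prompt_tokens INTEGER,
    completion_tokens INTEGER,
    total_tokens INTEGER,
    metadata JSONB DEFAULT '{}'::jsonb,
    created_at TIMESTAMPTZ DEFAULT now()
);
"""
```

JSONB columns keep the raw messages and response queryable in Postgres without forcing a rigid schema on them.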
Use Cases
Performance monitoring
Quality evaluation
Data collection for fine-tuning
Usage tracking and optimization
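Usage tracking, for instance, amounts to a GROUP BY over the logged rows. Sketched here in Python over a list of row dicts (in practice you would run the equivalent SQL directly against Postgres; the field names are assumptions):

```python
from collections import defaultdict

def usage_by_model(rows):
    """Sum total tokens per model from logged rows."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["model"]] += row.get("total_tokens") or 0
    return dict(totals)

rows = [
    {"model": "gpt-a", "total_tokens": 100},
    {"model": "gpt-a", "total_tokens": 50},
    {"model": "gpt-b", "total_tokens": 30},
]
print(usage_by_model(rows))  # {'gpt-a': 150, 'gpt-b': 30}
```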
The logger is available as an open-source package on GitHub and PyPI, installable via pip install trelis-openai-logger.

