Conversational State
The OpenAI Responses API (v1/responses) is designed for multi-turn conversations where context needs to persist across requests. Plano provides a unified v1/responses API that works with any LLM provider—OpenAI, Anthropic, Azure OpenAI, DeepSeek, or any OpenAI-compatible provider—while automatically managing conversational state for you.
Unlike the traditional Chat Completions API where you manually manage conversation history by including all previous messages in each request, Plano handles state management behind the scenes. This means you can use the Responses API with any model provider, and Plano will persist conversation context across requests—making it ideal for building conversational agents that remember context without bloating every request with full message history.
How It Works
When a client calls the Responses API:
First request: Plano generates a unique resp_id and stores the conversation state (messages, model, provider, timestamp).
Subsequent requests: The client includes the previous_response_id from the previous response. Plano retrieves the stored conversation state, merges it with the new input, and sends the combined context to the LLM.
Response: The LLM sees the full conversation history without the client needing to resend all previous messages.
This pattern dramatically reduces bandwidth and makes it easier to build multi-turn agents—Plano handles the state plumbing so you can focus on agent logic.
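The store-and-merge flow above can be sketched in a few lines. This is not Plano's actual implementation, just an illustration of the pattern: a store keyed by response id, with each new request merged onto the retrieved history.

```python
# Minimal sketch of the state-merging pattern -- not Plano's real code.
import uuid

STORE = {}  # response_id -> list of conversation items

def create_response(new_input, previous_response_id=None):
    """Merge stored history with the new input, as the proxy would."""
    history = STORE.get(previous_response_id, []) if previous_response_id else []
    merged = history + [{"role": "user", "content": new_input}]
    # ... the merged context is what actually gets sent to the LLM ...
    resp_id = f"resp_{uuid.uuid4().hex}"
    # Persist the full context (in practice the assistant reply is stored too).
    STORE[resp_id] = merged
    return resp_id, merged

# First turn: no previous id, so the context is just the new message.
rid, ctx = create_response("My name is Alice")
# Second turn: the stored history is merged in automatically.
rid2, ctx2 = create_response("What is my name?", previous_response_id=rid)
```

The client only ever sends the newest message plus an id; the proxy reconstitutes the rest.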
Example Using OpenAI Python SDK:
from openai import OpenAI

# Point to Plano's Model Proxy endpoint
client = OpenAI(
    api_key="test-key",
    base_url="http://127.0.0.1:12000/v1"
)

# First turn - Plano creates a new conversation state
response = client.responses.create(
    model="claude-sonnet-4-5",  # Works with any configured provider
    input="My name is Alice and I like Python"
)

# Save the response_id for conversation continuity
resp_id = response.id
print(f"Assistant: {response.output_text}")

# Second turn - Plano automatically retrieves previous context
resp2 = client.responses.create(
    model="claude-sonnet-4-5",  # Make sure it's configured in plano_config.yaml
    input="What's my name and which language do I like?",
    previous_response_id=resp_id,
)
print(f"Assistant: {resp2.output_text}")
# Output: "Your name is Alice and your favorite language is Python"
Notice how the second request only includes the new user message—Plano automatically merges it with the stored conversation history before sending to the LLM.
Configuration Overview
State storage is configured in the state_storage section of your plano_config.yaml:
state_storage:
  # Type: memory | postgres
  type: postgres

  # Connection string for postgres type
  # Environment variables are supported using $VAR_NAME or ${VAR_NAME} syntax
  # Replace [USER] and [HOST] with your actual database credentials
  # Variables like $DB_PASSWORD MUST be set before running config validation/rendering
  # Example: replace [USER] with 'myuser' and [HOST] with 'db.example.com'
  connection_string: "postgresql://[USER]:$DB_PASSWORD@[HOST]:5432/postgres"
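The config comments above mention that $VAR_NAME and ${VAR_NAME} are expanded from the environment. Plano's exact expansion logic isn't shown in this document; the sketch below only illustrates the substitution behavior those comments describe.

```python
# Sketch of $VAR / ${VAR} expansion in connection strings.
# Not Plano's actual implementation -- an illustration of the syntax only.
import os
import re

def expand_env(value: str) -> str:
    """Replace $VAR_NAME and ${VAR_NAME} with values from the environment."""
    pattern = re.compile(r"\$\{(\w+)\}|\$(\w+)")
    def repl(m):
        name = m.group(1) or m.group(2)
        # Leave unset variables untouched rather than substituting an empty string.
        return os.environ.get(name, m.group(0))
    return pattern.sub(repl, value)

os.environ["DB_PASSWORD"] = "s3cret"
conn = expand_env("postgresql://myuser:$DB_PASSWORD@db.example.com:5432/postgres")
```

This is why the variable must be set before config validation: an unset variable leaves the literal `$DB_PASSWORD` in the string, which is not a valid credential.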
Plano supports two storage backends:
Memory: Fast, ephemeral storage for development and testing. State is lost when Plano restarts.
PostgreSQL: Durable, production-ready storage with support for Supabase and self-hosted PostgreSQL instances.
Note
If you don’t configure state_storage, conversation state management is disabled. The Responses API will still work, but clients must manually include full conversation history in each request (similar to the Chat Completions API behavior).
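With state storage disabled, the client owns the history and must resend it in full each turn. A minimal sketch of that manual pattern follows; the `create()` call is commented out since it needs a running Plano endpoint, and the assistant reply shown is a placeholder.

```python
# Manual history management when state_storage is not configured.
# Every request must carry the full list -- the payload grows with each turn.
history = []

def add_turn(role, content):
    history.append({"role": role, "content": content})

add_turn("user", "My name is Alice and I like Python")
# response = client.responses.create(model="claude-sonnet-4-5", input=history)
add_turn("assistant", "Nice to meet you, Alice!")  # placeholder reply
add_turn("user", "What is my name?")
# The next request would send all three items, not just the last one.
```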
Memory Storage (Development)
Memory storage keeps conversation state in-memory using a thread-safe HashMap. It’s perfect for local development, demos, and testing, but all state is lost when Plano restarts.
Configuration
Add this to your plano_config.yaml:
state_storage:
  type: memory
That’s it. No additional setup required.
When to Use Memory Storage
Local development and debugging
Demos and proof-of-concepts
Automated testing environments
Single-instance deployments where persistence isn’t critical
Limitations
State is lost on restart
Not suitable for production workloads
Cannot scale across multiple Plano instances
PostgreSQL Storage (Production)
PostgreSQL storage provides durable, production-grade conversation state management. It works with both self-hosted PostgreSQL and Supabase (PostgreSQL-as-a-service), making it ideal for scaling multi-agent systems in production.
Prerequisites
Before configuring PostgreSQL storage, you need:
A PostgreSQL database (version 12 or later)
Database credentials (host, user, password)
The conversation_states table created in your database
Setting Up the Database
Run the SQL schema to create the required table:
-- Conversation State Storage Table
-- This table stores conversational context for the OpenAI Responses API
-- Run this SQL against your PostgreSQL/Supabase database before enabling conversation state storage

CREATE TABLE IF NOT EXISTS conversation_states (
    response_id TEXT PRIMARY KEY,
    input_items JSONB NOT NULL,
    created_at BIGINT NOT NULL,
    model TEXT NOT NULL,
    provider TEXT NOT NULL,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Indexes for common query patterns
CREATE INDEX IF NOT EXISTS idx_conversation_states_created_at
    ON conversation_states(created_at);

CREATE INDEX IF NOT EXISTS idx_conversation_states_provider
    ON conversation_states(provider);

-- Optional: index on updated_at to support automatic cleanup of old conversations
-- Uncomment and adjust to match your retention policy
-- CREATE INDEX IF NOT EXISTS idx_conversation_states_updated_at
--     ON conversation_states(updated_at);

COMMENT ON TABLE conversation_states IS 'Stores conversation history for OpenAI Responses API continuity';
COMMENT ON COLUMN conversation_states.response_id IS 'Unique identifier for the conversation state';
COMMENT ON COLUMN conversation_states.input_items IS 'JSONB array of conversation messages and context';
COMMENT ON COLUMN conversation_states.created_at IS 'Unix timestamp (seconds) when the conversation started';
COMMENT ON COLUMN conversation_states.model IS 'Model name used for this conversation';
COMMENT ON COLUMN conversation_states.provider IS 'LLM provider (e.g., openai, anthropic, bedrock)';
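To make the schema concrete, here is a hypothetical row matching the column types above. The exact shape of the input_items payload Plano stores is not documented here; this only illustrates that the column holds a JSON array of conversation items.

```python
# Hypothetical conversation_states row -- the payload shape is an assumption,
# shown only to illustrate the column types from the schema above.
import json
import time

row = {
    "response_id": "resp_abc123",
    "input_items": json.dumps([
        {"role": "user", "content": "My name is Alice and I like Python"},
        {"role": "assistant", "content": "Nice to meet you, Alice!"},
    ]),
    "created_at": int(time.time()),  # Unix seconds, matching the BIGINT column
    "model": "claude-sonnet-4-5",
    "provider": "anthropic",
}

# Reading the JSONB column back yields the ordered conversation items.
items = json.loads(row["input_items"])
```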
Using psql:
psql $DATABASE_URL -f docs/db_setup/conversation_states.sql
Using Supabase Dashboard:
Log in to your Supabase project
Navigate to the SQL Editor
Copy and paste the SQL from docs/db_setup/conversation_states.sql
Run the query
Configuration
Once the database table is created, configure Plano to use PostgreSQL storage:
state_storage:
  type: postgres
  connection_string: "postgresql://user:password@host:5432/database"
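If a connection fails, it helps to verify the pieces of the string before blaming the database. The standard library can split the URI into its components; the credentials below are the placeholders from the example above.

```python
# Sanity-check a connection string's components with the standard library.
from urllib.parse import urlsplit

parts = urlsplit("postgresql://user:password@host:5432/database")
# parts.username, parts.password, parts.hostname, parts.port, parts.path
# correspond to user, password, host, 5432, and /database respectively.
```

A malformed string (for example, an unencoded `@` in the password) will split into the wrong components, which is usually the fastest way to spot the problem.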
Using Environment Variables
You should never hardcode credentials. Use environment variables instead:
state_storage:
  type: postgres
  connection_string: "postgresql://myuser:$DB_PASSWORD@db.example.com:5432/postgres"
Then set the environment variable before running Plano:
export DB_PASSWORD="your-secure-password"
# Run Plano or config validation
./plano
Warning
Special Characters in Passwords: If your password contains special characters like #, @, or &, you must URL-encode them in the connection string. For example, MyPass#123 becomes MyPass%23123.
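Rather than encoding passwords by hand, you can let Python's standard library do it. This reproduces the example from the warning above; the hostname and username are placeholders.

```python
# URL-encode a password containing special characters before building
# the connection string. '#' becomes '%23', '@' would become '%40', etc.
from urllib.parse import quote_plus

encoded = quote_plus("MyPass#123")
conn = f"postgresql://myuser:{encoded}@db.example.com:5432/postgres"
```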
Supabase Connection Strings
Supabase requires different connection strings depending on your network setup. Most users should use the Session Pooler connection string.
IPv4 Networks (Most Common)
Use the Session Pooler connection string (port 5432):
postgresql://postgres.[PROJECT-REF]:[PASSWORD]@aws-0-[REGION].pooler.supabase.com:5432/postgres
IPv6 Networks
Use the direct connection (port 5432):
postgresql://postgres:[PASSWORD]@db.[PROJECT-REF].supabase.co:5432/postgres
Finding Your Connection String
Go to your Supabase project dashboard
Navigate to Settings → Database → Connection Pooling
Copy the Session mode connection string
Replace [YOUR-PASSWORD] with your actual database password
URL-encode any special characters in the password
Example Configuration
state_storage:
  type: postgres
  connection_string: "postgresql://postgres.myproject:$DB_PASSWORD@aws-0-us-west-2.pooler.supabase.com:5432/postgres"
Then set the environment variable:
# If your password is "MyPass#123", encode it as "MyPass%23123"
export DB_PASSWORD="MyPass%23123"
Troubleshooting
“Table ‘conversation_states’ does not exist”
Run the SQL schema from docs/db_setup/conversation_states.sql against your database.
Connection errors with Supabase
Verify you’re using the correct connection string format (Session Pooler for IPv4)
Check that your password is URL-encoded if it contains special characters
Ensure your Supabase project hasn’t paused due to inactivity (free tier)
Permission errors
Ensure your database user has the following permissions:
GRANT SELECT, INSERT, UPDATE, DELETE ON conversation_states TO your_user;
State not persisting across requests
Verify state_storage is configured in your plano_config.yaml
Check Plano logs for state storage initialization messages
Ensure the client is sending the previous_response_id from the previous response
Best Practices
Use environment variables for credentials: Never hardcode database passwords in configuration files.
Start with memory storage for development: Switch to PostgreSQL when moving to production.
Implement cleanup policies: Prevent unbounded growth by regularly archiving or deleting old conversations.
Monitor storage usage: Track conversation state table size and query performance in production.
Test failover scenarios: Ensure your application handles storage backend failures gracefully.
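For the cleanup-policy practice above, a simple retention job can be scheduled against the updated_at column from the schema; the 30-day window below is an illustrative assumption, not a recommendation.

```sql
-- Illustrative retention policy: delete conversations idle for 30+ days.
-- Adjust the interval to match your own retention requirements.
DELETE FROM conversation_states
WHERE updated_at < NOW() - INTERVAL '30 days';
```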
Next Steps
Learn more about building agents that leverage conversational state
Explore filter chains for enriching conversation context
See the LLM Providers guide for configuring model routing