The OpenAI Agents SDK is OpenAI’s official Python SDK for building agentic applications. Superserve provides a production-ready deployment platform with isolation, persistence, and governance.

Quick Start

1. Install the CLI

curl -fsSL https://superserve.ai/install | sh
2. Create your agent

Create a file called agent.py with your OpenAI Agent:
agent.py
"""
Minimal chatbot built with OpenAI Agents SDK deployed on Superserve.
"""

from agents import Agent, Runner

agent = Agent(
    name="assistant",
    instructions="You are a helpful assistant.",
)

while True:
    try:
        user_input = input()
    except EOFError:
        break
    result = Runner.run_sync(agent, user_input)
    print(result.final_output)
3. Deploy your agent

Log in and deploy your agent:
superserve login
superserve deploy agent.py --name chatbot
4. Set your API key

Configure your OpenAI API key as a secret:
superserve secrets set chatbot OPENAI_API_KEY=sk-...
Secrets are encrypted at rest and injected at the network level. The agent never sees them in logs or LLM context.
5. Run your agent

Start an interactive session:
superserve run chatbot
You > What is the capital of France?

Agent > The capital of France is Paris.

Completed in 1.2s

Configuration Options

The OpenAI Agents SDK supports various configuration options when creating an Agent:
from agents import Agent, ModelSettings

agent = Agent(
    name="assistant",
    instructions="You are a helpful assistant.",
    model="gpt-4o",                   # Model to use
    model_settings=ModelSettings(
        temperature=0.7,              # Sampling temperature
        max_tokens=4096,              # Maximum tokens per response
    ),
)

Model Selection

  • gpt-4o - GPT-4 Omni (recommended)
  • gpt-4-turbo - GPT-4 Turbo
  • gpt-3.5-turbo - GPT-3.5 Turbo (faster, cheaper)
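Because model is just a per-agent string, you can also choose it at construction time. A minimal sketch of cost-aware routing, assuming a crude prompt-length heuristic (pick_model is a hypothetical helper, not part of the SDK):

```python
def pick_model(prompt: str) -> str:
    """Route short prompts to a cheaper model (illustrative heuristic only)."""
    return "gpt-3.5-turbo" if len(prompt) < 80 else "gpt-4o"

# Usage sketch:
# agent = Agent(name="assistant", instructions="...", model=pick_model(user_input))
```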

Adding Tools

The OpenAI Agents SDK supports function calling for tools. Here’s an example with a custom tool:
agent.py
from agents import Agent, Runner, function_tool

@function_tool
def get_weather(location: str) -> str:
    """Get the current weather for a location.
    
    Args:
        location: The city name
    
    Returns:
        Weather description
    """
    # Your weather API logic here
    return f"The weather in {location} is sunny."

agent = Agent(
    name="weather-assistant",
    instructions="You are a helpful weather assistant.",
    tools=[get_weather],
)

while True:
    try:
        user_input = input()
    except EOFError:
        break
    result = Runner.run_sync(agent, user_input)
    print(result.final_output)
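Tool arguments come straight from the model, so a little normalization before calling an external API can help. A sketch, assuming a hypothetical normalize_location helper you would call as the first line of get_weather:

```python
def normalize_location(raw: str) -> str:
    """Collapse extra whitespace and title-case a model-supplied city name
    (hypothetical helper; adapt to whatever your weather API expects)."""
    return " ".join(raw.split()).title()
```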

Async Support

For better performance with I/O operations, use the async runner:
agent.py
import asyncio
from agents import Agent, Runner

agent = Agent(
    name="assistant",
    instructions="You are a helpful assistant.",
)

async def main():
    while True:
        try:
            # Read stdin in a worker thread so the event loop stays free
            user_input = await asyncio.to_thread(input)
        except EOFError:
            break
        result = await Runner.run(agent, user_input)
        print(result.final_output)

asyncio.run(main())
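The async runner also makes it easy to fan out several prompts concurrently. A minimal sketch; run_many is a hypothetical helper that accepts any awaitable runner (such as Runner.run):

```python
import asyncio

async def run_many(run, agent, prompts):
    """Run several prompts against one agent concurrently; results keep input order."""
    return await asyncio.gather(*(run(agent, p) for p in prompts))

# Usage sketch (inside an async function):
# results = await run_many(Runner.run, agent, ["q1", "q2", "q3"])
```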

Deployment Configuration

Create a superserve.yaml file for advanced deployment options:
superserve.yaml
name: chatbot
command: python agent.py
secrets:
  - OPENAI_API_KEY
ignore:
  - "*.pyc"
  - __pycache__
  - .git
Then deploy with:
superserve deploy

Dependencies

Create a requirements.txt with your dependencies:
requirements.txt
openai-agents
requests
python-dotenv
Or use pyproject.toml:
pyproject.toml
[project]
name = "my-chatbot"
version = "0.1.0"
dependencies = [
    "openai-agents",
    "requests",
    "python-dotenv",
]
Superserve automatically installs dependencies during deployment.

Session Persistence

The /workspace directory persists across turns and restarts. Here’s an example that saves conversation history:
agent.py
import json
from pathlib import Path
from agents import Agent, Runner

# Persistent storage
WORKSPACE = Path("/workspace")
HISTORY_FILE = WORKSPACE / "conversation_history.json"

def save_message(role: str, content: str):
    """Save message to persistent storage."""
    history = []
    if HISTORY_FILE.exists():
        history = json.loads(HISTORY_FILE.read_text())
    history.append({"role": role, "content": content})
    HISTORY_FILE.write_text(json.dumps(history, indent=2))

agent = Agent(
    name="assistant",
    instructions="You are a helpful assistant with memory.",
)

# Detect a resumed session
if HISTORY_FILE.exists():
    print("Resuming previous conversation...")

while True:
    try:
        user_input = input()
    except EOFError:
        break
    
    save_message("user", user_input)
    result = Runner.run_sync(agent, user_input)
    print(result.final_output)
    save_message("assistant", result.final_output)
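Saving messages alone doesn't give the agent memory; the saved history also has to reach the model. One way, assuming a hypothetical build_prompt helper that folds recent turns into the input string (the SDK also accepts richer input formats):

```python
def build_prompt(history: list[dict], user_input: str, max_turns: int = 10) -> str:
    """Prefix the new message with the most recent saved turns."""
    lines = [f"{m['role']}: {m['content']}" for m in history[-max_turns:]]
    lines.append(f"user: {user_input}")
    return "\n".join(lines)

# Usage sketch, replacing the Runner call above:
# history = json.loads(HISTORY_FILE.read_text()) if HISTORY_FILE.exists() else []
# result = Runner.run_sync(agent, build_prompt(history, user_input))
```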

Multi-Agent Systems

Build multi-agent systems with the OpenAI Agents SDK:
agent.py
from agents import Agent, Runner

researcher = Agent(
    name="researcher",
    instructions="You research topics and provide detailed information.",
)

writer = Agent(
    name="writer",
    instructions="You write clear, engaging content based on research.",
)

while True:
    try:
        user_input = input()
    except EOFError:
        break
    
    # Research phase
    research_result = Runner.run_sync(researcher, user_input)
    
    # Writing phase
    write_prompt = f"Based on this research: {research_result.final_output}\n\nWrite a summary."
    write_result = Runner.run_sync(writer, write_prompt)
    
    print(write_result.final_output)
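The sequential pipeline above can be factored into a small function that takes the runner as a parameter, which keeps it testable without API calls. A sketch (research_then_write is a hypothetical helper; the SDK also supports handing control between agents via handoffs for more dynamic routing):

```python
def research_then_write(run, researcher, writer, topic):
    """Chain two agents. `run(agent, prompt)` is any callable returning an
    object with a final_output attribute (e.g. Runner.run_sync)."""
    research = run(researcher, topic)
    prompt = f"Based on this research: {research.final_output}\n\nWrite a summary."
    return run(writer, prompt)

# Usage sketch:
# print(research_then_write(Runner.run_sync, researcher, writer, user_input).final_output)
```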

Troubleshooting

If the agent fails to start, make sure you have a requirements.txt or pyproject.toml with openai-agents listed, then redeploy:
superserve deploy agent.py --name chatbot
If requests to OpenAI fail, set your OpenAI API key as a secret:
superserve secrets set chatbot OPENAI_API_KEY=sk-...
To investigate further, check the agent logs:
superserve sessions list
superserve sessions logs <session-id>
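A quick local preflight before redeploying can catch the missing-dependency case early; has_agents_dependency is a hypothetical helper, not part of the Superserve CLI:

```python
from pathlib import Path

def has_agents_dependency(project_dir: str = ".") -> bool:
    """Return True if openai-agents is declared in requirements.txt or pyproject.toml."""
    for name in ("requirements.txt", "pyproject.toml"):
        f = Path(project_dir) / name
        if f.exists() and "openai-agents" in f.read_text():
            return True
    return False
```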

Next Steps

  • Core Concepts - Learn about isolation, persistence, and credentials
  • CLI Reference - Explore deployment options and CLI commands
  • Secrets Management - Manage API keys and environment variables
  • Session Management - Work with persistent sessions
