ProactiveAgent Documentation


What is ProactiveAgent?

ProactiveAgent is an open-source Python library that adds time-awareness to your AI agents. Unlike traditional agents that only respond when prompted, ProactiveAgent creates AI agents that can decide when to speak based on intelligent timing and context analysis.

Your agents become truly conversational - they understand when to follow up, when to wait, and when to initiate conversations naturally.

Key Features

  • Intelligent Decision Making - Multi-factor decision engines that determine when to respond
  • Smart Timing - Dynamic sleep calculators that adapt to conversation patterns
  • Fully Customizable - Mix and match decision engines and sleep calculators
  • Production Ready - Robust, tested, and ready for real-world applications
  • Easy Integration - Simple API that works with any LLM provider

How It Works

The Decision Cycle: Wake → Decide → Respond → Sleep. The agent wakes on a timer, decides whether the current context warrants a message, responds if it does, then sleeps for a dynamically calculated interval before waking again.
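
The sketch below illustrates this loop in code. It is a conceptual example only, not ProactiveAgent's internal implementation: decision_engine, sleep_calculator, llm, and context are hypothetical stand-ins for the pluggable decision engines and sleep calculators described above.

Python
import time

# Conceptual sketch of the wake → decide → respond → sleep cycle.
# Every object here is a hypothetical stand-in, not a ProactiveAgent internal.
def run_cycle(decision_engine, sleep_calculator, llm, context, on_response):
    while context.active:
        # Wake: take a snapshot of the current conversation context
        snapshot = context.snapshot()

        # Decide: a multi-factor decision engine judges whether to speak now
        if decision_engine.should_respond(snapshot):
            # Respond: generate a reply and hand it to the registered callback
            on_response(llm.generate(snapshot))

        # Sleep: a dynamic sleep calculator picks the next wake-up interval
        time.sleep(sleep_calculator.next_interval(snapshot))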

Quick Start

Get up and running in minutes:

Installation

Bash
pip install proactiveagent

Basic Usage

Python
import time
from proactiveagent import ProactiveAgent, OpenAIProvider

# Create a proactive agent
agent = ProactiveAgent(
    provider=OpenAIProvider(model="gpt-5-nano"),
    system_prompt="You are a casual bored teenager. Answer like you're texting a friend",
    decision_config={
        'wake_up_pattern': "Use the pace of a normal text chat",
    },
)

# Add a response callback
agent.add_callback(lambda response: print(f"AI: {response}"))

# Start the agent's background loop so it can decide when to speak
agent.start()

# Chat with your proactive agent
while True:
    message = input("You: ").strip()
    if message.lower() == 'quit':
        break
    agent.send_message(message)
    time.sleep(1)  # brief pause before prompting again

agent.stop()

That's it! Your agent now has intelligent timing and will respond naturally based on the conversation flow.

Advanced Features

  • Runtime Configuration Updates - Modify agent parameters dynamically without restarting
  • Comprehensive Monitoring - Observe decision-making processes and timing patterns through callback mechanisms (see the sketch after this list)
  • Context Management System - Programmatically manage conversation state and metadata
  • Provider-Agnostic Architecture - Compatible with multiple LLM providers including OpenAI, Anthropic, and local models
  • Asynchronous Operations - Native async/await support for high-performance concurrent applications
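
For example, the callback mechanism from the Quick Start can double as a lightweight monitor. The sketch below is a minimal example that assumes only the constructor, add_callback, and start APIs shown above; the timing_log list and monitor function are ordinary application code, not part of the library.

Python
import time
from proactiveagent import ProactiveAgent, OpenAIProvider

agent = ProactiveAgent(
    provider=OpenAIProvider(model="gpt-5-nano"),
    system_prompt="You are a casual bored teenager. Answer like you're texting a friend",
    decision_config={
        'wake_up_pattern': "Use the pace of a normal text chat",
    },
)

# Ordinary application code: record when each proactive response arrives
# so the agent's timing pattern can be inspected afterwards.
timing_log = []

def monitor(response):
    timing_log.append((time.time(), response))
    print(f"[{time.strftime('%H:%M:%S')}] AI: {response}")

agent.add_callback(monitor)
agent.start()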

Documentation

Ready to dive deeper? Check out our comprehensive documentation.

Contributing

We welcome contributions! Help us make ProactiveAgent even better.

License

This project is licensed under the BSD 3-Clause License - see the LICENSE file for details.


Made by the community

Star on GitHub · Report Issues · Read the Docs

Maintained by Leonardo Mariga