Quickstart

Get MindReef integrated into your AI agent application in under 5 minutes. By the end of this guide, you'll have full observability into your agent's behavior.

1. Install the SDK

Install the MindReef Python SDK using pip:

pip install mindreef
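
To confirm the install, you can check that the package imports cleanly:

python -c "import mindreef; print('MindReef SDK installed')"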

2. Configure Your API Key

Set your MindReef API key as an environment variable:

export MINDREEF_API_KEY=mr_live_your_api_key_here

You can find your API key in the MindReef dashboard under Settings → API Keys.
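
If you prefer to keep secrets out of your shell profile during local development, you can load the key from a .env file instead. The sketch below assumes you have added python-dotenv to your project; it is not part of the MindReef SDK:

# .env file in your project root (do not commit this file):
# MINDREEF_API_KEY=mr_live_your_api_key_here

import os
from dotenv import load_dotenv

load_dotenv()  # copies values from .env into os.environ
assert os.environ.get("MINDREEF_API_KEY"), "MINDREEF_API_KEY is not set"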

3. Instrument Your Agent

Add the @trace decorator to your agent function and enable auto-instrumentation for your LLM provider:

import asyncio

from mindreef import trace, patch_openai
from openai import AsyncOpenAI

# Enable auto-instrumentation for OpenAI
patch_openai()

# Async OpenAI client used by the agent (created after patching)
client = AsyncOpenAI()

@trace
async def research_agent(query: str) -> str:
    """An agent that researches topics and provides summaries."""

    # This LLM call is automatically captured
    response = await client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You are a research assistant."},
            {"role": "user", "content": query}
        ]
    )

    return response.choices[0].message.content

# Run your agent (asyncio.run starts the event loop for this standalone script)
result = asyncio.run(research_agent("What is quantum computing?"))
print(result)
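
The asyncio.run() call is only needed when running the example as a standalone script. If your agent already lives inside an async framework, await the traced function directly; here is a sketch assuming a FastAPI app (FastAPI is not required by MindReef):

from fastapi import FastAPI

app = FastAPI()

@app.get("/research")
async def research(query: str) -> dict:
    # The route handler already runs inside an event loop,
    # so the traced agent is awaited directly.
    summary = await research_agent(query)
    return {"summary": summary}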

4. View Your Traces

Open the MindReef dashboard to see your traces. Each agent execution creates a trace that captures the decorated function call and every auto-instrumented LLM call made inside it.

Tip: The SDK batches trace data and sends it asynchronously, so it won't slow down your agent execution.
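
For short-lived scripts and batch jobs, you may want to make sure any buffered traces are delivered before the process exits. The sketch below assumes the SDK exposes a flush hook; mindreef.flush() is a hypothetical name, so check the SDK reference for the actual shutdown call:

import atexit
import mindreef

# Hypothetical flush hook: delivers any batched, unsent trace data
# before the interpreter shuts down.
atexit.register(mindreef.flush)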

Next Steps

Now that you have basic tracing working, explore the rest of the MindReef documentation to go deeper.