The SDK provides events for actions that interact with the Smallest platform—ending calls, transferring to humans, and more.

End Call

End the current call when the user is done:
from smallestai.atoms.agent.events import SDKAgentEndCallEvent
from smallestai.atoms.agent.nodes import OutputAgentNode
from smallestai.atoms.agent.tools import function_tool

class MyAgent(OutputAgentNode):
    @function_tool()
    async def end_call(self):
        """End the call when the user says goodbye or is done."""
        await self.send_event(SDKAgentEndCallEvent())
        return None
The LLM will call this when the user says things like “thanks, bye” or “that’s all I needed.”

Transfer Call

Transfer to a human agent or another phone number:
from smallestai.atoms.agent.events import (
    SDKAgentTransferConversationEvent,
    TransferOption,
    TransferOptionType,
)
from smallestai.atoms.agent.nodes import OutputAgentNode

class MyAgent(OutputAgentNode):
    @function_tool()
    async def transfer_to_human(self, reason: str):
        """Transfer the call to a human agent.
        
        Args:
            reason: Why the transfer is needed
        """
        await self.send_event(SDKAgentTransferConversationEvent(
            transfer_call_number="+1234567890",
            transfer_options=TransferOption(
                type=TransferOptionType.COLD_TRANSFER
            ),
            on_hold_music="ringtone"
        ))
        return {"status": "transferring", "reason": reason}

Transfer Options

Type            Behavior
COLD_TRANSFER   Immediately connect to the new number
WARM_TRANSFER   The agent stays on the line while connecting

Hold Music

Value               Sound
"ringtone"          Standard ring tone
"relaxing_sound"    Calm background music
"uplifting_beats"   Upbeat hold music
"none"              Silence
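Before sending a transfer event, it can help to validate the transfer type and hold-music values against the tables above. The helper below is a hypothetical sketch, not part of the SDK; it simply encodes the two value spaces documented here.

```python
# Hypothetical helper (not part of the SDK): validates transfer parameters
# against the value spaces listed in the tables above.
VALID_TRANSFER_TYPES = {"COLD_TRANSFER", "WARM_TRANSFER"}
VALID_HOLD_MUSIC = {"ringtone", "relaxing_sound", "uplifting_beats", "none"}

def validate_transfer(transfer_type: str, on_hold_music: str = "ringtone") -> dict:
    """Return a normalized transfer config, or raise ValueError on bad input."""
    if transfer_type not in VALID_TRANSFER_TYPES:
        raise ValueError(f"unknown transfer type: {transfer_type}")
    if on_hold_music not in VALID_HOLD_MUSIC:
        raise ValueError(f"unknown hold music: {on_hold_music}")
    return {"type": transfer_type, "on_hold_music": on_hold_music}
```

Catching a bad value here surfaces a clear error in your own code instead of a failed transfer mid-call.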

Example: Complete Agent

from smallestai.atoms.agent.nodes import OutputAgentNode
from smallestai.atoms.agent.clients.openai import OpenAIClient
from smallestai.atoms.agent.tools import ToolRegistry, function_tool
from smallestai.atoms.agent.events import SDKAgentEndCallEvent

class SupportAgent(OutputAgentNode):
    def __init__(self):
        super().__init__(name="support-agent")
        self.llm = OpenAIClient(model="gpt-4o-mini")
        
        self.tool_registry = ToolRegistry()
        self.tool_registry.discover(self)
        
        self.context.add_message({
            "role": "system",
            "content": "You are a support agent. Help users with their questions. "
                      "Use end_call when they're done."
        })
    
    @function_tool()
    def get_order_status(self, order_id: str):
        """Look up an order's status.
        
        Args:
            order_id: Order ID like "ORD-12345"
        """
        # Your implementation
        return {"status": "shipped", "eta": "Tomorrow"}
    
    @function_tool()
    async def end_call(self):
        """End the call when the user says goodbye."""
        await self.send_event(SDKAgentEndCallEvent())
        return None
    
    async def generate_response(self):
        response = await self.llm.chat(
            messages=self.context.messages,
            stream=True,
            tools=self.tool_registry.get_schemas()
        )
        
        tool_calls = []
        async for chunk in response:
            if chunk.content:
                yield chunk.content
            if chunk.tool_calls:
                tool_calls.extend(chunk.tool_calls)
        
        if tool_calls:
            results = await self.tool_registry.execute(tool_calls=tool_calls, parallel=True)
            
            self.context.add_messages([
                {
                    "role": "assistant",
                    "content": "",
                    "tool_calls": [
                        {"id": tc.id, "type": "function", "function": {"name": tc.name, "arguments": str(tc.arguments)}}
                        for tc in tool_calls
                    ]
                },
                *[{"role": "tool", "tool_call_id": tc.id, "content": str(result)}
                  for tc, result in zip(tool_calls, results)]
            ])
            
            final = await self.llm.chat(messages=self.context.messages, stream=True)
            async for chunk in final:
                if chunk.content:
                    yield chunk.content
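The message bookkeeping inside generate_response can be illustrated on its own: after the tools run, the agent appends one assistant message carrying the tool calls, then one "tool" message per result, so the follow-up LLM call sees the full round trip. The sketch below uses plain OpenAI-style dicts; ToolCall is a hypothetical stand-in for the SDK's tool-call objects (id / name / arguments fields).

```python
# Stand-in sketch of the tool-call message round trip from generate_response.
# ToolCall is a hypothetical stand-in for the SDK's tool-call objects.
from dataclasses import dataclass

@dataclass
class ToolCall:
    id: str
    name: str
    arguments: dict

def tool_round_trip_messages(tool_calls, results):
    """Build the assistant tool_calls message plus one tool message per result."""
    assistant = {
        "role": "assistant",
        "content": "",
        "tool_calls": [
            {"id": tc.id, "type": "function",
             "function": {"name": tc.name, "arguments": str(tc.arguments)}}
            for tc in tool_calls
        ],
    }
    tool_msgs = [
        {"role": "tool", "tool_call_id": tc.id, "content": str(result)}
        for tc, result in zip(tool_calls, results)
    ]
    return [assistant, *tool_msgs]
```

Each tool message's tool_call_id must match the id in the assistant message, or the follow-up completion will reject the history.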

Tips

- Tell the LLM when to end: “Call end_call when the user says goodbye or thanks.”
- Even if the action just sends an event, return something (even None) so the LLM knows it succeeded.
- Track transfer reasons to identify patterns and improve your agent.
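For the last tip, a minimal sketch of tracking transfer reasons, assuming you record the reason string each time transfer_to_human runs. This is not SDK functionality, just stdlib bookkeeping you could drop into your agent.

```python
# Minimal transfer-reason tracking (not SDK functionality): normalize each
# reason string and count occurrences to spot recurring failure patterns.
from collections import Counter

transfer_reasons = Counter()

def record_transfer(reason: str) -> None:
    transfer_reasons[reason.strip().lower()] += 1

record_transfer("billing dispute")
record_transfer("Billing Dispute")
record_transfer("angry caller")
# transfer_reasons.most_common(1) -> [("billing dispute", 2)]
```

A recurring top reason is a strong signal for a new tool or a better system prompt.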