Stream-First Architecture
Watch agents think in real time
A stream-first architecture in which agents yield each execution step as it runs, making their reasoning transparent. See every thought, decision, and action as it happens.
Real-Time Agent Execution
Watch every step unfold as your agent works
import asyncio

from cogency.agent import Agent
from cogency.llm import GeminiLLM

agent = Agent(
    name="StreamAgent",
    llm=GeminiLLM(api_key="your-key")
)

# Stream the agent's execution in real time
async def main():
    async for chunk in agent.stream("What is 127 * 43?"):
        print(chunk)

asyncio.run(main())
Live Agent Output
PLAN: I need to calculate 127 * 43. I'll use the calculator tool.
REASON: This is a simple multiplication that I can solve with tools.
ACT: Using calculator tool with 127 * 43...
REFLECT: Calculator returned 5461. This seems correct.
RESPOND: The answer is 5461.
Rich Stream Types
Execution Streams
- Planning: Goal decomposition and step planning
- Reasoning: Analysis and decision-making process
- Actions: Tool calls and external interactions
- Reflection: Self-evaluation and error correction
Monitoring Streams
- Metrics: Performance and usage statistics
- Traces: Full execution traces with timing
- Errors: Exception details and stack traces
- Events: Lifecycle and state-change events
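The sketch below shows one way these chunks might be routed as they arrive: execution chunks to the console, monitoring chunks to a logger. The chunk shape used here (a dict with "type" and "content" keys) is an assumption for illustration only, not a documented cogency schema; adapt it to whatever your version of agent.stream() actually yields.

import asyncio
import logging

from cogency.agent import Agent
from cogency.llm import GeminiLLM

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("agent-stream")

agent = Agent(name="StreamAgent", llm=GeminiLLM(api_key="your-key"))

async def run(query: str) -> None:
    async for chunk in agent.stream(query):
        # Assumed chunk shape (illustrative only):
        #   {"type": "plan" | "reason" | "act" | "reflect" | "respond"
        #            | "metrics" | "error" | ..., "content": str}
        kind = chunk.get("type", "unknown")
        if kind in ("plan", "reason", "act", "reflect", "respond"):
            # Execution streams: show the agent's thinking as it happens
            print(f"[{kind.upper()}] {chunk.get('content', '')}")
        elif kind == "metrics":
            # Monitoring streams: hand off to your logging/metrics pipeline
            log.info("metrics: %s", chunk.get("content"))
        elif kind == "error":
            log.error("agent error: %s", chunk.get("content"))
        else:
            log.debug("event: %s", chunk)

asyncio.run(run("What is 127 * 43?"))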
Why Streaming Matters
Full Transparency
See exactly how your agent reasons through problems, making debugging and optimization straightforward.
Instant Feedback
No waiting for final results: see progress immediately and intervene when needed for a better user experience.
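As a minimal illustration of intervening mid-run (ordinary async iteration, not a dedicated cogency cancellation API), a consumer can simply stop reading the stream once it has what it needs. The "RESPOND" marker check is a hypothetical condition; substitute whatever your chunks actually expose.

import asyncio

from cogency.agent import Agent
from cogency.llm import GeminiLLM

agent = Agent(name="StreamAgent", llm=GeminiLLM(api_key="your-key"))

async def first_answer(query: str) -> str | None:
    # Breaking out of the async for stops consuming further chunks,
    # so callers are never stuck waiting on a long-running agent.
    async for chunk in agent.stream(query):
        text = str(chunk)
        print(text)
        if "RESPOND" in text:  # hypothetical marker; adjust to the real chunk shape
            return text
    return None

print(asyncio.run(first_answer("What is 127 * 43?")))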
Production Monitoring
Built-in observability for production deployments with real-time performance metrics and alerting.
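A hedged sketch of what that monitoring can look like: the timing below is plain Python wrapped around the stream, not a built-in cogency metrics API, and the metric names are placeholders for whatever backend you already run.

import asyncio
import time

from cogency.agent import Agent
from cogency.llm import GeminiLLM

agent = Agent(name="StreamAgent", llm=GeminiLLM(api_key="your-key"))

async def run_with_metrics(query: str) -> dict:
    # Measure per-step latency and total duration while streaming,
    # so the numbers can be shipped to any metrics/alerting backend.
    metrics = {"steps": 0, "errors": 0, "total_seconds": 0.0}
    started = time.perf_counter()
    last = started
    try:
        async for chunk in agent.stream(query):
            now = time.perf_counter()
            metrics["steps"] += 1
            print(f"step {metrics['steps']} ({now - last:.3f}s): {chunk}")
            last = now
    except Exception:
        metrics["errors"] += 1
        raise
    finally:
        metrics["total_seconds"] = time.perf_counter() - started
    return metrics

print(asyncio.run(run_with_metrics("What is 127 * 43?")))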
Ready for Real-Time Agents?
Experience the transparency and control of streaming AI execution