cogency.agent

Classes

Agent

Magical 6-line DX that just works.

Args:
    name: Agent identifier
    llm: Language model instance
    tools: Optional list of tools for the agent to use
    trace: Enable execution tracing for debugging (default: True)


Agent(
  self,
  name: str,
  llm: Optional[cogency.llm.base.BaseLLM] = None,
  tools: Optional[List[cogency.tools.base.BaseTool]] = None,
  trace: bool = True,
  memory_dir: str = '.memory',
  prompt_fragments: Optional[Dict[str, Dict[str, str]]] = None,
  default_output_mode: Literal['summary', 'trace', 'dev', 'explain']
    = 'summary'
)
                    

AgentState

dict() -> new empty dictionary
dict(mapping) -> new dictionary initialized from a mapping object's
    (key, value) pairs
dict(iterable) -> new dictionary initialized as if via:
    d = {}
    for k, v in iterable:
        d[k] = v
dict(**kwargs) -> new dictionary initialized with the name=value pairs
    in the keyword argument list.  For example:  dict(one=1, two=2)


AgentState(self, /, *args, **kwargs)
                    

BaseLLM

Base class for all LLM implementations in the cogency framework.

All LLM providers support:
- Streaming execution for real-time output
- Key rotation for high-volume usage
- Rate limiting via yield_interval parameter
- Unified interface across providers
- Dynamic model/parameter configuration


BaseLLM(self, api_key: str = None, key_rotator=None, **kwargs)
                    

BaseTool

Base class for all tools in the cogency framework.


BaseTool(self, name: str, description: str)
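Subclasses pass a name and description up to the base constructor. The sketch below illustrates that pattern with a stand-in `BaseTool` (the real base class lives in `cogency.tools.base`; the `run` method name and the `EchoTool` example are assumptions for illustration):

```python
from abc import ABC, abstractmethod


class BaseTool(ABC):
    """Stand-in for cogency.tools.base.BaseTool, matching the
    documented (name, description) constructor."""

    def __init__(self, name: str, description: str):
        self.name = name
        self.description = description

    @abstractmethod
    def run(self, **kwargs):
        """Execute the tool (the real method name may differ)."""


class EchoTool(BaseTool):
    """Hypothetical tool that echoes its input back."""

    def __init__(self):
        super().__init__(name="echo", description="Echo the input text back.")

    def run(self, text: str) -> str:
        return text


tool = EchoTool()
print(tool.name, "-", tool.description)
```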
                    

CircuitOpenError

Circuit breaker is open.


CircuitOpenError(self, /, *args, **kwargs)
                    

Context

Agent operational context.


Context(
  self,
  current_input: str,
  messages: List[Dict[str, str]] = None,
  tool_results: Optional[List[Dict[str, Any]]] = None,
  max_history: Optional[int] = None
)
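A minimal sketch of this shape, assuming `max_history` caps the message list by dropping the oldest entries (the trimming behavior and `add_message` helper are assumptions, not documented API):

```python
from typing import Any, Dict, List, Optional


class Context:
    """Sketch mirroring the documented Context signature."""

    def __init__(
        self,
        current_input: str,
        messages: Optional[List[Dict[str, str]]] = None,
        tool_results: Optional[List[Dict[str, Any]]] = None,
        max_history: Optional[int] = None,
    ):
        self.current_input = current_input
        self.messages = messages or []
        self.tool_results = tool_results or []
        self.max_history = max_history

    def add_message(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})
        if self.max_history is not None:
            # Keep only the most recent max_history messages (assumption).
            self.messages = self.messages[-self.max_history:]


ctx = Context("What is 2 + 2?", max_history=2)
ctx.add_message("user", "What is 2 + 2?")
ctx.add_message("assistant", "4")
ctx.add_message("user", "Thanks!")
print(len(ctx.messages))  # 2
```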
                    

ExecutionTrace

Lean trace engine - just stores entries with serialization safety.


ExecutionTrace(self)
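One way to read "stores entries with serialization safety" is that entries are coerced to JSON-serializable form on insert; the sketch below shows that interpretation with a stand-in class (the `add` method and entry shape are assumptions):

```python
import json


class ExecutionTrace:
    """Sketch of a lean trace store: non-serializable payloads
    fall back to their repr() so the trace always dumps cleanly."""

    def __init__(self):
        self.entries = []

    def add(self, node: str, data):
        try:
            json.dumps(data)      # already JSON-serializable?
        except (TypeError, ValueError):
            data = repr(data)     # safe string fallback
        self.entries.append({"node": node, "data": data})


trace = ExecutionTrace()
trace.add("plan", {"step": 1})
trace.add("act", object())        # not serializable -> stored as repr
print(len(trace.entries))  # 2
```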
                    

FSMemory

Filesystem-based memory backend. Stores memory artifacts as JSON files in a directory structure. Uses simple text matching for recall operations.


FSMemory(self, memory_dir: str = '.memory')
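The docstring describes the backend's behavior (JSON files on disk, text-matching recall) well enough to sketch; the `memorize`/`recall` method names below are assumptions, not the documented API:

```python
import json
import tempfile
from pathlib import Path


class FSMemory:
    """Sketch of a filesystem memory backend: one JSON file per
    artifact, recall via naive substring matching."""

    def __init__(self, memory_dir: str = ".memory"):
        self.memory_dir = Path(memory_dir)
        self.memory_dir.mkdir(parents=True, exist_ok=True)

    def memorize(self, key: str, content: str) -> None:
        path = self.memory_dir / f"{key}.json"
        path.write_text(json.dumps({"key": key, "content": content}))

    def recall(self, query: str) -> list:
        hits = []
        for path in self.memory_dir.glob("*.json"):
            artifact = json.loads(path.read_text())
            if query.lower() in artifact["content"].lower():
                hits.append(artifact)
        return hits


with tempfile.TemporaryDirectory() as tmp:
    mem = FSMemory(memory_dir=tmp)
    mem.memorize("note1", "The user prefers dark mode.")
    mem.memorize("note2", "Deploy target is eu-west-1.")
    results = mem.recall("dark mode")
    print(len(results))  # 1
```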
                    

RateLimitedError

Request was rate limited.


RateLimitedError(self, /, *args, **kwargs)
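This module exposes two transient-failure exceptions, CircuitOpenError and RateLimitedError. A common handling pattern (not cogency's documented one) is to retry rate-limited calls with backoff while letting circuit-open errors propagate immediately; the exception classes below are stubs for illustration:

```python
import time


class RateLimitedError(Exception):
    """Stub of cogency's RateLimitedError."""


class CircuitOpenError(Exception):
    """Stub of cogency's CircuitOpenError."""


def call_with_retry(fn, retries: int = 3, backoff: float = 0.01):
    """Retry only rate-limited calls, with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except RateLimitedError:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * 2 ** attempt)


calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitedError("slow down")
    return "ok"

print(call_with_retry(flaky))  # ok
```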
                    

ToolRegistry

Auto-discovery registry for tools.


ToolRegistry(self, /, *args, **kwargs)
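"Auto-discovery" typically means the registry finds tools without manual registration, e.g. by scanning subclasses. The sketch below shows one such mechanism with stub classes; cogency's actual discovery strategy and method names are not documented here:

```python
class BaseTool:
    """Minimal stand-in for cogency's BaseTool."""
    def __init__(self, name: str, description: str):
        self.name = name
        self.description = description


class ToolRegistry:
    """Sketch of auto-discovery: collect every BaseTool subclass."""

    @staticmethod
    def discover():
        return [cls() for cls in BaseTool.__subclasses__()]


class Calculator(BaseTool):
    def __init__(self):
        super().__init__("calculator", "Evaluate arithmetic expressions.")


class WebSearch(BaseTool):
    def __init__(self):
        super().__init__("web_search", "Search the web for a query.")


tools = ToolRegistry.discover()
print(sorted(t.name for t in tools))  # ['calculator', 'web_search']
```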
                    

Tracer

Handles formatting and output of execution traces.


Tracer(self, trace: cogency.types.ExecutionTrace)
                    

Workflow

Abstracts LangGraph complexity for magical Agent DX.


Workflow(
  self,
  llm,
  tools,
  memory: cogency.memory.base.BaseMemory,
  routing_table: Optional[Dict] = None,
  prompt_fragments: Optional[Dict[str, Dict[str, str]]] = None
)
                    

Functions

auto_detect_llm

Auto-detect LLM provider from environment variables.

Fallback chain:
1. OpenAI
2. Anthropic
3. Gemini
4. Grok
5. Mistral

Returns:
    BaseLLM: Configured LLM instance

Raises:
    RuntimeError: If no API keys found for any provider.


auto_detect_llm() -> cogency.llm.base.BaseLLM
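The documented fallback order can be sketched as a first-match scan over environment variables. The variable names below are assumptions (conventional names for each provider), and the sketch returns a provider label rather than a configured BaseLLM:

```python
import os

# Documented fallback order; env-var names are assumptions.
PROVIDER_ENV_VARS = [
    ("openai", "OPENAI_API_KEY"),
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("gemini", "GEMINI_API_KEY"),
    ("grok", "GROK_API_KEY"),
    ("mistral", "MISTRAL_API_KEY"),
]


def auto_detect_provider(env=None) -> str:
    """Return the first provider whose API key is present."""
    env = os.environ if env is None else env
    for provider, var in PROVIDER_ENV_VARS:
        if env.get(var):
            return provider
    raise RuntimeError("No API keys found for any provider.")


print(auto_detect_provider({"GEMINI_API_KEY": "sk-test"}))  # gemini
```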
                    

counter

Record counter metric.


counter(name: str, value: float = 1.0, tags: Optional[Dict[str, str]] = None)
                    

get_metrics

Get global metrics collector.


get_metrics() -> cogency.core.metrics.MetricsCollector
                    

histogram

Record histogram metric.


histogram(name: str, value: float, tags: Optional[Dict[str, str]] = None)
                    

with_metrics

Decorator to automatically time function execution.


with_metrics(metric_name: str, tags: Optional[Dict[str, str]] = None)