dllmforge.agent_core

Simple agent core for DLLMForge - Clean LangGraph utilities.

This module provides simple, elegant utilities for creating LangGraph agents, following the pattern established in water_management_agent_simple.py.

Functions

create_basic_agent([system_message, ...])

Create a basic agent with standard setup.

create_basic_tools()

Create basic utility tools for testing.

create_echo_tool()

Create a simple echo tool for testing.

tool(func)

DLLMForge wrapper around LangChain's @tool decorator.

Classes

SimpleAgent([system_message, temperature, ...])

Simple agent class for LangGraph workflows.

dllmforge.agent_core.tool(func)[source]

DLLMForge wrapper around LangChain’s @tool decorator.

This decorator provides a consistent interface for creating tools within the DLLMForge ecosystem while maintaining compatibility with LangChain’s tool system.

Parameters:

func – Function to be converted into a tool

Returns:

Tool function that can be used with SimpleAgent
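The decorator pattern can be sketched with a plain-Python stand-in. The real implementation delegates to LangChain's @tool, so the wrapper and the is_tool flag below are illustrative assumptions, not the actual dllmforge internals:

```python
import functools

def tool(func):
    """Stand-in for dllmforge.agent_core.tool: mark a function as an
    agent tool while preserving its name and docstring."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    wrapper.is_tool = True  # hypothetical flag; the real wrapper is LangChain-based
    return wrapper

@tool
def echo(text: str) -> str:
    """Return the input text, prefixed."""
    return f"Echo: {text}"

result = echo("hello")  # "Echo: hello"; metadata such as echo.__name__ survives
```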

class dllmforge.agent_core.SimpleAgent(system_message: str = None, temperature: float = 0.1, model_provider: str = 'azure-openai', llm=None, enable_text_tool_routing: bool = False, max_tool_iterations: int = 3)[source]

Simple agent class for LangGraph workflows.

Initialize a simple LangGraph agent.

Parameters:
  • system_message – System message for the agent

  • temperature – LLM temperature setting

  • model_provider – LLM provider (“azure-openai”, “openai”, “mistral”)

  • llm – Optional pre-configured LLM instance to use instead of creating one from model_provider

  • enable_text_tool_routing – Whether to enable text-based tool routing

  • max_tool_iterations – Maximum number of tool iterations per query

__init__(system_message: str = None, temperature: float = 0.1, model_provider: str = 'azure-openai', llm=None, enable_text_tool_routing: bool = False, max_tool_iterations: int = 3)[source]

Initialize a simple LangGraph agent.

Parameters:
  • system_message – System message for the agent

  • temperature – LLM temperature setting

  • model_provider – LLM provider (“azure-openai”, “openai”, “mistral”)

  • llm – Optional pre-configured LLM instance to use instead of creating one from model_provider

  • enable_text_tool_routing – Whether to enable text-based tool routing

  • max_tool_iterations – Maximum number of tool iterations per query

add_tool(tool_func: Callable) None[source]

Add a tool to the agent.

Parameters:

tool_func – Function decorated with @tool

add_node(name: str, func: Callable) None[source]

Add a node to the workflow.

Parameters:
  • name – Node name

  • func – Node function

add_edge(from_node: str, to_node: str) None[source]

Add a simple edge between nodes.

Parameters:
  • from_node – Source node

  • to_node – Target node

add_conditional_edge(from_node: str, condition_func: Callable) None[source]

Add a conditional edge.

Parameters:
  • from_node – Source node

  • condition_func – Function that determines routing
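A condition_func receives the workflow state and returns the name of the next node. The sketch below assumes a dict-shaped state with a "messages" list, an illustrative simplification of LangGraph's message state:

```python
def route_after_agent(state: dict) -> str:
    """Route to the tools node when the last message carries tool calls,
    otherwise finish the workflow. The state shape here is hypothetical."""
    last_message = state["messages"][-1]
    return "tools" if last_message.get("tool_calls") else "end"

# Usage pattern: agent.add_conditional_edge("agent", route_after_agent)
```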

create_simple_workflow() None[source]

Create a simple agent -> tools workflow with optional text-based tool routing.
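Conceptually, the compiled workflow threads a state object through the registered nodes along the edges. The minimal runner below is a toy sketch of that mechanic, not the real LangGraph engine:

```python
class MiniWorkflow:
    """Toy graph runner illustrating add_node/add_edge semantics."""

    def __init__(self):
        self.nodes = {}   # name -> node function
        self.edges = {}   # from_node -> to_node

    def add_node(self, name, func):
        self.nodes[name] = func

    def add_edge(self, from_node, to_node):
        self.edges[from_node] = to_node

    def run(self, state, start="agent"):
        node = start
        while node in self.nodes:            # follow edges until no next node
            state = self.nodes[node](state)
            node = self.edges.get(node)
        return state

wf = MiniWorkflow()
wf.add_node("agent", lambda s: s + ["agent ran"])
wf.add_node("tools", lambda s: s + ["tools ran"])
wf.add_edge("agent", "tools")
trace = wf.run([])  # ["agent ran", "tools ran"]
```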

compile(checkpointer=None) None[source]

Compile the workflow.

Parameters:

checkpointer – Optional checkpointer for persisting graph state between runs

process_query(query: str, stream: bool = True) None[source]

Process a query with the agent.

Parameters:
  • query – User query

  • stream – Whether to stream the response

run_interactive() None[source]

Run the agent in interactive mode.

dllmforge.agent_core.create_basic_agent(system_message: str = None, temperature: float = 0.1, model_provider: str = 'azure-openai') SimpleAgent[source]

Create a basic agent with standard setup.

Parameters:
  • system_message – System message for the agent

  • temperature – LLM temperature

  • model_provider – LLM provider (“azure-openai”, “openai”, “mistral”)

Returns:

Configured agent instance

Return type:

SimpleAgent

dllmforge.agent_core.create_echo_tool()[source]

Create a simple echo tool for testing.

dllmforge.agent_core.create_basic_tools() List[Callable][source]

Create basic utility tools for testing.

Returns:

List of tool functions
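The exact tools returned are not specified here; the sketch below shows the kind of list-of-callables shape the return type implies, with hypothetical echo and word-count tools standing in:

```python
from typing import Callable, List

def create_basic_tools() -> List[Callable]:
    """Hypothetical stand-in returning simple test tools."""
    def echo(text: str) -> str:
        """Return the input text unchanged."""
        return text

    def word_count(text: str) -> int:
        """Count whitespace-separated words."""
        return len(text.split())

    return [echo, word_count]

tools = create_basic_tools()
# Each entry could then be registered with SimpleAgent.add_tool in turn.
```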