Agents¶
Agents are the core components that interact with users, execute tasks, and utilize tools. Each agent type is tailored for specific use cases, from conversational assistants to specialized task execution.
Overview¶
Agents in Sikka Agent follow a consistent architecture (a minimal end-to-end sketch follows this list):
- They receive input (messages, queries, or tasks)
- Process this input through context management
- Optionally use tools to gather information or perform actions
- Generate responses through LLM inference
- Maintain conversation history through memory systems
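A minimal sketch of this lifecycle, using the ChatAgent documented below; the get_weather helper is a hypothetical tool added purely for illustration:
from sikkaagent.agents import ChatAgent
from sikkaagent.models import ModelConfigure
from sikkaagent.utils.enums import ModelPlatformType
# A plain callable can be registered as a tool (hypothetical helper for illustration)
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny."
# Configure the language model (Ollama is used throughout this page)
model = ModelConfigure(
    model="llama3.1:8b",
    model_platform=ModelPlatformType.OLLAMA,
    url="http://localhost:11434/v1"
)
# Input -> context management -> optional tool use -> LLM inference -> memory
agent = ChatAgent(
    model=model,
    system_prompt="You are a helpful assistant.",
    tools=[get_weather]
)
response = agent.step("What is the weather in Paris?")
print(response.msgs[0].content)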
Agent Types¶
ChatAgent¶
A versatile agent designed for multi-turn conversations with built-in memory and tool-calling capabilities.
Parameters¶
Parameter | Type | Description | Default |
---|---|---|---|
system_prompt | str or BaseMessage | Instructions defining agent behavior | None |
model | str or ModelConfigure | The language model configuration | 'gpt-4o-mini' |
role_name | str | Name of the assistant's role | 'assistant' |
role_type | RoleType | Type of role (system, user, assistant) | RoleType.ASSISTANT |
token_limit | int | Maximum tokens for context | None |
window_size | int | Maximum number of messages to keep | None |
memory | Memory | Memory system for context retention | New Memory instance |
tools | List[Union[FunctionTool, Callable]] | Tools the agent can use | [] |
knowledge | AgentKnowledge | Knowledge base for information retrieval | None |
add_references | bool | Include traditional RAG in responses | False |
num_references | int | Number of references to include | 5 |
search_knowledge | bool | Include agentic RAG in responses | |
references_format | str | Format for references (json or yaml) | json |
Returns¶
The agent's response object; the generated message content is accessed via response.msgs[0].content, as shown in the examples below.
Code Example¶
from sikkaagent.agents import ChatAgent
from sikkaagent.models import ModelConfigure
from sikkaagent.utils.enums import ModelPlatformType
# Initialize with Ollama
model = ModelConfigure(
model="llama3.1:8b",
model_platform=ModelPlatformType.OLLAMA,
url="http://localhost:11434/v1"
)
agent = ChatAgent(
model=model,
system_prompt="You are a Python expert that provides clear, concise code examples."
)
# Single interaction
response = agent.step("How do I read a CSV file in Python?")
print(response.msgs[0].content)
# Continuing the conversation
response = agent.step("How would I modify this to handle missing values?")
print(response.msgs[0].content)
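The remaining constructor parameters can bound the conversation context. A minimal sketch, with illustrative values, that names the assistant's role and caps the context kept between turns:
# Bound the context the agent keeps between turns (values are illustrative)
bounded_agent = ChatAgent(
    model=model,
    system_prompt="You are a concise technical support assistant.",
    role_name="support",   # name of the assistant's role
    window_size=10,        # keep at most the 10 most recent messages
    token_limit=4096       # cap the tokens used for context
)
response = bounded_agent.step("My installation fails with a missing dependency error.")
print(response.msgs[0].content)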
Adding Knowledge Retrieval¶
Enable the agent to access and reference external information:
from sikkaagent.agents import ChatAgent
from sikkaagent.knowledge import AgentKnowledge
from sikkaagent.document import Document
from sikkaagent.storages.in_memory import InMemoryStorage
from sikkaagent.retrievers.embedder.sentence_transformer import SentenceTransformerEmbedder
# Create embedder for vector search
embedder = SentenceTransformerEmbedder(
model_name="sentence-transformers/all-MiniLM-L6-v2"
)
# Set up in-memory vector database
vector_db = InMemoryStorage(embedder=embedder)
# Create chunking strategy
from sikkaagent.document.chunking.fixed import FixedSizeChunking
chunking_strategy = FixedSizeChunking(chunk_size=500, chunk_overlap=50)
# Initialize knowledge with the vector database
knowledge = AgentKnowledge(
vector_db=vector_db,
chunking_strategy=chunking_strategy,
num_documents=5,
)
# Load documents into the knowledge base
knowledge.load_documents([
Document(
name="doc1",
content="Machine learning is a subset of AI that enables computers to learn from data."
)
])
# Create agent with knowledge capabilities (reuses the model configured in the previous example)
agent = ChatAgent(
model=model,
knowledge=knowledge,
add_references=True, # Add retrieved references to messages
num_references=3, # Number of references to include
references_format="json" # Format for references
)
# Agent will now use the knowledge base
response = agent.step("Explain machine learning")
print(response.msgs[0].content)
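The example above injects retrieved references directly into the prompt (traditional RAG via add_references). The parameter table also lists search_knowledge for agentic RAG, where the agent queries the knowledge base itself. A minimal sketch, assuming the knowledge object defined above:
# Agentic RAG: the agent searches the knowledge base as needed
agentic_agent = ChatAgent(
    model=model,
    knowledge=knowledge,
    search_knowledge=True
)
response = agentic_agent.step("What is machine learning?")
print(response.msgs[0].content)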
TaskSpecifyAgent¶
Refines tasks by adding specificity and additional details.
Parameters¶
Parameter | Type | Description | Default |
---|---|---|---|
model | str or ModelConfigure | The language model configuration | None (uses default) |
task_type | TaskType | Type of task (CODE, TRANSLATION, etc.) | TaskType.CODE |
task_specify_prompt | str or TextPrompt | Custom prompt template | None (uses default) |
word_limit | int | Maximum word limit for specification | 50 |
Returns¶
TextPrompt: The specified/refined task
Code Example¶
from sikkaagent.agents.task_agents import TaskSpecifyAgent
from sikkaagent.models import ModelConfigure
from sikkaagent.utils.enums import ModelPlatformType, TaskType
# Initialize model
model = ModelConfigure(
model="llama3.1:8b",
model_platform=ModelPlatformType.OLLAMA,
url="http://localhost:11434/v1"
)
# Create task agent
task_agent = TaskSpecifyAgent(
model=model,
task_type=TaskType.CODE,
word_limit=50
)
# Use run method (not step)
task = "Create a function to sort a list"
specified_task = task_agent.run(task)
print(specified_task)
# Example output: "Create a Python function that sorts a list of integers
# in ascending order using the built-in sort() method with proper error handling."
TaskPlannerAgent¶
Breaks down complex tasks into manageable subtasks.
Parameters¶
Parameter | Type | Description | Default |
---|---|---|---|
model | str or ModelConfigure | The language model configuration | None (uses default) |
max_task_num | int | Maximum number of subtasks | 5 |
task_prompt_template | str | Custom prompt template | None (uses default) |
Returns¶
TextPrompt: The subtasks generated by the agent
Code Example¶
from sikkaagent.agents.task_agents import TaskPlannerAgent
from sikkaagent.models import ModelConfigure
from sikkaagent.utils.enums import ModelPlatformType
# Initialize model
model = ModelConfigure(
model="llama3.1:8b",
model_platform=ModelPlatformType.OLLAMA,
url="http://localhost:11434/v1"
)
# Create planner
planner = TaskPlannerAgent(
model=model,
max_task_num=5
)
# Get subtasks
subtasks = planner.run("Build a personal website portfolio")
print(subtasks)
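TaskSpecifyAgent and TaskPlannerAgent compose naturally: refine a vague task first, then break the refined task into subtasks. A minimal sketch reusing the task_agent and planner instances created above:
# Refine a vague task, then plan the refined version
vague_task = "Build a personal website portfolio"
refined_task = task_agent.run(vague_task)
subtasks = planner.run(refined_task)
print("Refined task:", refined_task)
print("Subtasks:", subtasks)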
TaskCreationAgent¶
Creates new tasks based on an objective and existing tasks, useful for building task queues.
Parameters¶
Parameter | Type | Description | Default |
---|---|---|---|
role_type | str | Role name of the agent | Required |
objective | str or TextPrompt | The main objective | Required |
model | str or ModelConfigure | The language model configuration | None (uses default) |
message_window_size | int | Number of recent messages to keep | None |
max_task_num | int | Maximum number of tasks to generate | 3 |
Returns¶
List[str]: New tasks that don't overlap with existing tasks
Code Example¶
from sikkaagent.agents.task_agents import TaskCreationAgent
from sikkaagent.models import ModelConfigure
from sikkaagent.utils.enums import ModelPlatformType
# Initialize model
model = ModelConfigure(
model="llama3.1:8b",
model_platform=ModelPlatformType.OLLAMA,
url="http://localhost:11434/v1"
)
# Create task creation agent
task_creator = TaskCreationAgent(
role_type="Data Scientist",
objective="Analyze customer purchase patterns",
model=model,
max_task_num=3
)
# Get tasks based on existing tasks
existing_tasks = ["Clean the dataset", "Create visualizations"]
new_tasks = task_creator.run(existing_tasks)
for i, task in enumerate(new_tasks, 1):
print(f"New Task {i}: {task}")
TaskPrioritizationAgent¶
Prioritizes a list of tasks based on their importance to achieving an objective.
Parameters¶
Parameter | Type | Description | Default |
---|---|---|---|
objective | str or TextPrompt | The main objective | Required |
model | str or ModelConfigure | The language model configuration | None (uses default) |
message_window_size | int | Number of recent messages to keep | None |
Returns¶
List[str]: Prioritized list of tasks
Code Example¶
from sikkaagent.agents.task_agents import TaskPrioritizationAgent
from sikkaagent.models import ModelConfigure
from sikkaagent.utils.enums import ModelPlatformType
# Initialize model
model = ModelConfigure(
model="llama3.1:8b",
model_platform=ModelPlatformType.OLLAMA,
url="http://localhost:11434/v1"
)
# Create task prioritization agent
prioritizer = TaskPrioritizationAgent(
objective="Build a responsive e-commerce website",
model=model
)
# Prioritize tasks
tasks = [
"Set up payment processing",
"Design user interface",
"Create product database",
"Implement search functionality",
"Configure server environment"
]
prioritized_tasks = prioritizer.run(tasks)
for i, task in enumerate(prioritized_tasks, 1):
print(f"Priority {i}: {task}")
Creating Custom Agents¶
Create custom agents by extending the base classes:
from sikkaagent.agents import ChatAgent
class CustomerSupportAgent(ChatAgent):
    def __init__(self, model, knowledge_base, **kwargs):
        super().__init__(
            model=model,
            system_prompt="You are a customer support agent specialized in...",
            **kwargs
        )
        self.knowledge_base = knowledge_base
    def step(self, user_message):
        # Retrieve relevant information from the knowledge base
        kb_results = self.knowledge_base.search(user_message)
        # Augment the user message with the retrieved context
        enhanced_message = f"{user_message}\n\nRelevant information: {kb_results}"
        # Delegate to the parent implementation, which records the message
        # in memory and generates the response
        return super().step(enhanced_message)
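A hypothetical usage sketch, assuming the model and knowledge objects from the earlier examples and that the knowledge base exposes the search method used above:
# Instantiate the custom agent with an existing model and knowledge base
support_agent = CustomerSupportAgent(
    model=model,
    knowledge_base=knowledge
)
response = support_agent.step("How do I reset my account password?")
print(response.msgs[0].content)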
Best Practices¶
- Contextual System Prompts: Define clear, specific instructions in system prompts
- Tool Selection: Only provide relevant tools to the agent
- Memory Management: Clear memory when starting new conversation topics
- Error Handling: Implement proper error handling for tool failures (see the sketch after this list)
- Response Validation: Validate responses for sensitive applications
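A minimal sketch of the memory-management, error-handling, and response-validation practices, using only the API shown on this page; creating a new ChatAgent gives a fresh default Memory instance when the topic changes:
from sikkaagent.agents import ChatAgent
def ask(agent, question):
    """Wrap agent.step with basic error handling and response validation."""
    try:
        response = agent.step(question)
    except Exception as exc:  # e.g. a tool call or model request failed
        return f"Agent error: {exc}"
    content = response.msgs[0].content
    # Response validation: substitute checks appropriate to your application
    if not content.strip():
        return "Agent returned an empty response."
    return content
# Memory management: start a fresh agent (and thus a fresh default Memory)
# when switching to a new conversation topic
billing_agent = ChatAgent(model=model, system_prompt="You answer billing questions.")
print(ask(billing_agent, "Why was I charged twice?"))
shipping_agent = ChatAgent(model=model, system_prompt="You answer shipping questions.")
print(ask(shipping_agent, "Where is my package?"))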