Agent Configuration
Initial Prompt
Set up an agent for customer support queries with Llama 3b.
Output
user_proxy (to system): Configure customer support agent.
system (to user_proxy):
{
  "agent_config": {
    "name": "support_assistant",
    "model": "llama-3b",
    "temperature": 0.7,
    "system_prompt": "You are a customer support specialist.\n- Be helpful and professional\n- Follow company guidelines\n- Escalate when needed"
  },
  "tools": [
    "ticket_system",
    "knowledge_base",
    "escalation_workflow"
  ]
}
user_proxy (to system): Agent configured successfully.
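A minimal sketch of how a configuration like the one above could be loaded and validated in Python, assuming the JSON is saved as agent_config.json; the AgentConfig dataclass, load_agent_config helper, and file name are illustrative and not part of any specific framework.

import json
from dataclasses import dataclass, field

# Hypothetical container mirroring the "agent_config" block in the transcript.
@dataclass
class AgentConfig:
    name: str
    model: str
    temperature: float
    system_prompt: str
    tools: list = field(default_factory=list)

def load_agent_config(path: str) -> AgentConfig:
    # Read the JSON config and map its fields onto the dataclass.
    with open(path) as f:
        raw = json.load(f)
    cfg = raw["agent_config"]
    return AgentConfig(
        name=cfg["name"],
        model=cfg["model"],
        temperature=cfg["temperature"],
        system_prompt=cfg["system_prompt"],
        tools=raw.get("tools", []),
    )

if __name__ == "__main__":
    config = load_agent_config("agent_config.json")  # assumed file name
    print(f"Configured {config.name} on {config.model} with tools: {config.tools}")

Loading the config into a typed structure like this surfaces missing or misnamed keys at startup rather than at request time; the actual agent construction would then pass these fields to whatever framework or model client is in use.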