Configuring a Q&A intelligent customer service agent can be broken down into the following steps:
1. Basic preparations
- Add the LLM (e.g., GPT-3.5) and the Embedding model in [Model Configuration].
- Connect the knowledge base (e.g., Milvus) in [Vector Library Configuration] and associate it with the Embedding model.
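As a sketch, the basic preparations above might correspond to configuration entries like the following. The field names and values are illustrative assumptions, not the platform's actual schema:

```python
# Hypothetical sketch of the basic preparations: an LLM/Embedding model
# entry and a vector-library entry. Field names are illustrative
# assumptions, not the platform's actual configuration format.
model_config = {
    "llm": {"name": "gpt-3.5-turbo", "provider": "openai"},  # LLM used for answering
    "embedding": {"name": "text-embedding-ada-002", "provider": "openai"},
}

vector_library_config = {
    "type": "milvus",                     # knowledge-base backend
    "host": "localhost",
    "port": 19530,
    "collection": "customer_service_kb",  # where knowledge-base documents live
    # The vector library is associated with the Embedding model configured above:
    "embedding_model": "text-embedding-ada-002",
}
```

The key point is the association in the last field: the vector library must use the same Embedding model that was registered in [Model Configuration], so that queries and stored documents are embedded consistently.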
2. Agent construction
- Drag and drop the "start node" to set the initial state (e.g., the messages variable).
- Add an "input node" to receive user questions and store them in messages.
- Configure "Vector Recall Node": Associate a vector library to retrieve relevant content from the knowledge base.
- Add "LLM node": select preconfigured model, set system prompts (e.g. "You are a customer service assistant"), user prompts combining original question and recall results (e.g. {{question}} + {{search_results}})
- Connect the nodes with default edges: input node → vector recall node → LLM node.
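The agent construction step can be sketched as plain Python, modeling each node as a function over a shared state dict. This is a minimal stand-in for the visual graph, not the platform's runtime: the retrieval and LLM calls are stubbed, and all names are illustrative assumptions.

```python
# Sketch of the pipeline: input node -> vector recall node -> LLM node,
# connected by default edges (i.e., run in sequence). Retrieval and the
# LLM call are stubbed out; names are illustrative, not the platform's API.

def input_node(state: dict, question: str) -> dict:
    # Receive the user question and store it in messages.
    state["messages"].append({"role": "user", "content": question})
    return state

def vector_recall_node(state: dict) -> dict:
    question = state["messages"][-1]["content"]
    # Stub: a real node would embed the question and query the vector library.
    state["search_results"] = f"KB passages relevant to: {question}"
    return state

def llm_node(state: dict) -> dict:
    system_prompt = "You are a customer service assistant"
    question = state["messages"][-1]["content"]
    # User prompt combines the original question with the recall results,
    # mirroring the {{question}} + {{search_results}} template.
    user_prompt = f"{question}\n\nContext:\n{state['search_results']}"
    # Stub: a real node would call the configured LLM here.
    answer = f"[{system_prompt}] answer based on: {user_prompt}"
    state["messages"].append({"role": "assistant", "content": answer})
    return state

state = {"messages": []}  # initial state set by the start node
state = input_node(state, "How do I reset my password?")
state = vector_recall_node(state)
state = llm_node(state)
```

Running this leaves the state with the user question, the recalled context, and an assistant reply appended to messages, which is the shape of state a conditional edge or counter node (see the enhancements below) would then inspect.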
3. Enhancements (optional)
- Add a "counter node" to control the number of interaction rounds.
- Set up conditional edges to implement more complex logic (e.g., escalate to a human agent after more than 3 rounds).
- Configure MCP tools for extended functionality, such as work-order creation.
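The counter-plus-conditional-edge enhancement can be sketched as follows. This is a hypothetical illustration of the routing idea, assuming a counter stored in state and a routing function that picks the next node by name; none of these identifiers come from the platform.

```python
# Sketch of the optional enhancements: a counter node tracks interaction
# rounds, and a conditional edge routes to manual (human) service once
# more than 3 rounds have occurred. All names are illustrative assumptions.

def counter_node(state: dict) -> dict:
    # Increment the interaction-round counter on each turn.
    state["rounds"] = state.get("rounds", 0) + 1
    return state

def route_after_counter(state: dict) -> str:
    # Conditional edge: return the name of the next node based on state.
    return "manual_service" if state["rounds"] > 3 else "llm_node"

state = {"rounds": 0}
routes = []
for _ in range(5):  # simulate five interaction rounds
    state = counter_node(state)
    routes.append(route_after_counter(state))
```

The first three rounds route to the LLM node; from the fourth round on, the conditional edge diverts the conversation to manual service, matching the "more than 3 rounds to manual" rule above.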
This answer is based on the article "Lang-Agent: a LangGraph-based platform for visually configuring AI agents".