Building a Customer Support Bot with LangGraph: A Practical Example
Imagine an AI that can handle customer queries with patience, remember past interactions, and keep trying until it solves the problem—like a top-notch support agent. LangGraph, a versatile library from the LangChain team, makes this possible with its stateful, graph-based workflows. In this beginner-friendly guide, we’ll walk you through building a customer support bot using LangGraph that processes user issues, suggests solutions, and maintains context for a seamless experience. With clear code examples, a conversational tone, and practical steps, you’ll create a chatbot ready to tackle customer queries, even if you’re new to coding!
What is a Customer Support Bot in LangGraph?
A customer support bot in LangGraph is an AI application designed to:
- Accept user-reported issues (e.g., “My printer won’t print”).
- Suggest solutions using a language model (like those from OpenAI).
- Check if the solution worked, looping back to try again if needed.
- Store conversation history to provide context-aware responses.
LangGraph’s nodes (tasks), edges (connections), and state (shared data) enable a dynamic workflow that adapts to user needs, making it ideal for interactive support scenarios.
This example builds on LangGraph’s flexibility to create a robust support bot, which you can extend with tools or agent logic. To get started with LangGraph, see Introduction to LangGraph.
What You’ll Build
Our customer support bot will:
1. Take a user’s issue (e.g., “My printer won’t print”).
2. Store the issue in conversation history.
3. Suggest a solution using an AI model, considering the history.
4. Check if the solution resolved the issue (simulated for this example).
5. Loop back to suggest another solution if unresolved, up to three attempts.
6. End when the issue is resolved or attempts are exhausted.
We’ll use LangGraph for the workflow and LangChain for memory, AI, and prompt management.
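Before any LangGraph wiring, the control flow above can be sketched in plain Python. The `suggest` and `check` callables here are stand-in stubs for the AI call and the resolution check we build later, and `run_support_flow` is a throwaway name for this sketch, not a LangGraph API:

```python
def run_support_flow(issue, suggest, check, max_attempts=3):
    """Plain-Python sketch of the bot's loop: suggest, check, retry."""
    history = [("user", issue)]
    solution, resolved, attempt = "", False, 0
    for attempt in range(1, max_attempts + 1):
        solution = suggest(issue, history)  # stand-in for the AI call
        history.append(("bot", solution))
        resolved = check(solution)          # stand-in for the resolution check
        if resolved:
            break
        history.append(("user", "That didn't work"))
    return {"solution": solution, "is_resolved": resolved,
            "attempts": attempt, "history": history}

# Toy run: the second suggestion mentions "ink", so the check passes.
tips = iter(["Restart the printer", "Check the ink levels"])
result = run_support_flow(
    "My printer won't print",
    suggest=lambda issue, history: next(tips),
    check=lambda solution: "ink" in solution.lower(),
)
```

LangGraph's value is replacing this hand-rolled loop with explicit nodes and edges that can be inspected, extended, and persisted.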
Prerequisites
Before starting, ensure you have:
- Python 3.8+: Installed and verified with python --version.
- LangGraph and LangChain: Installed via pip.
- OpenAI API Key: For the language model (or use a free model from Hugging Face).
- Virtual Environment: To manage dependencies.
Install the required packages:
pip install langgraph langchain langchain-openai python-dotenv
Set up your OpenAI API key in a .env file:
echo "OPENAI_API_KEY=your-api-key-here" > .env
For setup details, see Install and Setup and Security and API Keys.
Building the Customer Support Bot
Let’s create a LangGraph workflow for the support bot. We’ll define the state, nodes, edges, and graph, then run it to simulate a customer support interaction.
Step 1: Define the State
The state holds the issue, solution, resolution status, conversation history, and attempt count to manage retries.
from typing import TypedDict
from langchain_core.messages import HumanMessage, AIMessage
class State(TypedDict):
    issue: str                  # User’s issue (e.g., "Printer won't print")
    solution: str               # Suggested solution
    is_resolved: bool           # True if issue is fixed
    conversation_history: list  # List of HumanMessage and AIMessage
    attempt_count: int          # Number of solution attempts
The conversation_history ensures context-aware responses, and attempt_count prevents infinite loops. Learn more at State Management.
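Since every run starts from the same defaults, a small factory helper keeps the initial state in one place. This is a convenience of our own, not a LangGraph API:

```python
def initial_state(issue: str) -> dict:
    """Build a fresh state dict with defaults for a new support session."""
    return {
        "issue": issue,
        "solution": "",
        "is_resolved": False,
        "conversation_history": [],
        "attempt_count": 0,
    }
```

Later, once the graph is compiled, a run becomes `app.invoke(initial_state("My printer won't print"))` instead of repeating the full dict each time.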
Step 2: Create Nodes
We’ll use four nodes:
- process_issue: Stores the user’s issue and initializes the history.
- suggest_solution: Generates a solution using an AI model and history.
- check_resolution: Evaluates if the solution worked (simulated).
- decide_next: Decides whether to end or retry.
from langchain_openai import ChatOpenAI
from langchain.prompts import PromptTemplate
import logging

# Setup logging for debugging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Node 1: Process user issue
def process_issue(state: State) -> State:
    logger.info(f"Processing issue: {state['issue']}")
    if not state["issue"]:
        logger.error("Empty issue")
        raise ValueError("Issue is required")
    state["conversation_history"].append(HumanMessage(content=state["issue"]))
    state["attempt_count"] = 0
    logger.debug(f"Updated history: {state['conversation_history']}")
    return state

# Node 2: Suggest a solution
def suggest_solution(state: State) -> State:
    logger.info("Generating solution")
    try:
        llm = ChatOpenAI(model="gpt-3.5-turbo")
        template = PromptTemplate(
            input_variables=["issue", "history"],
            template="Issue: {issue}\nConversation history: {history}\nSuggest a solution in one sentence."
        )
        history_str = "\n".join([f"{msg.type}: {msg.content}" for msg in state["conversation_history"]])
        chain = template | llm
        solution = chain.invoke({"issue": state["issue"], "history": history_str}).content
        state["solution"] = solution
        state["conversation_history"].append(AIMessage(content=solution))
        state["attempt_count"] += 1
        logger.debug(f"Solution: {solution}")
    except Exception as e:
        logger.error(f"Solution error: {str(e)}")
        state["solution"] = f"Error: {str(e)}"
    return state

# Node 3: Check resolution (simulated)
def check_resolution(state: State) -> State:
    logger.info("Checking resolution")
    # Simulate resolution: assume resolved if solution mentions "ink"
    state["is_resolved"] = "ink" in state["solution"].lower()
    if not state["is_resolved"]:
        state["conversation_history"].append(HumanMessage(content="That didn't work"))
    logger.debug(f"Resolved: {state['is_resolved']}")
    return state

# Node 4: Decide next step
def decide_next(state: State) -> str:
    if state["is_resolved"] or state["attempt_count"] >= 3:
        logger.info("Ending workflow: resolved or max attempts reached")
        return "end"
    logger.info("Looping back to suggest another solution")
    return "suggest_solution"
- process_issue: Validates the issue, adds it to history, and initializes attempt_count.
- suggest_solution: Uses the AI to suggest a solution, considering history, and updates the state.
- check_resolution: Simulates checking if the solution worked by looking for “ink” in the solution.
- decide_next: Decides to end or retry based on resolution or attempts.
For AI integration, see OpenAI Integration.
Step 3: Define Edges
The workflow flows as follows:
- Direct Edges: From process_issue to suggest_solution, then to check_resolution.
- Conditional Edge: From check_resolution, either end or loop back to suggest_solution.
Step 4: Build the Workflow
The graph connects nodes and edges:
from langgraph.graph import StateGraph, END

# Build the graph
graph = StateGraph(State)
graph.add_node("process_issue", process_issue)
graph.add_node("suggest_solution", suggest_solution)
graph.add_node("check_resolution", check_resolution)
graph.add_edge("process_issue", "suggest_solution")
graph.add_edge("suggest_solution", "check_resolution")
graph.add_conditional_edges("check_resolution", decide_next, {
    "end": END,
    "suggest_solution": "suggest_solution"
})
graph.set_entry_point("process_issue")

# Compile the graph
app = graph.compile()
Step 5: Run the Support Bot
Test the bot with a single issue:
from dotenv import load_dotenv

load_dotenv()

# Run the workflow
try:
    result = app.invoke({
        "issue": "My printer won't print",
        "solution": "",
        "is_resolved": False,
        "conversation_history": [],
        "attempt_count": 0
    })
    print("Final Solution:", result["solution"])
    print("Conversation History:", [msg.content for msg in result["conversation_history"]])
except Exception as e:
    logger.error(f"Workflow error: {str(e)}")
Example Output:
Final Solution: Check the ink levels and replace the cartridge if empty.
Conversation History: [
    "My printer won't print",
    "Check the ink levels and replace the cartridge if empty."
]
Step 6: Simulate an Interactive Session
To make the bot interactive, create a loop for multiple interactions:
# Initialize state
state = {
    "issue": "",
    "solution": "",
    "is_resolved": False,
    "conversation_history": [],
    "attempt_count": 0
}

# Interactive loop
print("Welcome to the Support Bot! Describe your issue or type 'exit' to quit.")
while True:
    user_input = input("You: ")
    if user_input.lower() in ["exit", "quit"]:
        break
    state["issue"] = user_input
    result = app.invoke(state)
    print("Support Bot:", result["solution"])
    if result["is_resolved"]:
        print("Issue resolved! Goodbye.")
        break
    if result["attempt_count"] >= 3:
        print("Max attempts reached. Please contact support.")
        break
    state = result  # Update state with new history
    print("That didn't work? Let’s try another solution.")
Example Interaction:
Welcome to the Support Bot! Describe your issue or type 'exit' to quit.
You: My printer won't print
Support Bot: Ensure the printer is powered on and connected to your network.
That didn't work? Let’s try another solution.
You: It’s still not working
Support Bot: Check the ink levels and replace the cartridge if empty.
Issue resolved! Goodbye.
What’s Happening?
- The state persists the conversation_history, enabling context-aware solutions.
- Nodes process the issue, suggest solutions, check resolution, and decide next steps.
- Edges create a flow that loops back if unresolved, up to three attempts.
- The workflow is dynamic, with logging and error handling for robustness.
For more on dynamic flows, see Looping and Branching.
Debugging Common Issues
If the bot encounters issues, try these debugging tips:
- No Response: Verify the OPENAI_API_KEY is set. See Security and API Keys.
- Infinite Loop: Check attempt_count in decide_next to ensure the loop limit is enforced. See Graph Debugging.
- Missing History: Log conversation_history in process_issue to confirm messages are added.
- AI Errors: Ensure the prompt in suggest_solution is clear and handle API errors.
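The loop limit in particular is easy to unit-test in isolation. The routing logic from Step 2 is re-declared here (without the logging) so the snippet is self-contained; a few boundary checks like these catch off-by-one errors before they become infinite loops:

```python
def decide_next(state: dict) -> str:
    """Mirror of the Step 2 routing logic: end when resolved or after three attempts."""
    if state["is_resolved"] or state["attempt_count"] >= 3:
        return "end"
    return "suggest_solution"

# Spot-check the boundary conditions.
assert decide_next({"is_resolved": True, "attempt_count": 1}) == "end"
assert decide_next({"is_resolved": False, "attempt_count": 3}) == "end"
assert decide_next({"is_resolved": False, "attempt_count": 2}) == "suggest_solution"
```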
Enhancing the Support Bot
Extend the bot with LangChain features:
- Tools: Add a database query for printer details with SQL Database Chains or web searches with SerpAPI Integration. See Tool Usage.
- Agents: Enable the bot to choose actions dynamically with Agent Integration.
- Memory: Enhance context with Memory Integration.
- Prompts: Refine solutions with tailored prompts using Prompt Templates.
For example, add a node to fetch real-time troubleshooting tips with Web Research Chain.
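As a minimal sketch of what such a tool node could look like, here is a node that consults a hardcoded knowledge base. The `PRINTER_TIPS` dict and `fetch_tips` function are hypothetical stand-ins; a real version would query a database or web search instead:

```python
# Hypothetical knowledge base; a real tool node would query a database or API.
PRINTER_TIPS = {
    "won't print": "Check that the printer is online and has paper loaded.",
    "paper jam": "Open the rear tray and gently remove any jammed sheets.",
}

def fetch_tips(state: dict) -> dict:
    """Node sketch: attach any matching knowledge-base tips to the state."""
    issue = state["issue"].lower()
    tips = [tip for phrase, tip in PRINTER_TIPS.items() if phrase in issue]
    return {**state, "kb_tips": tips}
```

Wired in with `graph.add_node("fetch_tips", fetch_tips)` and an edge before `suggest_solution`, the tips could then be folded into the prompt.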
To deploy the bot as an API, see Deploying Graphs.
Best Practices for Support Bots
- Focused Nodes: Keep each node single-purpose (e.g., input, solution, checking). See Workflow Design.
- Robust State: Validate state data to avoid errors. Check State Management.
- Clear Logging: Use logging to trace issues. See Graph Debugging.
- Limit Retries: Cap attempts to prevent endless loops. Check Looping and Branching.
- Test Scenarios: Try various issues to ensure context-awareness. See Best Practices.
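As one way to apply the "Robust State" practice, a small guard like this (our own helper, not a LangGraph API) can run at the top of each node to fail fast on malformed state:

```python
REQUIRED_KEYS = {"issue", "solution", "is_resolved",
                 "conversation_history", "attempt_count"}

def validate_state(state: dict) -> None:
    """Raise early if the state is missing keys or the issue is empty."""
    missing = REQUIRED_KEYS - state.keys()
    if missing:
        raise KeyError(f"State missing keys: {sorted(missing)}")
    if not state["issue"].strip():
        raise ValueError("Issue must be a non-empty string")
```

Calling `validate_state(state)` at the start of `process_issue` turns a confusing downstream `KeyError` deep inside a node into an immediate, descriptive failure.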
Conclusion
Building a customer support bot with LangGraph is a fantastic way to harness stateful, graph-based workflows for real-world applications. By structuring the bot with nodes, edges, and a persistent state, you’ve created an AI that listens, suggests solutions, and adapts until the issue is resolved. This example is a stepping stone to more advanced bots with tools, agents, or cloud deployment.
To begin, follow Install and Setup and try this support bot. For more, explore Core Concepts or simpler projects like Simple Chatbot Example. For inspiration, check real-world applications at Best LangGraph Uses. With LangGraph, your support bot is ready to assist and impress!
External Resources: