Building a Reliable Multi-Agent Customer Support Chatbot for Banking Operations Using LangGraph and CrewAI, Part 1

Pandurang Zambare
3 min read · Sep 23, 2024


Today, LLMs have amazed us with their enormous natural language processing and generation capabilities. Everyone is trying to use them to automate business processes and workflows in one way or another. Among the most significant contributors to developing LLM applications are multi-agent LLM frameworks like AutoGen, LangChain, LangGraph, MetaGPT, AutoGPT and CrewAI, which help orchestrate agents to get bigger jobs done.

We will be using LangChain, LangGraph and CrewAI. There are hundreds of articles explaining how to use these frameworks, define agents and lay out the workflow. However, when it comes to an actual implementation with an open-source LLM and integrating those agents, developers come across many challenges, particularly around human-in-the-loop handling, interaction among agents, and memory and chat history management. We will use the open-source Llama 3 model served through Groq (via LangChain's ChatGroq) to keep development and learning costs to a minimum. With LangChain, this model or model provider can be swapped out to your liking without much overhead.

This article is my effort to address those challenges and to show how LangGraph and CrewAI agents can be integrated to get a bigger, more complex job done.

Use Case

We will develop a bank customer support workflow that will handle:

1) General user queries related to bank services or other information about the bank or its affiliates.

2) Users can enquire about their last loan application or apply for a new loan.

Framework

  1. We will use the Groq API with the open-source model “llama3-8b-8192”.
  2. We will also use the FAISS vector library by Meta for index creation and similarity search, storing the index locally. A minimal setup sketch follows this list.
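
Here is a minimal sketch of that setup, assuming a local text file of bank information and a Hugging Face sentence-transformer for embeddings; the file path and embedding model are my assumptions, not the article's exact code.

```python
from langchain_groq import ChatGroq
from langchain_community.document_loaders import TextLoader
from langchain_community.vectorstores import FAISS
from langchain_huggingface import HuggingFaceEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

# Llama 3 8B served through the Groq API (requires GROQ_API_KEY in the environment)
llm = ChatGroq(model="llama3-8b-8192", temperature=0)

# Build a FAISS index from a local bank-information file and persist it to disk
docs = TextLoader("bank_docs/services.txt").load()  # assumed sample file
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
vector_store = FAISS.from_documents(chunks, embeddings)
vector_store.save_local("faiss_bank_index")  # stored locally, as noted above
```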

Architecture

Our chatbot system comprises four CrewAI crews/nodes (for now), an LLM and a vector store (FAISS).

For general enquiries about bank services and products we use RAG, with access to PDF/txt files containing information related to the bank.

For loan applications we use an agentic LLM approach.

In its simplest form, the overall architecture is as below:

Crew/Node Roles

  1. Router Crew: This CrewAI crew is the main supervisor agent. It handles all incoming messages from the user and routes each message to the sub-agent best suited to handle it (see the sketch after this list).
  2. AI Assistant Crew: This is the RAG-based customer support crew. It has access to the bank's services, locations, branches, knowledge base, etc., and can handle user questions related to the bank.
  3. Authenticator Crew: This crew handles user authentication. Before users can enquire about a previous loan application or apply for a new loan, they need to be authenticated with their mobile number and an OTP.
  4. Loan Manager Crew: This crew handles status enquiries for previous loan applications and the handling of new loan applications. It can take over the flow only after the user is authenticated.
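
As an illustration only (the actual Router and Assistant crew design is covered in Part 2), here is a hedged sketch of how one of these crews could be defined with CrewAI and exposed as a LangGraph node. The prompts, the "groq/llama3-8b-8192" model string and the router_node helper are assumptions, not the article's code.

```python
from crewai import Agent, Task, Crew

# Illustrative Router crew: classify the message and pick the next crew
router_agent = Agent(
    role="Support Router",
    goal="Classify the user's message and pick the crew best suited to handle it",
    backstory="You triage incoming messages for a bank's customer support desk.",
    llm="groq/llama3-8b-8192",  # assumption: CrewAI resolves this model string via LiteLLM
)

router_task = Task(
    description=(
        "Classify this message into exactly one of: assistant, authenticator, "
        "loan_manager.\nMessage: {question}"
    ),
    expected_output="A single word: assistant, authenticator or loan_manager",
    agent=router_agent,
)

router_crew = Crew(agents=[router_agent], tasks=[router_task])

def router_node(state: dict) -> dict:
    """LangGraph node that runs the Router crew on the current user message."""
    result = router_crew.kickoff(inputs={"question": state["question"]})
    return {"route": str(result).strip().lower()}
```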

In this post series, we will discuss the main components and how they interact with one another. The RAG agent and its implementation are out of scope for this article and will be covered in a separate one. The top-level Python code is as below:

Lines 27–30: We create the LangGraph state by extending “MessagesState”, which holds the list of all past messages, with:

user_info — the authenticated user's information object.

route — the crew currently working on the user's message.

question — the current user message, shared with the CrewAI agents.
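
For reference, a minimal sketch of such a state could look like the following; the class name ChatState and the field types are my assumptions.

```python
from langgraph.graph import MessagesState

class ChatState(MessagesState):
    """Graph state: MessagesState already carries the running list of messages."""
    user_info: dict   # authenticated user's information object
    route: str        # crew currently handling the user's message
    question: str     # current user message, shared with the CrewAI crews
```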

Lines 32–86: Here we create the LangGraph workflow using the CrewAI crews as nodes and define the edges between them. Chat history will be saved by using a “checkpointer” when we compile the workflow.
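
A hedged sketch of what that wiring could look like, assuming node functions such as assistant_node, authenticator_node and loan_manager_node are defined like router_node above; the node names, routing logic and in-memory checkpointer are assumptions, not the article's exact code.

```python
from langgraph.graph import StateGraph, START, END
from langgraph.checkpoint.memory import MemorySaver

workflow = StateGraph(ChatState)
workflow.add_node("router", router_node)                # Router crew (supervisor)
workflow.add_node("assistant", assistant_node)          # RAG-based AI Assistant crew
workflow.add_node("authenticator", authenticator_node)  # mobile + OTP authentication
workflow.add_node("loan_manager", loan_manager_node)    # loan status / new applications

workflow.add_edge(START, "router")
# The "route" field written by the router decides which crew runs next
workflow.add_conditional_edges(
    "router",
    lambda state: state["route"],
    {
        "assistant": "assistant",
        "authenticator": "authenticator",
        "loan_manager": "loan_manager",
    },
)
workflow.add_edge("assistant", END)
workflow.add_edge("authenticator", "loan_manager")  # loan flow only after authentication
workflow.add_edge("loan_manager", END)

# The checkpointer persists chat history per conversation thread
graph = workflow.compile(checkpointer=MemorySaver())
```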

Line 88 to the end: We test the graph flow by streaming user messages through it.
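
A short sketch of such a test, assuming the compiled graph from the previous sketch; the thread_id and the sample message are illustrative.

```python
from langchain_core.messages import HumanMessage

question = "What are your home loan interest rates?"
# thread_id ties the checkpointer's saved chat history to this conversation
config = {"configurable": {"thread_id": "user-1234"}}

for event in graph.stream(
    {"messages": [HumanMessage(content=question)], "question": question},
    config,
    stream_mode="values",
):
    event["messages"][-1].pretty_print()  # show the latest message after each step
```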

Stay tuned for Part 2 of this series. We will cover Router and Assistant crew design and related code.
