Building an Intuitive User Interface for CrewAI Applications
In this article, we will guide you through the process of creating an intuitive and visually appealing user interface (UI) for applications built with CrewAI, a sophisticated multi-agent framework. Our goal is to enhance the user experience, ensuring that your CrewAI applications and demos are more engaging and user-friendly. Whether you’re developing for casual users or tech enthusiasts, this step-by-step guide will help you implement a seamless and effective UI.
Enhancing User Experience
In this demonstration, users can input a topic for a writing task, and the UI will show how the orchestrator agent sequentially prompts the writer agent and the reviewer agent, displaying the content generated by each agent along the way.
For this application, we utilize the Streamlit framework to build the web app. Streamlit’s chat widgets effectively simulate a studio environment, facilitating group chats among several LLM-powered agents in an automated manner. The chat session concludes once the final answer is generated by the collective effort of the agents, providing a streamlined and engaging user experience.
Understanding the Function Blocks
Before we explore the system functions and their interconnections, it’s essential to understand the fundamental technologies we will be using: Streamlit and CrewAI.
CrewAI
In the realm of Large Language Model (LLM) applications, multi-agent architectures are increasingly outperforming solo agents. CrewAI distinguishes itself with a practical, production-ready approach, making it a standout choice for developing sophisticated multi-agent systems.
Unlike other multi-agent frameworks, CrewAI emphasizes efficiency and reliability by clearly defining the roles and responsibilities of each AI agent. This ensures that interactions between agents are both streamlined and predictable. One of CrewAI’s standout features is its seamless integration with LangChain, which allows developers to utilize existing tools and resources for effective agent management and development. This combination of clarity, integration, and ease of use makes CrewAI an attractive option for those looking to build robust and user-friendly multi-agent LLM applications.
Streamlit
Streamlit revolutionizes the development of web-based data applications with its intuitive Python framework and a wide array of widgets. One of its most notable recent features is the chat widget, which simplifies the integration of conversational interfaces into applications. This widget allows users to engage with the LLM model through a natural and interactive chat format, making interactions more intuitive and engaging. Streamlit’s user-friendly design and robust chat functionality make it an excellent choice for developers aiming to quickly build and deploy dynamic LLM-powered applications with interactive conversational elements.
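The chat pattern this widget enables is easy to sketch without the framework itself. Below, a plain list stands in for `st.session_state` and `print()` stands in for `st.chat_message()`; this is a dependency-free illustration of the pattern, not Streamlit API code, and the helper name `post` is purely illustrative:

```python
# Minimal sketch of the chat-history pattern used by Streamlit chat apps:
# every event appends a {"role", "content"} dict, and the UI re-renders
# the full history on each rerun. A plain list stands in for
# st.session_state and print() stands in for st.chat_message().
messages = [{"role": "assistant", "content": "What blog post would you like us to write?"}]

def post(role: str, content: str) -> None:
    """Append a message and 'render' it, as the Streamlit app does."""
    messages.append({"role": role, "content": content})
    print(f"[{role}] {content}")

post("user", "A travel blog about Kyoto")
post("Writer", "Draft 1: Kyoto in autumn ...")

# On a rerun, the whole history is replayed in order:
for msg in messages:
    print(f"[{msg['role']}] {msg['content']}")
```

In the real app, the same append-then-render loop runs inside Streamlit's rerun cycle, which is why the history must live in `st.session_state` rather than in a local variable.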
Block Diagram
The core link between Streamlit and CrewAI is a callback mechanism that handles events throughout the stages of agent and task processing within CrewAI's automation. This mechanism lets us use the `chat_message()` method to redirect response outputs directly to the Streamlit interface. By leveraging this callback handler, we can ensure that the dynamic interactions and responses from CrewAI are seamlessly integrated and displayed within the Streamlit-based web application.
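That callback flow can be sketched without LangChain or Streamlit. The class below mirrors the shape of the `on_chain_start` / `on_chain_end` hooks used later in this article, but collects messages into a plain list instead of writing to chat widgets; the class name, the `sink` list, and the fake chain driver are all illustrative stand-ins:

```python
from typing import Any, Dict, List

class SketchHandler:
    """Mimics the shape of a LangChain callback handler: one hook fires
    when a chain starts (the orchestrator's instruction) and one when it
    ends (the worker agent's answer)."""

    def __init__(self, agent_name: str, sink: List[dict]) -> None:
        self.agent_name = agent_name
        self.sink = sink  # stands in for st.session_state.messages

    def on_chain_start(self, serialized: Dict[str, Any], inputs: Dict[str, Any]) -> None:
        # Record the instruction sent to the agent
        self.sink.append({"role": "assistant", "content": inputs["input"]})

    def on_chain_end(self, outputs: Dict[str, Any]) -> None:
        # Record the agent's answer under its own name
        self.sink.append({"role": self.agent_name, "content": outputs["output"]})

def run_fake_chain(handler: SketchHandler, instruction: str) -> None:
    """Drive the two hooks the way a chain execution would."""
    handler.on_chain_start({}, {"input": instruction})
    handler.on_chain_end({"output": f"Done: {instruction}"})

log: List[dict] = []
run_fake_chain(SketchHandler("Writer", log), "Write a post about Kyoto")
# log now holds the orchestrator's instruction followed by the Writer's reply.
```

The real handler, shown later, does exactly this, except the "sink" is Streamlit's session state and each appended message is also rendered with `st.chat_message()`.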
Code Walkthrough
Let’s delve into the code and understand how it integrates CrewAI with Streamlit.
Installation and Import
First, ensure you have the necessary packages installed. Update to the latest CrewAI (≥ 0.22.5) and install Streamlit:

```shell
pip install --upgrade crewai streamlit
```
Next, import the required packages:
```python
import streamlit as st
from crewai import Crew, Process, Agent, Task
from langchain_core.callbacks import BaseCallbackHandler
from typing import Any, Dict
from langchain_openai import ChatOpenAI
```
Initializing CrewAI
- LLM Configuration: Set up the language model with `ChatOpenAI`. With no arguments, it uses the default model and reads your API key from the `OPENAI_API_KEY` environment variable:

```python
llm = ChatOpenAI()
```
- Agent Creation: Create two agents: a writer for generating blog posts and a reviewer for providing feedback.

```python
# Both agents are wired to the custom callback handler defined below,
# which streams their messages to the Streamlit chat.
writer = Agent(
    role='Blog Post Writer',
    backstory='''You are a blog post writer who specializes in creating travel blogs.
You produce one iteration of an article at a time.
You do not provide review comments but are open to feedback and willing to revise the article based on suggestions.
''',
    goal="Write and iterate a quality blog post.",
    llm=llm,
    callbacks=[MyCustomHandler("Writer")],
)

reviewer = Agent(
    role='Blog Post Reviewer',
    backstory='''You are a professional reviewer who excels at enhancing articles.
You provide recommendations for improving articles based on user requests.
You will give feedback only after reading the entire article and will not generate content by yourself.
''',
    goal="Identify areas for improvement in a blog post and provide actionable feedback. Do not comment on summaries or abstracts.",
    llm=llm,
    callbacks=[MyCustomHandler("Reviewer")],
)
```
Each agent is configured with a specific role, backstory, and goal, and is linked to a custom callback handler.
- Custom Callback Handler: In CrewAI, each task is executed within a customized LangChain chain known as the `CrewAgentExecutor` chain. To display the most relevant messages on the Streamlit UI, we focus on capturing the input and output messages of each chain:
  - Input Message: Generated by the orchestrator, which guides and coordinates the worker agents.
  - Output Message: Produced by each worker agent in response to the orchestrator's requests.

The callback handler (`MyCustomHandler`) manages these messages and redirects them to the Streamlit interface.
Implementing the Callback Handler
To control how messages from CrewAI appear in Streamlit, we attach a LangChain callback handler to each agent and handle events at the start and end of each chain. By subclassing `BaseCallbackHandler`, we can capture and display messages as they flow through the system.

Here's how to define the custom callback handler:
```python
avatars = {
    "Writer": "https://cdn-icons-png.flaticon.com/512/320/320336.png",
    "Reviewer": "https://cdn-icons-png.freepik.com/512/9408/9408201.png"
}

class MyCustomHandler(BaseCallbackHandler):

    def __init__(self, agent_name: str) -> None:
        self.agent_name = agent_name

    def on_chain_start(
        self, serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any
    ) -> None:
        """Handle the start of a chain execution."""
        # The orchestrator's instruction, displayed under the "assistant" role
        st.session_state.messages.append({"role": "assistant", "content": inputs['input']})
        st.chat_message("assistant").write(inputs['input'])

    def on_chain_end(self, outputs: Dict[str, Any], **kwargs: Any) -> None:
        """Handle the end of a chain execution."""
        # The agent's answer, displayed under the agent's name and avatar
        st.session_state.messages.append({"role": self.agent_name, "content": outputs['output']})
        st.chat_message(self.agent_name, avatar=avatars[self.agent_name]).write(outputs['output'])
```
In this implementation:
- `on_chain_start` is triggered at the beginning of a chain, appending the input message to the session state and displaying it on the page.
- `on_chain_end` is triggered at the end of a chain, appending the output message to the session state and displaying it on the page with the appropriate agent name and avatar.
Web Page Creation
To create the Streamlit interface and manage message flow, follow these steps:
```python
st.title("💬 CrewAI Writing Studio")

# Initialize message history in session state if it does not exist
if "messages" not in st.session_state:
    st.session_state["messages"] = [{"role": "assistant", "content": "What blog post would you like us to write?"}]

# Display messages from session state
for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])
```
In this setup:
- We set the title of the Streamlit app.
- We initialize the `messages` key in `st.session_state` if it does not already exist, starting with a prompt asking what blog post the user wants to write.
- We loop through `st.session_state.messages` and display each message using `st.chat_message()`, ensuring that the conversation history is preserved and displayed correctly.
Handling User Input and Prompting the Group Chat
To manage user input and trigger the group chat functionality, use the `st.chat_input()` function. Here's how you can integrate it into your Streamlit app:
```python
if prompt := st.chat_input():
    # Add user input to the session state and display it in the chat
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)

    # Define tasks for the writer and reviewer agents
    task1 = Task(
        description=f"""Write a blog post on the topic: {prompt}""",
        agent=writer,
        expected_output="An article under 300 words."
    )
    task2 = Task(
        description="""List review comments for improvement to make the blog post more engaging on social media""",
        agent=reviewer,
        expected_output="Key points about where the article needs improvement."
    )

    # Set up the CrewAI process with the defined tasks and agents
    project_crew = Crew(
        tasks=[task1, task2],
        agents=[writer, reviewer],
        manager_llm=llm,               # the LLM that acts as the orchestrator
        process=Process.hierarchical   # hierarchical management approach
    )

    # Start the CrewAI process
    final = project_crew.kickoff()

    # Display the final result
    result = f"## Here is the Final Result \n\n {final}"
    st.session_state.messages.append({"role": "assistant", "content": result})
    st.chat_message("assistant").write(result)
```
In this implementation:
- When the user provides input, it is appended to the session state and displayed in the chat.
- Two tasks are defined: one for the writer agent to create a blog post and another for the reviewer agent to provide feedback.
- The `Crew` class is used to manage and execute these tasks in a hierarchical process, allowing the orchestrator to manage the sequence autonomously.
- The final result is generated and displayed once the tasks are completed.

Save the complete script (for example, as `app.py`) and launch the web app with `streamlit run app.py`.
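The hierarchical process can be pictured as a small orchestration loop: a manager walks the task list, hands each task to its agent, and feeds the previous output forward as context. The sketch below is a toy illustration of that control flow only, with hard-coded stand-in agents; CrewAI's real manager is itself an LLM that decides delegation dynamically:

```python
from typing import Callable, List, Tuple

# Toy stand-ins for the writer and reviewer agents: each takes the topic
# plus the context produced by the previous task.
def writer(topic: str, context: str) -> str:
    return f"Draft about {topic}"

def reviewer(topic: str, context: str) -> str:
    return f"Feedback on: {context}"

def kickoff(topic: str, tasks: List[Tuple[str, Callable[[str, str], str]]]) -> str:
    """Run tasks in order, threading each output into the next task."""
    context = ""
    for name, agent in tasks:
        context = agent(topic, context)   # each task sees the previous output
        print(f"{name} -> {context}")
    return context                        # the last task's output is the result

final = kickoff("Kyoto", [("Writer", writer), ("Reviewer", reviewer)])
```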
Conclusion & Future Improvements
This demonstration provides a basic implementation of visualizing CrewAI processes with Streamlit. It sets up a structured messaging flow to support multi-agent program development.
To enhance this project further and better align it with CrewAI’s advanced features, consider integrating:
- Human Interaction: Incorporate more interactive elements for user engagement.
- Tool Outputs: Display results from various tools used by agents.
- Retrieval-Augmented Generation (RAG): Incorporate retrieval-based techniques to enhance content generation.
- Multi-Modal Outputs: Support different types of media and content formats.
Stay tuned for more tutorials on these topics to expand on the functionalities and capabilities of CrewAI applications.