How to Build Your Own Panel AI Chatbots

Andrew Huang and Sophia Yang

With its latest version 1.3, the open-source project Panel has just introduced an exciting and highly anticipated new feature: the Chat Interface widget. This new capability has opened up a world of possibilities, making the creation of AI chatbots more accessible and user-friendly than ever before. 

In this post, you’ll learn how to use the ChatInterface widget to build:

  • A basic chatbot
  • An OpenAI ChatGPT-powered AI chatbot 
  • A LangChain-powered AI chatbot 

Before we get started, you will need to install Panel (any version greater than 1.3.0) and other packages you might need like jupyterlab, openai, and langchain.

Now you are ready to go! 

Use the ChatInterface widget

The brand-new ChatInterface widget is a high-level widget that provides a user-friendly chat interface for sending messages, with four built-in operations:

  • Send: Send messages to the chat log
  • Rerun: Resend the most recent user message 
  • Undo: Remove the most recent messages
  • Clear: Clear all chat messages

Curious to know more about how `ChatInterface` works under the hood? It’s a high-level widget that wraps around the middle-level widget `ChatFeed` that manages a list of `ChatMessage` items for displaying chat messages. Check out the docs on ChatInterface, ChatFeed and ChatMessage to learn more. 
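To make that relationship concrete, here is a tiny plain-Python sketch (illustrative only, not Panel’s actual implementation) of a feed that manages a list of message items, with undo and clear operations like the ones listed above:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Message:
    # Conceptual stand-in for ChatMessage: who said what
    user: str
    contents: str


@dataclass
class Feed:
    # Conceptual stand-in for ChatFeed: manages a list of messages
    messages: List[Message] = field(default_factory=list)

    def send(self, contents: str, user: str = "User") -> Message:
        msg = Message(user=user, contents=contents)
        self.messages.append(msg)
        return msg

    def undo(self, count: int = 1) -> None:
        # Remove the most recent `count` messages, like the Undo button
        del self.messages[-count:]

    def clear(self) -> None:
        # Drop the whole chat log, like the Clear button
        self.messages.clear()


feed = Feed()
feed.send("Hello!")
feed.send("Hi there!", user="System")
```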

1. Build a basic chatbot

With `pn.chat.ChatInterface`, we can send messages to the chat interface, but how should the system reply? We can define a `callback` function! 

In this example, our `callback` function simply echoes back a user message. See how it’s becoming more functional already?

How to use the ChatInterface widget to echo back a message:

import panel as pn

pn.extension()


def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
    message = f"Echoing {user}: {contents}"
    return message


chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="System")
chat_interface.send("Send a message to receive an echo!", user="System", respond=False)
chat_interface.servable()

To serve the app, run `panel serve app.py` or `panel serve app.ipynb`. 

2. Build a ChatGPT-powered AI chatbot 

How can we use OpenAI ChatGPT to reply to messages? We can simply call the OpenAI API in the `callback` function. 

Install `openai` in your environment and add your OpenAI API key to the script. Note that in this example we added `async` to the function to enable cooperative multitasking within a single thread, letting I/O-bound work like waiting for OpenAI API responses happen in the background. This keeps the app responsive while it waits for the API to reply.
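To see why streaming plus `async` matters, here is a self-contained sketch of the same pattern, with a hypothetical `fake_stream` helper standing in for the OpenAI response stream: the callback yields a progressively longer message, and every `await` and `yield` is a point where the event loop is free to do other work.

```python
import asyncio


async def fake_stream(text: str):
    # Hypothetical stand-in for a streaming API response: yields one
    # token at a time, awaiting between tokens as a network call would
    for token in text.split():
        await asyncio.sleep(0)
        yield token + " "


async def callback(contents: str):
    # Accumulate chunks and yield the partial message each time;
    # the chat interface re-renders whatever the callback yields
    message = ""
    async for chunk in fake_stream(contents):
        message += chunk
        yield message


async def collect():
    # Gather every partial update the callback would have rendered
    return [partial async for partial in callback("streaming keeps the UI responsive")]


updates = asyncio.run(collect())
print(updates[-1])  # the fully accumulated message
```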

How to use the ChatInterface widget to create a chatbot using OpenAI’s GPT-3.5 API:

import openai
import panel as pn

pn.extension()

openai.api_key = "Add your key here"


async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
    # Request a streaming response so the reply renders incrementally
    response = await openai.ChatCompletion.acreate(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": contents}],
        stream=True,
    )
    message = ""
    async for chunk in response:
        message += chunk["choices"][0]["delta"].get("content", "")
        yield message


chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="ChatGPT")
chat_interface.send(
    "Send a message to get a reply from ChatGPT!", user="System", respond=False
)
chat_interface.servable()

3. Build a LangChain-powered AI chatbot 

The Panel ChatInterface also seamlessly integrates with LangChain, leveraging the full spectrum of LangChain’s capabilities. 

Here is an example of how we use LangChain’s `ConversationChain` with the `ConversationBufferMemory` to store messages and pass previous messages to the OpenAI API. 
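Conceptually, buffer memory just accumulates the dialogue and prepends it to each new prompt, so the model sees earlier turns. A minimal plain-Python sketch of the idea (illustrative only, not LangChain’s implementation; `predict` and its stand-in reply are hypothetical):

```python
class BufferMemory:
    """Toy version of conversation buffer memory: store every turn verbatim."""

    def __init__(self):
        self.history = []

    def save_context(self, human: str, ai: str) -> None:
        # Record one full exchange (user turn + model turn)
        self.history.append(f"Human: {human}")
        self.history.append(f"AI: {ai}")

    def load(self) -> str:
        # Render the stored turns as one prompt prefix
        return "\n".join(self.history)


def predict(memory: BufferMemory, user_input: str) -> str:
    # Build the prompt from prior turns plus the new input, call the
    # model (a stand-in reply here), and save the exchange back to memory
    prompt = f"{memory.load()}\nHuman: {user_input}\nAI:"
    reply = f"(model reply to: {user_input})"
    memory.save_context(user_input, reply)
    return reply


memory = BufferMemory()
predict(memory, "My name is Sophia.")
predict(memory, "What is my name?")
```

Because `memory.load()` is folded into every subsequent prompt, the second call’s prompt still contains “My name is Sophia.”, which is exactly how `ConversationChain` lets the model answer questions about earlier turns.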

Again, make sure to install `langchain` in your environment and add your OpenAI API key to the script. 
Note that before we dive into the LangChain code, we define a `callback_handler`, because the LangChain interface does not have a way to stream from generators; instead, we wrap it with `pn.chat.langchain.PanelCallbackHandler`, which inherits from `langchain.callbacks.base.BaseCallbackHandler`. For more information, check out the documentation.

How to use the ChatInterface widget to create a chatbot using OpenAI’s GPT-3.5 API with LangChain:

import os

import panel as pn
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

pn.extension()

os.environ["OPENAI_API_KEY"] = "Type your API key here"


async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
    # The PanelCallbackHandler streams tokens to the chat interface,
    # so the callback itself does not need to yield anything
    await chain.apredict(input=contents)


chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="ChatGPT")
callback_handler = pn.chat.langchain.PanelCallbackHandler(chat_interface)
llm = ChatOpenAI(streaming=True, callbacks=[callback_handler])
memory = ConversationBufferMemory()
chain = ConversationChain(llm=llm, memory=memory)

chat_interface.send(
    "Send a message to get a reply from ChatGPT!", user="System", respond=False
)
chat_interface.servable()


In this blog post, we’ve taken an in-depth look at the exciting new ChatInterface widget in Panel. We started by guiding you through building a basic chatbot using `pn.chat.ChatInterface`. We elevated your chatbot’s capabilities from there by seamlessly integrating OpenAI ChatGPT. To further enhance your understanding, we also explored the integration of LangChain with Panel’s ChatInterface. If you’re eager to explore more chatbot examples, don’t hesitate to visit this GitHub repository and consider contributing your own.

Armed with these insights and hands-on examples, you’re now well-prepared to embark on your journey of crafting AI chatbots in Panel. Happy coding!

All of these tools are open source and free for everyone to use, but if you’d like to get started with help from Anaconda’s AI and Python app experts, contact a representative from our sales team at [email protected].

Talk to an Expert

Talk to one of our experts to find solutions for your AI journey.