February 13, 2026
This blog post has also been published (with some minor changes in the introduction done by the marketing team) on the deepsense.ai blog.
Ragbits is an open-source framework for LLM-based applications. It was developed internally at deepsense.ai and is used in commercial projects. Based on our experience building production-ready applications, we keep adding features that make developing similar projects easier.
In this blog post, we will write an application using Ragbits and extend it, one feature at a time, with the new capabilities introduced in Ragbits 1.4. The full code can be found on GitHub:
```python
from ragbits.chat.api import RagbitsAPI, ChatInterface
from ragbits.chat.interface.types import ChatContext, TextContent, TextResponse
from ragbits.core.prompt import ChatFormat
from ragbits.core.llms import LiteLLM


class SimpleStreamingChat(ChatInterface):
    def __init__(self):
        self.llm = LiteLLM(model_name="gpt-5.2")

    async def chat(self, message: str, history: ChatFormat, context: ChatContext):
        conversation_history = [
            {"role": "system", "content": "Answer everything relatively shortly"},
            *history,
            {"role": "user", "content": message},
        ]
        result = self.llm.generate_streaming(conversation_history)
        async for event in result:
            yield self.create_text_response(event)


api = RagbitsAPI(
    SimpleStreamingChat,
    cors_origins=["http://localhost:8000", "http://127.0.0.1:8000"],
)
app = api.app
```

Here is how the app looks out of the box:

To run it, make sure you have installed the new version of Ragbits:

```shell
pip install ragbits-core==1.4.1 ragbits-chat==1.4.1
```

or

```shell
uv add ragbits-core==1.4.1 ragbits-chat==1.4.1
```

Also make sure you have set the appropriate environment variables. You will need OPENAI_API_KEY (see env.template in the repository).
Release 1.4 introduces a number of changes to the UI that are often required when developing with LLMs. These include:
OAuth 2.0 is a standard protocol for delegated authorization: it allows an application to access another system's resources with the user's approval. In particular, it can be used to authenticate a user via third-party services. Developers who want to implement it in their application need to be familiar with the flow of requests and redirects defined in the standard:
- redirecting the user to the provider's authorization endpoint (e.g. https://accounts.google.com/o/oauth2/v2/auth)
- a callback endpoint receiving an authorization code

To make this work, the developer also needs to manage things like the state parameter (sent to the authorization server and back), storing the states, verifying them, sending the appropriate redirect_uri, and so on.
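To give a feel for the bookkeeping involved, here is a generic sketch of the first half of that flow — building the authorization URL with a fresh state and verifying it on callback. This is plain illustrative Python following the OAuth 2.0 spec, not Ragbits internals:

```python
import secrets
from urllib.parse import urlencode

# In-memory record of states we issued; a real app would persist these per session.
issued_states: set[str] = set()


def build_authorize_url(client_id: str, redirect_uri: str) -> str:
    """Build the URL we redirect the user to, including a fresh anti-CSRF state."""
    state = secrets.token_urlsafe(16)
    issued_states.add(state)
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",  # ask the provider for an authorization code
        "scope": "openid email profile",
        "state": state,
    }
    return "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)


def handle_callback(state: str, code: str) -> str:
    """Verify the state echoed back by the provider before using the code."""
    if state not in issued_states:
        raise ValueError("Unknown state - possible CSRF")
    issued_states.discard(state)
    # Next step (omitted here): POST the code to the token endpoint for tokens.
    return code
```

All of this (plus the token exchange and session management) is exactly the boilerplate that Ragbits takes off your hands.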
Ragbits automates all of that. To perform authentication with Google, all you need is to specify the Google authentication backend defined in Ragbits:
```python
from ragbits.chat.auth.session_store import InMemorySessionStore
from ragbits.chat.auth.oauth2_providers import OAuth2Providers
from ragbits.chat.auth.backends import OAuth2AuthenticationBackend

auth_backend = OAuth2AuthenticationBackend(
    session_store=InMemorySessionStore(),
    provider=OAuth2Providers.GOOGLE,
    redirect_uri="http://127.0.0.1:8000/api/auth/callback/google",
)
```

We also need to make sure to define the GOOGLE_CLIENT_ID and GOOGLE_CLIENT_SECRET environment variables.
Then, we just pass that auth_backend to the RagbitsAPI:
```python
api = RagbitsAPI(
    SimpleStreamingChat,
    cors_origins=["http://localhost:8000", "http://127.0.0.1:8000"],
    auth_backend=auth_backend,
)
```

That's it! You can run the app with RAGBITS_BASE_URL=http://127.0.0.1:8000 uvicorn main:app.
If you want, you can also replace that backend with any other authentication backend. In particular, adding other OAuth 2.0 providers is as simple as specifying a few required parameters:
```python
GOOGLE = OAuth2Provider(
    name="google",
    display_name="Google",
    authorize_url="https://accounts.google.com/o/oauth2/v2/auth",
    token_url="https://oauth2.googleapis.com/token",
    user_info_url="https://www.googleapis.com/oauth2/v2/userinfo",
    scopes=["openid", "email", "profile"],
    user_factory=lambda user_data: User(
        user_id=f"google_{user_data['id']}",
        username=str(user_data.get("email", "")).split("@")[0],
        email=str(user_data["email"]) if user_data.get("email") else None,
        full_name=str(user_data["name"]) if user_data.get("name") else None,
        roles=["user"],
        metadata={
            "provider": "google",
            "picture": user_data.get("picture"),
            "verified_email": user_data.get("verified_email"),
            "hd": user_data.get("hd"),  # Google Workspace domain
        },
    ),
)
```

Here is how the final app looks:

One of the common use cases for custom LLM solutions is working with internal data that is not available to the LLM. This is why RAG solutions are so popular. The Ragbits UI allows the user to upload their own files, which can be handled in the backend using an upload_handler. Below we demonstrate how to implement it.
The idea is to allow the user to upload files from their computer. We will give the LLM access to them in a system prompt. For the purposes of the demo, the files will be stored in a Python dictionary. In a production setting, more advanced solutions can be used, including the ones available via the ragbits-document-search package. For now, let's focus on simplicity.
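As a hint of what "more advanced" might mean, a toy step up from dumping everything into the prompt is to select only the files that share words with the user's question. The function below is purely illustrative (it is not part of Ragbits or ragbits-document-search, which do real retrieval):

```python
def select_relevant_files(files: dict[str, str], question: str, top_k: int = 2) -> list[str]:
    """Rank files by naive word overlap with the question; return the top_k titles."""
    q_words = set(question.lower().split())
    scored = [
        (len(q_words & set(content.lower().split())), title)
        for title, content in files.items()
    ]
    scored.sort(reverse=True)
    # Keep only files that matched at least one word.
    return [title for score, title in scored[:top_k] if score > 0]
```

With such a helper, the chat method could pass only the selected files to the system prompt instead of all of them. In our demo, though, we keep every file in the prompt.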
In the __init__ method, we add a self.files attribute:
```python
def __init__(self):
    self.llm = LiteLLM(model_name="gpt-5.2")
    self.files = {}
```

The upload handler will simply add the content of the file to the dictionary.
```python
async def upload_handler(self, file: UploadFile) -> None:
    content = await file.read()
    filename = file.filename
    self.files[filename] = content
```

Finally, the chat method will include the content of all the available files in the system prompt:
```python
files_content = "\n\n".join(
    f"Title: {title}\n\nContent: {content}" for title, content in self.files.items()
)
conversation_history = [
    {"role": "system", "content": "Answer everything shortly"},
    {"role": "system", "content": f"Available files:\n\n{files_content}"},
    *history,
    {"role": "user", "content": message},
]
```

The result is as follows:

In the Ragbits architecture, the ChatInterface is responsible for communicating with the Ragbits UI. The API sends a stream of Server-Sent Events that the UI needs to handle; self.create_text_response produces one such event, of type "text". Ragbits allows sending multiple types of such events, and since 1.4, error is one of them.
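As a rough illustration of what such a stream looks like on the wire, a client splits the stream into events and parses each `data:` line as JSON. The payload shape below is an assumption for illustration, not the documented Ragbits wire format:

```python
import json


def parse_sse_events(raw: str) -> list[dict]:
    """Split a raw Server-Sent Events stream into parsed JSON payloads."""
    events = []
    # Events in an SSE stream are separated by a blank line.
    for block in raw.strip().split("\n\n"):
        for line in block.splitlines():
            if line.startswith("data:"):
                events.append(json.loads(line[len("data:"):].strip()))
    return events
```

A "text" event stream might then carry chunks like `{"type": "text", "content": {"text": "Hel"}}`, which the UI concatenates as they arrive.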
Let's make the chat reject users who swear and return an error on such messages.
```python
from ragbits.chat import ErrorResponse, ErrorContent


async def chat(self, message: str, history: ChatFormat, context: ChatContext):
    if "freak" in message.lower():
        yield ErrorResponse(content=ErrorContent(message="Message rejected. Be nice!"))
        return
```
The idea above of handling an arbitrary event can be extended to custom responses that we define ourselves. This can be very useful if our backend needs to send commands to the UI (or any other client). We won't cover implementing custom UI handlers in this post, but we will show how to send custom commands from the backend.
```python
from ragbits.chat import ChatResponse, ResponseContent


class CustomContent(ResponseContent):
    name: str
    age: int

    def get_type(self) -> str:
        return "custom"


class CustomResponse(ChatResponse[CustomContent]):
    """Custom response that the RagbitsAPI can stream"""
```

To use them, simply yield them in the chat method:

```python
yield CustomResponse(content=CustomContent(name="John", age=30))
```

By default, Ragbits looks modern and elegant from the start. However, you can customize it to your needs without having to modify the UI code.
Release 1.4 introduced the possibility of using custom themes. You can choose any of the HeroUI themes or write your own according to the HeroUI documentation. Here, we will use the "coffee" theme. We download it and save it as theme.json. To use it, simply pass the theme_path parameter to the RagbitsAPI:
```python
api = RagbitsAPI(
    SimpleStreamingChat,
    cors_origins=["http://localhost:8000", "http://127.0.0.1:8000"],
    auth_backend=auth_backend,
    theme_path="theme.json",
)
```

When testing out multiple themes, remember to clear the cookies in your browser.
Additionally, you can override the default titles and texts in the app with custom messages. That includes things like the header, the page title, the welcome message, the favicon, the icon, and the logo.
```python
from ragbits.chat.interface.ui_customization import UICustomization, HeaderCustomization, PageMetaCustomization


class SimpleStreamingChat(ChatInterface):
    ui_customization = UICustomization(
        header=HeaderCustomization(title="My Custom Ragbits App", subtitle="demo for release 1.4"),
        welcome_message="Ask any question about **React component libraries**",
        meta=PageMetaCustomization(page_title="Demo"),
    )
```

Here is the final effect:

To have full control over the UI, you need to clone Ragbits' TypeScript code, but for basic customization, the existing options work quite well.
We hope this post will be useful to you in developing LLM-based applications! More Ragbits releases are coming up, with the next one focusing on agentic features.
The roadmap is as follows:
To see the latest developments in Ragbits, check out github.com/deepsense-ai/ragbits. Feel free to reach out and propose new features.