Customizable local AI chat: A guide to OpenWebUI Installation and Setup

This post is an introduction and setup guide for installing Open WebUI. It supports Ollama, so you can have your own ChatGPT-style chat interface running locally.

DOCKER | BLOG

Luminari

7/16/2024 · 1 min read

As technology continues to evolve, it's essential for developers to stay ahead of the curve and adapt to the changing landscape. One way to do this is by leveraging innovative tools that simplify the process of building web-based applications. Enter Open WebUI, a groundbreaking framework designed to revolutionize the way we create responsive web-based applications.

What is Open WebUI?

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.

If you're running Ollama, it listens on port 11434 on your local machine. So if you want to have a conversation with any of your LLM models, you can query it from the terminal by sending requests to port 11434. Open WebUI makes that process simple: it bridges that gap and lets you have a conversation with your Ollama models through a web interface.
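
For example, assuming Ollama is running locally and you have already pulled a model (the model name "llama3" below is just a placeholder), you can talk to it straight from the terminal:

    # Ask the local Ollama API for a one-off completion on port 11434
    curl http://localhost:11434/api/generate -d '{
      "model": "llama3",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'

Open WebUI simply puts a friendly chat interface on top of this kind of request.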

If you are wondering what Ollama is, check out my post about Ollama.

How to install?

There are multiple ways to install Open WebUI. Let's check them one by one.

  • If you want to install only Open WebUI (you can point it at an LLM backend afterwards), use the Docker command below.
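
A minimal sketch, based on the image the project publishes on GitHub Container Registry (double-check the current command in the official docs). Port 3000 on the host is an arbitrary choice, and the named volume keeps your chats and settings across container restarts:

    # Open WebUI only; persist data in the "open-webui" Docker volume
    docker run -d -p 3000:8080 \
      -v open-webui:/app/backend/data \
      --name open-webui \
      --restart always \
      ghcr.io/open-webui/open-webui:main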

  • If you want to install Open WebUI and you already have Ollama running on your system, use the Docker command below.
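
Assuming Ollama is already listening on port 11434 on the host (as described above), the extra --add-host flag lets the container reach it via host.docker.internal:

    # Open WebUI connecting to an existing Ollama instance on the host
    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      --restart always \
      ghcr.io/open-webui/open-webui:main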

  • If you want Open WebUI bundled with Ollama, with GPU support, use the Docker command below.
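
The project also publishes a bundled image (the :ollama tag) that ships Ollama inside the same container. Assuming you have an NVIDIA GPU and the NVIDIA Container Toolkit installed, the command looks roughly like this:

    # Bundled Open WebUI + Ollama with GPU acceleration
    docker run -d -p 3000:8080 \
      --gpus=all \
      -v ollama:/root/.ollama \
      -v open-webui:/app/backend/data \
      --name open-webui \
      --restart always \
      ghcr.io/open-webui/open-webui:ollama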

  • If you want Open WebUI bundled with Ollama, running on CPU only, use the Docker command below.
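
For a CPU-only machine, the same bundled image works; just drop the GPU flag:

    # Bundled Open WebUI + Ollama, CPU only
    docker run -d -p 3000:8080 \
      -v ollama:/root/.ollama \
      -v open-webui:/app/backend/data \
      --name open-webui \
      --restart always \
      ghcr.io/open-webui/open-webui:ollama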

  • If you want to deploy Open WebUI together with Ollama on Kubernetes using Helm, use the commands below.
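
For Kubernetes there is a Helm chart. A rough sketch is below; the repository URL and chart name here are what the project documented at the time of writing, so verify them against the official docs, and check the chart's values if you want it to deploy Ollama alongside Open WebUI:

    # Add the Open WebUI Helm repository and install the chart
    helm repo add open-webui https://helm.openwebui.com/
    helm repo update
    helm install open-webui open-webui/open-webui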

After installation, you should be able to see the Open WebUI interface in your browser (at http://localhost:3000 if you used the port mapping from the commands above). I will leave the exploration of the UI to you.

Is this everything? Obviously "No, this is just the tip of the iceberg." Open WebUI is bundled with more features than can be explained in detail in one blog post. Check out the official documentation: https://docs.openwebui.com