ChatGPT Alternative: Part 3 - Open Source Large Language Model (LLM) Service Interface

A guide on setting up the OpenWebUI tool as an alternative interface for multi-modal Large Language Model (LLM) usage. The final part of a three-part series for a working, remotely accessible ChatGPT alternative.
devops
llm
cloud
ai
containers
https
web
Author
Affiliation

Ashraf Miah

Published

May 17, 2025

Modified

September 3, 2025

The OpenWebUI and Open Router logo on the backdrop of a green mountain and low clouds.

This guide is the conclusion of a three-part series to create a working ChatGPT alternative. The series has covered:

Animation of Prompt and Response

The purpose of this guide is the installation and setup of OpenWebUI [1], using Podman [2] as the container orchestrator and Traefik [3] to provide secure web access to the user interface. Rather than run models on the server itself (a common use case of OpenWebUI), the guide shows how to use external API providers like Open Router [4] for convenient access to a range of LLM services.

Curio Data Pro

Curio Data Pro Ltd

This guide was developed by Ashraf Miah, the Curious Data Explorer and a Founder of Curio Data Pro Ltd. We help businesses implement Data-Driven Decisions by bringing clarity with Systems Thinking, combining modern DevOps and Systems Integration to deliver end-to-end data workflows. We have a background in both Engineering and Data Science, with over 20 years of experience across a range of sectors including Aerospace, Defence, Rail and Energy.

What is a Large Language Model (LLM) Service?

A Large Language Model (LLM) is a deep learning model with a large (massive!) number of parameters, trained on vast amounts of text data, enabling it not only to generate human-quality text, but also to extract entities, summarise, and semantically search and match. In practice, services such as OpenAI’s ChatGPT or Claude from Anthropic do not provide direct access to these models, but instead build a service around them that seeks to improve the User Experience. For example, LLMs cannot directly consume Microsoft Word documents or the Portable Document Format (PDF); as such, LLM Services offer convenience functions that convert these into a format the LLMs can consume. Likewise, the responses from the LLM can be formatted into a more readable form, e.g. generated code will have syntax highlighting applied, or citations will be formatted as footnotes or other more discrete links. Images, for Visual Language Models, can be automatically resized or converted to support the models directly.

Why Use OpenWebUI?

Whilst LLM Services such as Claude and ChatGPT offer slick, featureful interfaces for code development, web design, in-depth exploration of a document or deep research on a topic, the user is ultimately restricted to a single service provider, albeit with access to the majority of that provider’s models. OpenWebUI offers a similar interface with similar features, but with the ability to use an unlimited number of models from any provider that supports the OpenAI API specification. Whilst this isn’t offered natively by rival providers, it is common amongst a range of broker services, from the self-hosted LiteLLM [5] to a hosted broker like OpenRouter.

The second key advantage is that the interface gives the user complete control of the “memory” of a chat, which can therefore be edited or manipulated to explore edge cases or to provide better context for follow-up questions. The third is that multiple models can be used in the same chat, providing a simple means of validating the output of one model against that of another. Finally, a recent development is the concept of a Model Arena, where the same prompt is provided to multiple models and the user selects the best response. Services such as Chatbot Arena [6] are used by some providers to capture feedback from real-world usage. However, the prompts and responses are publicly available and as such not safe for private data. OpenWebUI offers the same feature without the public disclosure.

Installing OpenWebUI

Follow the Quick Start guide, which runs the application as a Docker container via Docker Compose. This requires the user to have a GitHub Personal Access Token as per the Working with the Container registry [7] documentation. breadNet [8] has more detailed documentation on how to authenticate Podman to the GitHub Container Registry (GHCR).

# Set GitHub Personal Access Token (PAT)
export GHCR_TOKEN=<your_personal_access_token>

# Login to the remote registry
echo $GHCR_TOKEN | podman login ghcr.io -u miah0x41 --password-stdin

Login Succeeded!

# Pull test image
podman pull ghcr.io/open-webui/open-web-ui:main

Trying to pull ghcr.io/open-webui/open-web-ui:main...
Error: initializing source docker://ghcr.io/open-webui/open-web-ui:main: reading manifest main in ghcr.io/open-webui/open-web-ui: manifest unknown

Note that pulling the image from the GitHub Container Registry (GHCR) fails because the image name was mistyped (open-web-ui instead of open-webui), so the manifest cannot be found. Correct the name, or obtain a specific tag directly from the GitHub repository under Packages.
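Rather than guessing, the published tags can be listed straight from the registry. A sketch assuming skopeo and jq are available; the sample JSON below mirrors the shape of skopeo's output and is illustrative only:

```shell
# List the tags published for the image (requires network access):
# skopeo list-tags docker://ghcr.io/open-webui/open-webui
# skopeo returns a JSON document which jq can filter; a sample document is used here:
echo '{"Repository": "ghcr.io/open-webui/open-webui", "Tags": ["main", "v0.6.5"]}' \
  | jq -r '.Tags[]'
```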

# Pull image
podman pull ghcr.io/open-webui/open-webui:main

Trying to pull ghcr.io/open-webui/open-webui:main...
Getting image source signatures
Copying blob 4f4fb700ef54 skipped: already exists
Copying blob 9c538fc35491 done   |
Copying blob 2a47a8c4fd5c done   |
Copying blob 47bbb0afa7fe done   |
Copying blob 8a628cdd7ccc done   |
Copying blob 782acb99e453 done   |
Copying blob b7915144f9c7 done   |
Copying blob 633be3c6bab0 done   |
Copying blob 4f4fb700ef54 skipped: already exists
Copying blob 266a80d83771 done   |
Copying blob 7b4a3fa111d1 done   |
Copying blob 693caf783e3a done   |
Copying blob e549cfb1e9e7 done   |
Copying blob 5beb63436aec done   |
Copying blob c1b8d4819be2 done   |
Copying blob fc709d98d8b0 done   |
Copying config 7d52a2d8c2 done   |
Writing manifest to image destination
7d52a2d8c23f4e0a4fb5e3ec51449f1e64a32098fafe9e1c9fc415216927af2e

Prepare the local drive to host the volume and manage secrets via a local environment file:

# Create a folder for the volume
mkdir -p ~/app/openwebui

# Navigate to the folder for hosting the compose file
mkdir -p ~/openwebui && cd ~/openwebui

# Create an empty environment file
touch ~/openwebui/.env
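The guide later disables the bundled Ollama API via this environment file; a minimal sketch of its contents (the value shown is the only variable this guide relies on, anything else is per-deployment):

```shell
# Populate the environment file (contents are an example, not secrets)
mkdir -p ~/openwebui
cat > ~/openwebui/.env <<'EOF'
# Disable the bundled Ollama API; external providers are used instead
ENABLE_OLLAMA_API=False
EOF
```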

Run the following docker-compose file in a dedicated folder (e.g. openwebui):

docker-compose.yaml
# OpenWebUI Docker Compose file
# https://docs.openwebui.com/getting-started/quick-start/
version: '3'

services:
  openwebui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: openwebui
    hostname: chatty-cheetah.curiodata.pro
    ports:
1      - "3000:8080"
    volumes:
2      - open-web-ui:/home/charlie/app/openwebui
    env_file:
3      - .env
    labels:
      - traefik.enable=true
      - traefik.http.routers.openwebui-secure.entrypoints=https
4      - traefik.http.routers.openwebui-secure.rule=Host(`chatty-cheetah.curiodata.pro`)
      - traefik.http.routers.openwebui-secure.tls=true
      - traefik.http.routers.openwebui-secure.tls.certresolver=lets-encrypt
5      - traefik.http.services.openwebui.loadbalancer.server.port=8080

volumes:
  open-web-ui:
1
This maps the host port (3000) to the container’s internal port (8080); however, it is redundant here as Traefik routes directly to the internal port.
2
This mounts the named volume open-web-ui into the container.
3
Environment variables are stored in a local file.
4
Defines the external domain name.
5
This is the internal container port that traefik will route to.

There are broadly two Traefik-related configuration sets defined in the YAML file. The first is the HTTP Router, consisting of four labels. Note the name of the router is defined after traefik.http.routers, i.e. openwebui-secure. Likewise, the service name is defined after traefik.http.services, i.e. openwebui. Note also the Host definition of chatty-cheetah.curiodata.pro, which means we need to manually create a Domain Name Service (DNS) record (typically A for IPv4 or AAAA for IPv6) that points chatty-cheetah.curiodata.pro to the IP address of the VPS.
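The record itself is created in the DNS provider's control panel, but in zone-file terms it looks like the following (the IP address below is a documentation placeholder, not the real VPS address):

```
; A record pointing the subdomain at the VPS
chatty-cheetah.curiodata.pro.  3600  IN  A  203.0.113.10
```

Propagation can be checked with dig +short A chatty-cheetah.curiodata.pro, which should return the VPS address before Let's Encrypt can issue a certificate.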

# Run the compose-file
podman compose up

>>>> Executing external compose provider "/usr/bin/podman-compose". Please refer to the documentation for details. <<<<

podman-compose version: 1.0.6
['podman', '--version', '']
using podman version: 4.9.3
** excluding:  set()
['podman', 'ps', '--filter', 'label=io.podman.compose.project=openwebui', '-a', '--format', '{{ index .Labels "io.podman.compose.config-hash"}}']
podman volume inspect openwebui_open-web-ui || podman volume create openwebui_open-web-ui
['podman', 'volume', 'inspect', 'openwebui_open-web-ui']
['podman', 'network', 'exists', 'openwebui_default']
podman create --name=openwebui --label traefik.enable=true --label traefik.http.routers.openwebui-secure.entrypoints=https --label traefik.http.routers.openwebui-secure.rule=Host(`chatty-cheetah.curiodata.pro`) --label traefik.http.middlewares.openwebui-https-redirect.redirectscheme.scheme=https --label traefik.http.routers.openwebui.middlewares=openwebui-https-redirect --label traefik.http.routers.openwebui-secure.tls=true --label traefik.http.routers.openwebui-secure.tls.certresolver=lets-encrypt --label traefik.http.services.openwebui.loadbalancer.server.port=8000 --label io.podman.compose.config-hash=dddefaf790b0c9b737fbcc36b392bb3d14d3cd47a9ada4485ee635f57cfaddc7 --label io.podman.compose.project=openwebui --label io.podman.compose.version=1.0.6 --label PODMAN_SYSTEMD_UNIT=podman-compose@openwebui.service --label com.docker.compose.project=openwebui --label com.docker.compose.project.working_dir=/home/charlie/openwebui --label com.docker.compose.project.config_files=docker-compose.yaml --label com.docker.compose.container-number=1 --label com.docker.compose.service=openwebui --env-file /home/charlie/openwebui/.env -v openwebui_open-web-ui:/home/charlie/app/openwebui --net openwebui_default --network-alias openwebui -p 3000:3000 --hostname chatty-cheetah.curiodata.pro ghcr.io/open-webui/open-webui:git-852d9dc@sha256:1af9b461fea99b22678ab5e5e183cc7cfd5446e96482674fac97d0397905e1fa
a751f4919db6e2ea431259a1fc9629098e757d151853943644015d47a25b296e
exit code: 0
podman start -a openwebui
[openwebui] | Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
[openwebui] | Generating WEBUI_SECRET_KEY
[openwebui] | Loading WEBUI_SECRET_KEY from .webui_secret_key
[openwebui] | /app/backend/open_webui
[openwebui] | /app/backend
[openwebui] | /app
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade  -> 7e5b5dc7342b, init
INFO  [alembic.runtime.migration] Running upgrade 7e5b5dc7342b -> ca81bd47c050, Add config table
INFO  [alembic.runtime.migration] Running upgrade ca81bd47c050 -> c0fbf31ca0db, Update file table
INFO  [alembic.runtime.migration] Running upgrade c0fbf31ca0db -> 6a39f3d8e55c, Add knowledge table
INFO  [alembic.runtime.migration] Running upgrade 6a39f3d8e55c -> 242a2047eae0, Update chat table
INFO  [alembic.runtime.migration] Running upgrade 242a2047eae0 -> 1af9b942657b, Migrate tags
INFO  [alembic.runtime.migration] Running upgrade 1af9b942657b -> 3ab32c4b8f59, Update tags
INFO  [alembic.runtime.migration] Running upgrade 3ab32c4b8f59 -> c69f45358db4, Add folder table
INFO  [alembic.runtime.migration] Running upgrade c69f45358db4 -> c29facfe716b, Update file table path
INFO  [alembic.runtime.migration] Running upgrade c29facfe716b -> af906e964978, Add feedback table
INFO  [alembic.runtime.migration] Running upgrade af906e964978 -> 4ace53fd72c8, Update folder table and change DateTime to BigInteger for timestamp fields
INFO  [alembic.runtime.migration] Running upgrade 4ace53fd72c8 -> 922e7a387820, Add group table
INFO  [alembic.runtime.migration] Running upgrade 922e7a387820 -> 57c599a3cb57, Add channel table
INFO  [alembic.runtime.migration] Running upgrade 57c599a3cb57 -> 7826ab40b532, Update file table
INFO  [alembic.runtime.migration] Running upgrade 7826ab40b532 -> 3781e22d8b01, Update message & channel tables
INFO  [open_webui.env] 'DEFAULT_LOCALE' loaded from the latest database entry
INFO  [open_webui.env] 'DEFAULT_PROMPT_SUGGESTIONS' loaded from the latest database entry
WARNI [open_webui.env]

WARNING: CORS_ALLOW_ORIGIN IS SET TO '*' - NOT RECOMMENDED FOR PRODUCTION DEPLOYMENTS.

INFO  [open_webui.env] Embedding model set: sentence-transformers/all-MiniLM-L6-v2
WARNI [langchain_community.utils.user_agent] USER_AGENT environment variable not set, consider setting it to identify your requests.
[openwebui] | Creating knowledge table
[openwebui] | Migrating data from document table to knowledge table
[openwebui] | Converting 'chat' column to JSON
[openwebui] | Renaming 'chat' column to 'old_chat'
[openwebui] | Adding new 'chat' column of type JSON
[openwebui] | Dropping 'old_chat' column
[openwebui] | Primary Key: {'name': None, 'constrained_columns': []}
[openwebui] | Unique Constraints: [{'name': 'uq_id_user_id', 'column_names': ['id', 'user_id']}]
[openwebui] | Indexes: [{'name': 'tag_id', 'column_names': ['id'], 'unique': 1, 'dialect_options': {}}]
[openwebui] | Creating new primary key with 'id' and 'user_id'.
[openwebui] | Dropping unique constraint: uq_id_user_id
[openwebui] | Dropping unique index: tag_id
[openwebui] |
[openwebui] |  ██████╗ ██████╗ ███████╗███╗   ██╗    ██╗    ██╗███████╗██████╗ ██╗   ██╗██╗
[openwebui] | ██╔═══██╗██╔══██╗██╔════╝████╗  ██║    ██║    ██║██╔════╝██╔══██╗██║   ██║██║
[openwebui] | ██║   ██║██████╔╝█████╗  ██╔██╗ ██║    ██║ █╗ ██║█████╗  ██████╔╝██║   ██║██║
[openwebui] | ██║   ██║██╔═══╝ ██╔══╝  ██║╚██╗██║    ██║███╗██║██╔══╝  ██╔══██╗██║   ██║██║
[openwebui] | ╚██████╔╝██║     ███████╗██║ ╚████║    ╚███╔███╔╝███████╗██████╔╝╚██████╔╝██║
[openwebui] |  ╚═════╝ ╚═╝     ╚══════╝╚═╝  ╚═══╝     ╚══╝╚══╝ ╚══════╝╚═════╝  ╚═════╝ ╚═╝
[openwebui] |
[openwebui] |
[openwebui] | v0.6.5 - building the best open-source AI user interface.
[openwebui] |
[openwebui] | https://github.com/open-webui/open-webui
[openwebui] |

Access the domain to open OpenWebUI:

Initial OpenWebUI screen

Click on the Get Started link to create the initial and admin user account:

Initial Account Setup

Press the “Create Admin Account” to set the initial details and view the changelog:

Welcome Screen

Currently, no models have been set:

Missing Models

Create OpenAI Account

As stated previously, by default OpenWebUI deploys the ollama [9] package to run open-source, local models such as Llama [10] from Meta (owner of Facebook). This has been disabled (given the lightweight VPS) using the environment variable ENABLE_OLLAMA_API=False, supplied via the local .env file referenced in the docker-compose.yaml definition. The intention is to use existing LLM services from a variety of providers. To test the system initially, we’ll use OpenAI; they currently provide an Application Programming Interface (API) with $5 of free credit.

There are many guides and videos on how to set up an OpenAI Developer Account and obtain an API key using the official documentation [11]. Once logged in, or after creating a new account, the following dashboard is visible:

OpenAI Platform Dashboard

Click on API Keys:

API Keys

Create new secret key:

Test Key

Generate and save the new key:

Save Key

Configure OpenWebUI

Access the Admin Panel menu by clicking on your username followed by the Settings tab:

Settings Menu Access

Select the Connections tab:

Connections Details

Copy the OpenAI key into the API Key field:

OpenAI Key

Hit Save and then click on Models. The application has automatically detected all the available models and enabled them. However, this can be quite unwieldy for a user, so it’s best to select a few models to test with:

OpenAI Models

To test, click on New Chat in the top left corner and enter a prompt:

Animation of Prompt and Response

Run OpenWebUI at Boot

As outlined in Part 2, we can orchestrate the openwebui container using systemd, the init and service manager native to Ubuntu. First, press Ctrl-C to stop the container running in the foreground, then remove the existing container as we intend to reuse the name: podman rm openwebui.

We’ll use the (now deprecated) podman generate systemd command to create a systemd unit file:

# Generate a systemd unit file for the openwebui container
podman generate systemd --name openwebui --new --restart-policy=always > ~/.config/systemd/user/openwebui.service

DEPRECATED command:
It is recommended to use Quadlets for running containers and pods under systemd.

Please refer to podman-systemd.unit(5) for details.

# Check contents of systemd file
cat ~/.config/systemd/user/openwebui.service

# container-openwebui.service
# autogenerated by Podman 4.9.3
# Wed Apr 23 15:11:48 UTC 2025

[Unit]
Description=Podman container-openwebui.service
Documentation=man:podman-generate-systemd(1)
Wants=network-online.target
After=network-online.target
RequiresMountsFor=%t/containers

[Service]
Environment=PODMAN_SYSTEMD_UNIT=%n
Restart=always
TimeoutStopSec=70
ExecStart=/usr/bin/podman run \
        --cidfile=%t/%n.ctr-id \
        --cgroups=no-conmon \
        --rm \
        --sdnotify=conmon \
        -d \
        --replace \
        --name=openwebui \
        --label traefik.enable=true \
        --label traefik.http.routers.openwebui-secure.entrypoints=https \
        --label traefik.http.routers.openwebui-secure.rule=Host(`chatty-cheetah.curiodata.pro`) \
        --label traefik.http.routers.openwebui-secure.tls=true \
        --label traefik.http.routers.openwebui-secure.tls.certresolver=lets-encrypt \
        --label traefik.http.services.openwebui.loadbalancer.server.port=8080 \
        --label io.podman.compose.config-hash=5cf9439501725afa1039e3d17f32a39c5f17720e661bd3baa662bb85e6d925cc \
        --label io.podman.compose.project=openwebui \
        --label io.podman.compose.version=1.0.6 \
        --label PODMAN_SYSTEMD_UNIT=podman-compose@openwebui.service \
        --label com.docker.compose.project=openwebui \
        --label com.docker.compose.project.working_dir=/home/charlie/openwebui \
        --label com.docker.compose.project.config_files=docker-compose.yaml \
        --label com.docker.compose.container-number=1 \
        --label com.docker.compose.service=openwebui \
        --env-file /home/charlie/openwebui/.env \
        -v openwebui_open-web-ui:/home/charlie/app/openwebui \
        --net openwebui_default \
        --network-alias openwebui \
        -p 3000:8080 \
        --hostname chatty-cheetah.curiodata.pro ghcr.io/open-webui/open-webui:git-852d9dc@sha256:1af9b461fea99b22678ab5e5e183cc7cfd5446e96482674fac97d0397905e1fa
ExecStop=/usr/bin/podman stop \
        --ignore -t 10 \
        --cidfile=%t/%n.ctr-id
ExecStopPost=/usr/bin/podman rm \
        -f \
        --ignore -t 10 \
        --cidfile=%t/%n.ctr-id
Type=notify
NotifyAccess=all

[Install]
WantedBy=default.target

Check the status of the new service:

# Check the status of the openwebui service
systemctl --user status openwebui.service
 openwebui.service - Podman container-openwebui.service
     Loaded: loaded (/home/charlie/.config/systemd/user/openwebui.service; disabled; preset: enabled)
     Active: inactive (dead)
       Docs: man:podman-generate-systemd(1)

# Start it
systemctl --user start openwebui.service

# Check status
systemctl --user status openwebui.service

 openwebui.service - Podman container-openwebui.service
     Loaded: loaded (/home/charlie/.config/systemd/user/openwebui.service; disabled; preset: enabled)
     Active: active (running) since Wed 2025-04-23 15:15:11 UTC; 22s ago
       Docs: man:podman-generate-systemd(1)
   Main PID: 8609 (conmon)
      Tasks: 13 (limit: 4540)
     Memory: 4.1M (peak: 15.4M)
        CPU: 679ms
     CGroup: /user.slice/user-1000.slice/user@1000.service/app.slice/openwebui.service
             ├─8596 rootlessport
             ├─8601 rootlessport-child
             └─8609 /usr/bin/conmon --api-version 1 -c 832167da10df24f14a05c577ffa3e3d95e66194b5e2b668124dfee10cdf18a>

Apr 23 15:15:21 chatty-cheetah openwebui[8609]: INFO  [alembic.runtime.migration] Running upgrade 57c599a3cb57 -> 782>
Apr 23 15:15:21 chatty-cheetah openwebui[8609]: INFO  [alembic.runtime.migration] Running upgrade 7826ab40b532 -> 378>
Apr 23 15:15:21 chatty-cheetah openwebui[8609]: INFO  [open_webui.env] 'DEFAULT_LOCALE' loaded from the latest databa>
Apr 23 15:15:21 chatty-cheetah openwebui[8609]: INFO  [open_webui.env] 'DEFAULT_PROMPT_SUGGESTIONS' loaded from the l>
Apr 23 15:15:21 chatty-cheetah openwebui[8609]: WARNI [open_webui.env]
Apr 23 15:15:21 chatty-cheetah openwebui[8609]:
Apr 23 15:15:21 chatty-cheetah openwebui[8609]: WARNING: CORS_ALLOW_ORIGIN IS SET TO '*' - NOT RECOMMENDED FOR PRODUC>
Apr 23 15:15:21 chatty-cheetah openwebui[8609]:
Apr 23 15:15:23 chatty-cheetah openwebui[8609]: INFO  [open_webui.env] Embedding model set: sentence-transformers/all>
Apr 23 15:15:26 chatty-cheetah openwebui[8609]: WARNI [langchain_community.utils.user_agent] USER_AGENT environment v>
lines 1-23/23 (END)

# Enable at boot
systemctl --user enable openwebui.service

Created symlink /home/charlie/.config/systemd/user/default.target.wants/openwebui.service → /home/charlie/.config/systemd/user/openwebui.service.

The container should now automatically start after a reboot of the host and/or recover from an error. Note that user-level systemd services only start at boot if lingering has been enabled for the account (loginctl enable-linger); otherwise they start at login.
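As the deprecation warning suggests, newer Podman releases prefer Quadlets over podman generate systemd. A sketch of a roughly equivalent Quadlet unit, assuming Podman 4.4 or later and reusing the names from the compose file above, might look like:

```
# ~/.config/containers/systemd/openwebui.container
[Unit]
Description=OpenWebUI container
Wants=network-online.target
After=network-online.target

[Container]
Image=ghcr.io/open-webui/open-webui:main
ContainerName=openwebui
HostName=chatty-cheetah.curiodata.pro
EnvironmentFile=/home/charlie/openwebui/.env
Volume=openwebui_open-web-ui:/home/charlie/app/openwebui
Label=traefik.enable=true
Label=traefik.http.routers.openwebui-secure.entrypoints=https
Label=traefik.http.routers.openwebui-secure.rule=Host(`chatty-cheetah.curiodata.pro`)
Label=traefik.http.routers.openwebui-secure.tls=true
Label=traefik.http.routers.openwebui-secure.tls.certresolver=lets-encrypt
Label=traefik.http.services.openwebui.loadbalancer.server.port=8080

[Service]
Restart=always

[Install]
WantedBy=default.target
```

After systemctl --user daemon-reload, Podman generates an openwebui.service that behaves like the hand-generated unit above, without relying on a deprecated command.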

Access Multiple LLMs with Open Router

The OpenAI developer service permits the user to access multiple LLMs from the same vendor via its API. Typically, the minimum spend is $10 USD, but access is by design limited to models from that single vendor. If the user repeats this process with other model providers such as Anthropic [12] or Mistral [13], they have to spend an additional $10 USD each time. This is not ideal for a user who wants to test multiple models from different vendors.

Secondly, the user is dependent on the availability of that single vendor, despite the same models often being available from multiple providers: if the vendor is unavailable, overloaded or simply unresponsive, the service is unavailable to the user. And finally, tools such as OpenWebUI are limited to an OpenAI-compatible API structure. It is for these reasons that tools such as the open-source LiteLLM [5] and the hosted Open Router were created. They provide access to multiple models from different vendors, with optional load balancing, behind a common OpenAI-compatible API.

This guide will focus on Open Router, which provides access to a wide range of models from multiple providers, along with interesting usage metrics.

Open Router Account

The site has a simple account sign-up process. The home page shows some of the most popular models available:

Open Router Home Page

Clicking on an individual model shows a list of providers, their performance and pricing:

Model Providers and Pricing

Due to high demand, Open Router directs traffic (on a temporary basis) to Google Vertex, which one could argue is a competitor. This demonstrates the routing logic applied, where the most available provider is used to serve users. Note, however, that the Max Output is limited to 64K tokens, relative to 128K from Amazon Bedrock and Anthropic itself.

Open Router uses a credit system based on USD and requires users to top up their account. These credits are then used to pay the model providers and to cover load-balancing and routing costs. Users can also provision their own API keys for a given provider, in which case Open Router charges around 5% of the token cost for pure routing. Since the credit system primarily pays the LLM providers, Open Router makes its money from additional fees, which range from around 7-10% of the top-up and can seem a bit much. However, in our experience this is offset by how cheap tokens are via an API and by the reliability of an externally managed service. Although LiteLLM is an alternative approach, it is difficult to emulate the range of models and the experience of Open Router.

OpenWebUI Integration

From the dashboard, click on the user profile and select API Keys:

Open Router API Keys

Follow the Create a Key wizard and enter some appropriate details:

API Key Details

Press Create and copy the details:

API Key

Return to the OpenWebUI Admin Panel and select the Connections tab and Add Connection:

Add Connection

Note the URL is set to https://openrouter.ai/api/v1. Under Model IDs, enter the model identifiers from the Open Router site, for example anthropic/claude-3-haiku and anthropic/claude-3-sonnet:

Select Model IDs
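The connection can also be sanity-checked outside OpenWebUI with a direct request to the Open Router endpoint. A minimal sketch, assuming jq is installed and OPENROUTER_API_KEY holds the key created above:

```shell
# Build an OpenAI-compatible chat completion request body with jq
# (model ID as listed on the Open Router site)
BODY=$(jq -n '{model: "anthropic/claude-3-haiku",
               messages: [{role: "user", content: "Hello!"}]}')
echo "$BODY"

# Send it to the Open Router endpoint (uncomment once a real key is exported):
# curl -s https://openrouter.ai/api/v1/chat/completions \
#   -H "Authorization: Bearer $OPENROUTER_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```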

Add the desired models individually and then save the configuration:

Selected Models

Return to the home screen to see the new models added:

Additional Models

Conclusion

This guide showed how to deploy the OpenWebUI application using Podman, with Traefik providing a secure reverse-proxy connection, as a frontend to a variety of LLMs. We also showed how to integrate multiple LLMs using the Open Router service to provide a more featureful and flexible LLM Service.

Version History

  • 2025-05-17 - Original
  • 2025-05-26 - Image updates, RI introduction and text corrections.
  • 2025-08-28 - Removed old branding and added CDP assets.

Attribution

Images based on:

Images used in this post have been generated using multiple Machine Learning (or Artificial Intelligence) models and subsequently modified by the author.

Where ever possible these have been identified with the following symbol:

AI Generated Image Symbol

References

[1]
Open WebUI, “Open WebUI,” Open WebUI. Available: https://www.openwebui.com. [Accessed: Apr. 13, 2025]
[2]
“Podman.” Available: https://podman.io/. [Accessed: Mar. 22, 2025]
[3]
Traefik Labs, “Run APIs Easily. Anywhere. Traefik Labs,” Run APIs Easily. Anywhere. Traefik Labs. Available: https://traefik.io/. [Accessed: Mar. 22, 2025]
[4]
OpenRouter, Inc, “OpenRouter,” OpenRouter. Available: https://openrouter.ai. [Accessed: May 17, 2025]
[5]
Berrie AI Incorporated, “LiteLLM.” Available: https://www.litellm.ai/. [Accessed: Apr. 23, 2025]
[6]
“Chatbot Arena (formerly LMSYS): Free AI Chat to Compare & Test Best AI Chatbots.” Available: https://lmarena.ai/. [Accessed: May 17, 2025]
[7]
GitHub Inc, “Working with the Container registry,” GitHub Docs. Available: https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry. [Accessed: Apr. 22, 2025]
[8]
Bradley Jenkins, “Authenticate Podman to GitHub Container Registry - breadNET Documentation,” breadNET Documentation. Jan. 2025. Available: https://documentation.breadnet.co.uk/kb/podman/authenticate-podman-to-ghcr/. [Accessed: Apr. 22, 2025]
[9]
Ollama, “Ollama.” Available: https://ollama.com. [Accessed: Apr. 22, 2025]
[10]
Meta, “Llama,” Llama. Available: https://www.llama.com/. [Accessed: Apr. 22, 2025]
[11]
Open AI, “Overview - OpenAI API.” Available: https://platform.openai.com. [Accessed: Apr. 22, 2025]
[12]
Anthropic, “Home \ Anthropic.” Available: https://www.anthropic.com/. [Accessed: Apr. 23, 2025]
[13]
Mistral AI, “Mistral AI Frontier AI in your hands.” Available: https://mistral.ai/. [Accessed: Apr. 23, 2025]

Citation

BibTeX citation:
@online{miah2025,
  author = {Miah, Ashraf},
  title = {*ChatGPT* {Alternative:} {Part} 3 - {Open} {Source} {Large}
    {Language} {Model} {(LLM)} {Service} {Interface}},
  date = {2025-05-17},
  url = {https://blog.curiodata.pro/posts/14-openwebui-openrouter/},
  langid = {en}
}
For attribution, please cite this work as:
A. Miah, “*ChatGPT* Alternative: Part 3 - Open Source Large Language Model (LLM) Service Interface,” May 17, 2025. Available: https://blog.curiodata.pro/posts/14-openwebui-openrouter/