Explore the power of self-hosted language models with us on Easy Self Host! In this video, we demonstrate how to run Ollama with Open WebUI, creating a private, server-based chat environment similar to ChatGPT. We'll guide you through setting up Ollama and Open WebUI with Docker Compose, walk through the configuration details, and show how these tools give you stronger privacy and full control over your data. Whether you're on a modest setup or more powerful hardware, you'll see the performance firsthand. Don't miss our look at applications beyond chat, like note summarization in Memos. Subscribe for more self-hosted solutions, and find the configuration files on our GitHub, linked below!
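For reference, a minimal Docker Compose sketch along these lines (a hedged example, not the exact file from the video: the image names `ollama/ollama` and `ghcr.io/open-webui/open-webui:main` are the projects' published images, but the port mapping, volume names, and `OLLAMA_BASE_URL` wiring here are illustrative; see the GitHub link below for the real configuration):

```yaml
services:
  ollama:
    image: ollama/ollama            # Ollama API server, listens on 11434 inside the container
    volumes:
      - ollama:/root/.ollama        # persist downloaded models
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                 # browse to http://<server>:3000
    environment:
      # point the UI at the ollama service over the Compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data  # persist users, chats, settings
    depends_on:
      - ollama
    restart: unless-stopped

volumes:
  ollama:
  open-webui:
```

With this running (`docker compose up -d`), Open WebUI talks to Ollama over the internal Compose network, so only the web UI port needs to be exposed.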
00:04 Introduction
01:07 Tutorial to run Ollama and Open WebUI (Docker Compose)
03:18 Running Docker Compose on the Server
03:38 Start Chatting on Open WebUI
06:01 Integration with Memos (experimental)
🔗 Links:
Docker Compose file for this video: github.com/easyselfhost/self-...
Ollama: ollama.com
Open WebUI: openwebui.com/
My hack on Memos to support LLM: github.com/usememos/memos/com...
Memos video: • Self host your own Not...
May 6, 2024