Ollama Web UI 🤯 How to run LLMs 100% LOCAL in EASY web interface? (Step-by-Step Tutorial) 

Mervin Praison
39K subscribers
25K views

Published: 23 Aug 2024

Comments: 50
@matthewbond375 · 7 months ago
Thank you! Got Dolphin Mixtral 8x7b going on my gaming laptop w/ GPU support! This stuff is wild!
@edwardleyco9880 · 4 months ago
How'd you do that? I'm trying to add GPU support on mine but it won't work.
@matthewbond375 · 4 months ago
@edwardleyco9880 That's a lot to put in an answer, but I had no trouble with Ollama detecting my NVIDIA GPU without tinkering. I can't speak for Windows. Make sure you have the CUDA toolkit installed as well.
@bernardobrito09 · 3 months ago
@edwardleyco9880 Same.
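For the GPU thread above: a minimal sketch of GPU-enabled Ollama in Docker, assuming an NVIDIA card with the driver and the NVIDIA Container Toolkit already installed (this mirrors the official ollama/ollama image instructions, not steps shown in the video):

  # Start the Ollama server with all GPUs passed through
  docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

  # Confirm the GPU is visible inside the container
  docker exec -it ollama nvidia-smi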
@eroshoxhallari4284 · 4 months ago
Perfect! Simple! Easy! 10/10👍
@lifepath7741 · 25 days ago
Fuck your smart brain!!!!!!!!!!!!!
@JepserUllmann · 7 months ago
Thank you, Mervin. Very helpful video. 👍 Keep going! 🚀
@MervinPraison · 7 months ago
Thank you
@markyoung01maccom · 8 months ago
Brilliant video, well done. Subscribed immediately.
@TheRaginghalfasian · 5 months ago
I had a lot of problems and frustrations trying to get this working... but now I'm getting it installed by using an older version of Debian; the new one keeps having too many problems for me.
@lifepath7741 · 25 days ago
Stupid great teachers... I agree with you 100%.
@surajthakkar3420 · 8 months ago
Hello Mervin, thank you for the amazing tutorial. Just wanted to check if this would work on Windows. Could you please confirm?
@diogenesthecynic9951 · 6 months ago
Never mind, I figured it out... a Docker issue, but thanks! Great video.
@priyacrypto · 5 months ago
Hi Mervin, great video, thanks for this. Would you be able to provide some info on how to extend Open WebUI (add graphs/charts, etc.)? Also, does it keep any logs for users, thumbs-ups, etc.?
@user-uv3nv2bc6v · 6 months ago
SUPER great video! Question: Can I modify the look and feel of Ollama WebUI? Can I add my own RAG code to work with my local files?
@codelucky · 3 months ago
How do I enable GPU support? I have an RTX 4080.
@palashmandal7703 · 6 days ago
How do I host the same on an AWS instance?
@faiqarsheikh3220 · 1 month ago
Which command prompt did you open?
@chaithanyavamshi2898 · 9 months ago
Wow! Very helpful and valuable content... I followed the same instructions but had issues installing on Windows, even with Docker. Can you point me to more resources, or please make a video on how to install Ollama Web UI on Windows? Most users are on Windows.
@HyperUpscale · 8 months ago
Yes, to use it on Windows you have to use WSL ;) Don't use Docker; use WSL. It is a built-in capability to run Linux within Windows.
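A minimal sketch of that WSL route, assuming Windows 10/11 with WSL2 and the default Ubuntu distribution (the install-script URL below is Ollama's official one at the time of writing):

  # In an elevated PowerShell on Windows: install WSL with Ubuntu
  wsl --install

  # Inside the WSL/Ubuntu shell: install Ollama via the official script
  curl -fsSL https://ollama.com/install.sh | sh

  # Pull a model and chat to verify the setup
  ollama run mistral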
@saintsscholars8231 · 7 months ago
Great video, thanks!
@kritsana6170 · 4 months ago
Thank you.
@AlloMission · 9 months ago
This is huge!
@hotmonitor · 4 months ago
How do I uninstall a large LLM file? Where is the directory for LLMs?
@Nexus3NL · 4 months ago
Is it possible to talk with Ollama in your own native language?
@jackflash6377 · 3 months ago
Why use Docker?
@diogenesthecynic9951 · 6 months ago
How would I increase the available RAM allocation for the Ollama setup? I have a 32GB system running Ubuntu desktop Linux and Docker Desktop.
@Jeganbaskaran · 9 months ago
Could you please let me know how to load data sources other than files, e.g., a web API, a database, etc.?
@ajmalbakhshiamirpoor1343 · 6 months ago
For me the server is running, and when I browse to localhost:3000 I get to the login, but then I get a blank screen. Do you have any idea why I get this issue?
@Bigjuergo · 6 months ago
I want to run it on an Android phone. How?
@brolendario · 1 month ago
Use ngrok to share your localhost.
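A minimal sketch of that ngrok approach, assuming ngrok is installed and Open WebUI is listening on its default port 3000:

  # Expose the local web UI at a temporary public URL
  ngrok http 3000
  # Note: anyone with the URL can reach the UI, so keep authentication enabled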
@nufh · 9 months ago
I'm a Windows user, so I can't test it yet. Can we use TTS to converse with it?
@RedSpiritVR · 7 months ago
Now I just need to be able to run this over Wi-Fi so I can use it on my phone while my PC is running. OK, so I've got Ollama running, but it takes like 10 minutes for the AI to respond to my queries. Did I set up something wrong?
@codelucky · 3 months ago
Enable GPU support, and use ngrok to host it as a server so you can use it from your phone.
@WSH3TM · 8 months ago
Mine just runs for a while but doesn't output anything. The loading blur thing just shows, like at 3:48.
@MervinPraison · 8 months ago
Please make sure you have connected to the Ollama server URL correctly, as described in the video (Settings). Also download the model by running "ollama run mistral".
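For reference, the two checks from the reply above, assuming a default local install:

  # Verify the Ollama server is reachable at its default URL
  curl http://localhost:11434
  # Expected response: "Ollama is running"

  # Download the Mistral model (this also opens an interactive chat)
  ollama run mistral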
@hawks900 · 9 months ago
How can I deploy it to use it via a URL?
@mistercakes · 9 months ago
Do you know if anyone has tried to replicate the function calling feature from OpenAI, but with a local model like Mistral?
@mistercakes · 9 months ago
I found a way to do it; I'll try to do a write-up. :)
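The commenter never posted the write-up here, so the following is only one plausible approach, not necessarily theirs: use Ollama's JSON mode to make a local model emit a structured "function call" (the get_weather schema below is a made-up example):

  # Ask Mistral for a JSON tool call via Ollama's REST API
  curl http://localhost:11434/api/generate -d '{
    "model": "mistral",
    "prompt": "Call the weather function for Paris. Reply only with JSON of the form {\"name\": \"get_weather\", \"arguments\": {\"city\": \"...\"}}",
    "format": "json",
    "stream": false
  }'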
@prophetsamuel8570 · 7 months ago
What's the VGA card?
@eneso8657 · 9 months ago
Is it possible to make it public?
@user-mv5ly6ps5t · 6 months ago
Bees and those that consume the honey, remember one thing: if it's not consistent, what's the point?
@MAzurekkk1 · 6 months ago
Step by step, but for which system? You cut all the elements that could help identify the operating system. We are not blind; I think no one needs a 500% zoom on the terminal. The video is one of the better ones, but it burned my retinas with overly visible pixels.
@SAVONASOTTERRANEASEGRETA · 8 months ago
Why isn't it available for Windows?
@AINMEisONE · 7 months ago
Fake, it does not work. I tried these instructions so many times and always got errors. I swore that for one second I saw the app work, but when you try to follow the instructions here, nothing works. I tried verbatim to get this to work.
@anonymousCAT420 · 6 months ago
Just because it won't work for you doesn't mean it is fake 🤦‍♂️🤦‍♂️🤦‍♂️
@nickalika · 3 months ago
⚠ MAC ⚠ ⛔ cd ollama-webui → ✅ cd open-webui
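That mismatch comes from the project rename (ollama-webui became open-webui); a sketch of the corrected clone step, assuming the project's current GitHub location:

  # Clone the renamed repository and enter the matching directory
  git clone https://github.com/open-webui/open-webui.git
  cd open-webui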