
Run Your Own Local ChatGPT: Ollama WebUI 

NeuralNine
348K subscribers
58K views

Today we learn how we can run our own ChatGPT-like web interface using Ollama WebUI.
Ollama: github.com/ollama/ollama
Ollama WebUI: github.com/ollama-webui/ollam...
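
For anyone who wants to script against the same backend the WebUI talks to, here is a minimal sketch (not shown in the video), assuming Ollama is running locally on its default port 11434 and a model such as llama2 has already been pulled:

```python
# Minimal sketch: query a locally running Ollama server directly.
# Assumes Ollama listens on the default port 11434 and that a model
# (e.g. llama2) has already been pulled via `ollama pull llama2`.
import requests

OLLAMA_URL = "http://localhost:11434"

# List the models Ollama currently has installed
tags = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10).json()
print([m["name"] for m in tags.get("models", [])])

# Ask llama2 a question (stream=False returns a single JSON object)
answer = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": "llama2", "prompt": "Explain Docker in one sentence.", "stream": False},
    timeout=120,
).json()
print(answer["response"])
```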
◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾
📚 Programming Books & Merch 📚
🐍 The Python Bible Book: www.neuralnine.com/books/
💻 The Algorithm Bible Book: www.neuralnine.com/books/
👕 Programming Merch: www.neuralnine.com/shop
💼 Services 💼
💻 Freelancing & Tutoring: www.neuralnine.com/services
🌐 Social Media & Contact 🌐
📱 Website: www.neuralnine.com/
📷 Instagram: / neuralnine
🐦 Twitter: / neuralnine
🤵 LinkedIn: / neuralnine
📁 GitHub: github.com/NeuralNine
🎙 Discord: / discord

Science

Published: 13 Feb 2024

Comments: 58
@joekustek2623 · 1 month ago
When I open the WebUI in the browser and log in, there are no models, even though I have Ollama installed with llama2. How do I get my models to show up in the web UI? Thanks
@benjaminbalassa9096 · 1 month ago
I'm struggling with the same issue :(
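A hedged troubleshooting sketch for the "no models show up" issue above (not from the video): it checks that the Ollama server itself is reachable and lists whatever models it has. If the list is empty, pull a model first; if the request fails and the WebUI runs in Docker, the container may simply not be able to reach Ollama on the host.

```python
# Hedged diagnostic sketch: verify that Ollama is reachable and has models.
# An empty list means the WebUI has nothing to display; a connection error
# usually means the WebUI container cannot reach the Ollama server.
import requests

try:
    tags = requests.get("http://localhost:11434/api/tags", timeout=5).json()
except requests.RequestException as err:
    print(f"Ollama API not reachable: {err}")
else:
    models = [m["name"] for m in tags.get("models", [])]
    if models:
        print("Installed models:", models)
    else:
        print("No models installed yet - run `ollama pull llama2` first.")
```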
@alexanderv851 · 4 months ago
Awesome content as always, Mr. Maximilian
@Jenko022 · 4 months ago
Thank you for this, it's fantastic. Would you be able to demonstrate installing a local LLM to query your own documents? I have come across a number of tutorials for this but had no success running them.
@louiscklaw · 4 months ago
Hi, can you share which video card you are using for this demo?
@BlueBearOne · 2 months ago
Awesome. Thank You
@ikik1648 · 2 months ago
Hey, so I'm trying to write an Alexa task that will provide a conversational UI with an offline LLM (my use case is crisis relief workers in areas with limited / downed connectivity). Would the VoiceGPT extension work with Ollama WebUI? Also, is there a risk rating for wrong results for the lighter 2B-or-less models?
@cesoirg · 4 months ago
Love it! Do you mind sharing the hardware list of your desktop/laptop running llama2? The speed looks great in your demo. Thanks!
@NeuralNine · 4 months ago
32GB of RAM, AMD Ryzen 7 5800, Nvidia GeForce RTX 3060 Ti, SSD Hard Drive
@KevinArikkatt · 4 months ago
@NeuralNine We've got similar specs, just that yours is a desktop and mine is a laptop 😆😅😫☠
@ya5z · 2 months ago
@NeuralNine I'm on a laptop, and the specs: 8 GB of RAM, Intel i5-10210, no GPU (🙂), 256 GB SSD
@anshulsingh8326 · 1 month ago
How did you download the model and where did you place it? For me it's blank, since I didn't download any model.
@dragonsage6909 · 4 months ago
Cool, thx :)
@amanreddypundru8933 · 3 months ago
Hi, can we deploy this model with the UI on any platform like GitHub or something else?
@AlirezaMirhabibi · 2 months ago
Very useful. I want to set up an LLM like this on my own HP G8 server to use in my other Python project to generate descriptions automatically. Is there any way to connect Ollama to my Python project (I need to use the Ollama API)?
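A minimal sketch of one way to do this (not from the video), assuming the `ollama` Python package (`pip install ollama`) and an Ollama server that already has llama2 pulled; `generate_description` is just a hypothetical helper name for illustration. If Ollama runs on a different machine such as a server, `ollama.Client(host=...)` can point at it instead of the default localhost.

```python
# Hedged sketch: call a local Ollama model from Python via the official
# `ollama` package (pip install ollama). Assumes llama2 is already pulled.
import ollama

def generate_description(product_name: str) -> str:
    """Ask the local model for a short product description (hypothetical helper)."""
    reply = ollama.chat(
        model="llama2",
        messages=[{"role": "user", "content": f"Write a short product description for {product_name}."}],
    )
    return reply["message"]["content"]

if __name__ == "__main__":
    print(generate_description("a stainless steel water bottle"))
```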
@devagarwal3250 · 4 months ago
Please make a video on how we can fine-tune an open-source model.
@rodneiaguiar22 · 2 months ago
Thanks!
@Dz-Hub-ll5tr · 4 months ago
Thanks for sharing 👍 ...same installation steps to set up on cloud instances? 👨🏽‍💻
@ghostandry8789 · 1 month ago
Hey there, I'm asking if I can remove the register button, because mine is a private AI and I don't want other people using my PC as an AI. Can you help me?
@joekustek2623 · 1 month ago
How did you get the models to show when you opened it? Mine is completely empty. You did not show how you did that.
@user-tl9wq8nv9w · 3 months ago
Does this work with the Azure OpenAI API?
@Noobinski · 2 months ago
I am new to Docker, containers and Linux commands. Please help me convert Ollama from using the CPU (I had an AMD GPU) to Nvidia, which I switched to after installing Ollama and the models... at the moment it still uses the CPU only. Any ideas?
@Electrox3d · 3 months ago
Mine looks different: it installed, but the icon is OI, not a llama. I can't load Llama LLMs. Hmm...
@MrStellateWaffle · 3 months ago
Where is the link to the Docker website?! How am I supposed to do anything if the link isn't even there?!
@Al_Miqdad_ · 4 months ago
Hello, can I make this control my network and ask it about it to get the information locally?
@RakibHasan-hs1me · 4 months ago
Good Thinking
@CodeWithArpitTech · 4 months ago
good
@daveys · 4 months ago
Maximillian, cool name.
@NeuralNine · 4 months ago
Just not mine :D
@daveys · 4 months ago
@NeuralNine - That's hallucination for you
@guy.incognito · 3 months ago
@NeuralNine Classic Maximilian!
@yasiruperera587 · 2 months ago
Can you do a video on how to install it on one computer and access it via Wi-Fi?
@mlg4035 · 4 months ago
The Windows version is out now.
@Aquaa-nL · 2 months ago
Is Ollama better than Jan? It seems like no one is talking about it
@dipeshsamrawat7957 · 1 month ago
How to train this local AI on my dataset???
@1457Davi · 4 months ago
If I run it on WSL, can I access it on Windows?
@anuraagkhare1995 · 4 months ago
Yes
@damadorpl · 4 months ago
Yes, it works on WSL, but it's better to run it via Docker on Windows - better performance.
@giovannicordova4803 · 4 months ago
How good is Ollama compared to GPT-4?
@NeuralNine · 4 months ago
Ollama itself is just the app / platform. It depends on the model you use. Check out the LMSYS leaderboard for a comparison
@giovannicordova4803 · 4 months ago
Thank you @NeuralNine
@NextGenSellPOS · 4 months ago
How to use it with Python?
@tacorevenge87 · 2 months ago
What do you mean? Do you want to call Ollama models from Python?
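If calling Ollama from Python is the goal, a hedged sketch of streaming output (assuming `pip install ollama` and a local server with llama2 pulled) looks roughly like this; tokens are printed as they arrive instead of waiting for the full answer:

```python
# Hedged sketch: stream a chat reply from a local Ollama server token by token.
import ollama

stream = ollama.chat(
    model="llama2",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    stream=True,
)
for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
print()
```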
@MartinGaertner · 2 months ago
It doesn't work! When I install Ollama and then use Docker with Open WebUI, everything installs fine. Ollama in my terminal works perfectly, and the Docker container with Open WebUI runs. Now tell me, what must we do to use llama2:latest? It is not usable in Open WebUI! I can't see llama2 or any other model! You don't show this step in the video, and that sucks!
@Horatius_444 · 1 month ago
Very bad at explaining.
@HiddenInformation10 · 3 months ago
Terrible tutorial. I'm not interested in your life or in you showing WHAT YOU ALREADY DID. SHOW HOW IT IS INSTALLED from scratch, or don't even bother recording your screen and camera.
@podunkman2709 · 4 months ago
Guys, why do you show us such things? What's the point of using this software locally on a PC if there are professional services on the market such as GPT or Gemini? Who in their right mind would install this on their computer for such purposes? Show us something that MAKES SENSE. For example, how to build a knowledge base using this model, how to search a local database, how to create a search engine for content in documents, and so on. I would have to lose my mind to replace GPT with Ollama to use it as a chatbot.
@Andy-cr2nn · 4 months ago
Because for organizations that have confidential documents, you can't simply plug them into ChatGPT or Gemini. This is a solution if you want to have an internal LLM that's private and can interact with your own private data.
@graphguy · 4 months ago
Are you 12? There are tons of reasons for running it locally. Start with security.
@user-gn7oi4wi2n · 3 months ago
I am struggling to create a knowledge base using these models. Any good guide?
@ArthurMartins-jw8fq · 3 months ago
Are u dumb? 😂😂😂
@tsoupakis · 3 months ago
@graphguy They're probably trolling...