
Run Local ChatGPT & AI Models on Linux with Ollama 

KeepItTechie
80K subscribers · 52K views
Published: 31 Oct 2024

Comments: 33
@LabEveryday 8 months ago
Good stuff family! I’m going to try this out!
@KeepItTechie 8 months ago
Fasho fam! Salute!
@walt_the_dolt 9 months ago
This makes me hopeful for the future of computing. Being able to host these services locally is fantastic.
@KeepItTechie 9 months ago
Absolutely, it's an exciting time for computing! Hosting services locally offers so much control and flexibility. 💻
@hopelessdecoy 9 months ago
@KeepItTechie and you don't have to worry about feeding someone else's model your data!
@darkvertigo 8 months ago
Thanks for the content! Great and clear instructions. I was able to get it up and running with my 4060. Keep the great vids coming!
@marioandresheviacavieres1923 9 months ago
Thank you very much, Josh! Your video was very helpful for me. Thanks!
@Bitnative 8 months ago
Josh, I followed up on my comment to your Twitter post for this video. I got Ollama containerized and have added the breezy Golang UI to it. I still need to set up Docker GPU passthrough in order to speed it up. The tokens come real slow right now. LOL My next step is to train it with my own data. This video is good stuff bro, thanks.
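For anyone else stuck on the GPU passthrough step that comment mentions, the official Ollama Docker image supports `--gpus=all` once the NVIDIA Container Toolkit is installed. A sketch, assuming an NVIDIA card and that the toolkit package is already present (the container name and volume name here are just my choices):

```shell
# Register the NVIDIA runtime with Docker, then restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Run Ollama with all GPUs exposed; the named volume persists
# downloaded models, and 11434 is Ollama's default API port
docker run -d --gpus=all -v ollama:/root/.ollama \
  -p 11434:11434 --name ollama ollama/ollama

# Quick sanity check from inside the container
docker exec -it ollama ollama run llama3 "say hello"
```

If tokens are still slow after this, `docker exec -it ollama nvidia-smi` is a quick way to confirm the container actually sees the GPU.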
@KeepItTechie 8 months ago
Awesome! Thank you!
@jeffrisdon2803 9 months ago
Very cool! Thank you! I added a Tesla P4 card and passed it through in Proxmox to a Debian VM. Works very well!
@DavidOgletree 9 months ago
How much can you train a model?
@techienoir7614 9 months ago
Just installed it. Thanks bro!
@KeepItTechie 9 months ago
No problem 👍
@danwaterloo3549 8 months ago
Thanks! This is very useful. I'm going to try this out.
@KeepItTechie 8 months ago
Glad it was helpful!
@d3mist0clesgee12 4 months ago
Great stuff, bro!
@eriksmith1280 3 days ago
Can you run any of these models on a Live Kali Linux USB bootable thumb-drive ?
@Dapper_Danny 2 months ago
If anyone is having issues connecting to your webui, make sure you have port 8080 open on your Linux machine.
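Building on that tip: on ufw-based distros (Ubuntu, Debian derivatives) opening the port is one command. This assumes the webui is on its common default of 8080; adjust if you mapped it elsewhere:

```shell
# Allow inbound TCP on 8080 through ufw
sudo ufw allow 8080/tcp
sudo ufw reload

# On firewalld-based distros (Fedora/RHEL), the equivalent is:
# sudo firewall-cmd --add-port=8080/tcp --permanent
# sudo firewall-cmd --reload

# Verify something is actually listening on the port
ss -tlnp | grep 8080
```

If `ss` shows the service bound to `127.0.0.1:8080` rather than `0.0.0.0:8080`, the firewall isn't the problem; the webui is only listening on localhost.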
@UrRealestCritic 6 months ago
Good content bro 😎
@SailD-tt5gx 8 months ago
How do I get a model offline? I don't have an internet connection on my Linux machine. Is there any way to get a model like llama2 manually, as a zip, offline?
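One approach to that question, based on where Ollama stores its models on disk: pull the model on any machine that does have internet, then carry the model directory over on a USB drive. A sketch (paths assume a per-user install; a systemd service install keeps models under `/usr/share/ollama/.ollama/models` instead):

```shell
# On an internet-connected machine with Ollama installed:
ollama pull llama2

# Pack up the model store (blobs + manifests live under ~/.ollama/models)
tar czf llama2-models.tar.gz -C ~/.ollama models

# Move the archive to the offline machine (USB drive, etc.), then:
tar xzf llama2-models.tar.gz -C ~/.ollama
ollama list   # the model should now appear without any network access
</imports>
```

There's no official zip download, so copying the model store like this is the usual workaround.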
@facistmonk 9 months ago
If I ignore the speed issues, can we run any model on any hardware?
@KeepItTechie 9 months ago
While you can technically run any model on any hardware, the real limitation comes from the hardware's capabilities, especially for complex models where powerful hardware is essential for practical execution times. Always match your model's requirements with your hardware's capabilities for the best experience.
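A practical way to apply that advice: a model needs roughly its on-disk size in free RAM (or VRAM) to run comfortably, so on modest hardware start with the smaller quantized tags. A sketch (the tags below are examples from the Ollama model library):

```shell
# Small tags for constrained hardware:
ollama pull llama3.2:1b   # ~1 GB class, usable on CPU-only machines
ollama pull llama3.2:3b   # still light; a mid-range GPU helps

# Check the on-disk size of everything you've pulled
ollama list

# Large tags (e.g. 70b-class models) want a high-VRAM GPU,
# or you'll be waiting a long time per token.
```

Comparing `ollama list` sizes against `free -h` output is a quick pre-flight check before committing to a big download.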
@chasbear2148 9 months ago
Thank God for that, I finally got one to work out of the box successfully.
@ChaosReignShow 9 months ago
Shout out to you, KeepItTechie. All well?
@KeepItTechie 9 months ago
Thanks bro!
@ChaosReignShow 9 months ago
@KeepItTechie yeah
@TheBuildersTable 9 months ago
Keepittechie. 🫡! THAGOD°♤°GOLD•P
@KeepItTechie 9 months ago
Gold P!! Salute bro! Thanks for stopping by!
@jackal6902 1 month ago
It’s not local 😂
@pradeepkumarkumar7230 8 months ago
Too complicated😂
@ZXGAMER22 9 months ago
No wonder newbies leave Linux it's too complicated
@hopelessdecoy 9 months ago
I don't think running LLMs is a good gauge of how complex an operating system is. There's one called GPT4All and it was just a one-click install, one-click model download. I'm pretty happy on Linux Mint, it's as complex as you want it to be!
@mikerollin4073 8 months ago
Every Linux user was a noob once. Some noobs quit... because they're not persistent, and that's it.