
run AI on your laptop....it's PRIVATE!! 

NetworkChuck
4.1M subscribers
224K views

🔥🔥Join the NetworkChuck Academy!: ntck.co/NCAcademy
☕☕ COFFEE and MERCH: ntck.co/coffee
#AI #aiserver #ollama #llama3

Science

Published: 21 May 2024

Comments: 192
@WiSPMusic. 2 months ago
I love it! Except for one thing... You might want to mention that 8GB of RAM is not nearly enough. It kind of killed my Apple M2.
@user-jn3xc9tv3j 2 months ago
Your GPU is the issue, not your RAM
@JBB685 2 months ago
@user-jn3xc9tv3j then why does my RAM max out when I'm watching the tasks? I have GPU showing and it barely kicks in
@LetsTalkMusic2 2 months ago
@user-jn3xc9tv3j ok bro 😂
@TechGameDev 1 month ago
@user-jn3xc9tv3j no, the RAM
@manoharmeka999 1 month ago
What configuration is needed to run moderately large models? Is a base Mac Studio required to run 60% of those models?
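The back-and-forth above comes down to simple arithmetic: a model's memory footprint is roughly parameter count times bytes per weight, plus runtime overhead. A minimal sketch of that estimate (the 20% overhead factor for KV cache and buffers is a rough assumption, not an official figure):

```python
def model_memory_gb(params_billion: float, bits_per_weight: float = 4,
                    overhead: float = 1.2) -> float:
    """Rough RAM/VRAM needed to run an LLM: weight storage at the given
    quantization, padded ~20% for KV cache and runtime buffers."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

print(round(model_memory_gb(8), 1))      # 8B model, 4-bit quantization
print(round(model_memory_gb(8, 16), 1))  # same model at full fp16
```

An 8B model at 4-bit quantization lands near 5 GB, which is why 8 GB machines feel tight once the OS and a browser take their share; at fp16 the same model needs roughly 19 GB.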
@hikari1690 2 months ago
Hey Chuck, I've been downloading ollama for 2 days now. I think I overdosed on caffeine while waiting. And I only drank tea!
@bonniesitessolutions7728 23 days ago
tea has more caffeine than coffee!
@n.o.b.s.8458 2 months ago
Trying to flirt with a girl at Starbucks by telling her your laptop runs ChatGPT natively: "See? No internet connection! It still works!" Her: what "It works even though I'm not connected to WiFi!" Her: why aren't you connected? Don't you want to use the internet? "…" "This took 4 hours to set up properly, can you just say it's cool?"
@SaiGuy_ 2 months ago
Is this AI generated
@Picla_Peremohy 2 months ago
@SaiGuy_ No, the girl he was flirting with was
@ThatAverageMTBer 2 months ago
@Picla_Peremohy LMAO
@nomadshiba 2 months ago
it takes 5 mins, not 4 hours
@azhuransmx126 2 months ago
Flirting and computers NEVER mix with girls bru ☠️ 🗿
@loadedmode2885 1 month ago
your channel is the best on YouTube, I love you dude
@bot-bot 2 months ago
C'mon Chuck, that's old news. At least use Llama3
@zrizzy6958 2 months ago
it came out just when he released the first video lol (he even mentioned it)
@bot-bot 2 months ago
@zrizzy6958 only saw the short 😅
@zrizzy6958 2 months ago
@bot-bot guessed that lol
@Ozzy_Axil 2 months ago
man, try the uncensored versions
@DBCooper3 29 days ago
Could you elaborate
@Kwambomb23 2 months ago
pulling now let's gooooo, will definitely be checking out the full vid mentioned
@akashs4802 2 months ago
I miss the "8 GB is not enough" statement. Nowadays it's very common.
@AliHabib-Ali 2 days ago
That is fantastic, thank you my friend
@abrahamsimonramirez2933 1 month ago
remember to pull the quantized versions depending on the hardware and memory available
@nomadshiba 2 months ago
if you're using an atomic Linux like Fedora Silverblue, you can use Homebrew, or just put it in Docker or Podman.
@this_is_mac 2 months ago
Why did it start giving results before you finished asking lmao
@EngrUsmanx 2 months ago
She..... 😂
@bonniesitessolutions7728 23 days ago
because he probably has a $3,000 GPU, or many of them.
@hplaptop7747 2 months ago
This is where GPUs ACTUALLY MATTER. The integrated GPU is kinda slow, but a dedicated one is worth a shot. Also, 8GB of RAM minimum for Llama 3
@bradt5426 1 month ago
What's the ultimate spec for an Ollama AI server? I've seen NC's, but is that the best setup under 8k?
@mycelia_ow 24 days ago
VRAM or system RAM?
@hplaptop7747 23 days ago
@mycelia_ow SYSTEM RAM
@mycelia_ow 23 days ago
@hplaptop7747 don't LLMs use VRAM though, only using system RAM if VRAM is all used up? That's what happens when I run LLMs locally. It'll use all of my 24GB of VRAM, and then use 20GB+ of my system RAM. VRAM is magnitudes faster for inference, so there's a huge speed difference between the two RAM types.
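The spillover behaviour described above can be put into numbers: runtimes like Ollama (via llama.cpp) offload as many transformer layers as fit into VRAM and leave the rest on the CPU side. A rough sketch, where equal-sized layers and the 1.5 GB VRAM reserve for context and buffers are simplifying assumptions:

```python
def layers_on_gpu(total_layers: int, model_gb: float,
                  vram_gb: float, reserve_gb: float = 1.5) -> int:
    """How many layers fit in VRAM; the remainder spills to system RAM,
    which is far slower for inference (hence the big speed cliff)."""
    per_layer_gb = model_gb / total_layers
    usable_gb = max(vram_gb - reserve_gb, 0.0)
    return min(total_layers, int(usable_gb / per_layer_gb))
```

A ~5 GB 8B model fits entirely in 24 GB of VRAM, while a ~42 GB 70B model only gets roughly half of its layers on the GPU, matching the "all 24 GB of VRAM plus 20 GB+ of system RAM" behaviour described in the comment above.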
@RichardGrim 2 months ago
I got obsessed with this the other week but it's kinda hard, and I wanted to combine models but that didn't work. Make another how-to, but new
@siggeorn1856 2 months ago
So happy this channel exists!
@Kwambomb23 2 months ago
YESSSS thank you!
@soldiernumberx8921 2 months ago
Yeah, I know. Last week I integrated it into my BASH chatbot; very sweet. I recommend the model mistral 7b
@Its__Favi 2 months ago
Doesn't that need like 64 gigs of RAM?
@soldiernumberx8921 1 month ago
@Its__Favi no bro, for 7b you need 8GB of RAM if you use Linux; for Windows, well... you may be right.
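Wiring a local model into a chatbot, as described a few comments up, usually goes through Ollama's local HTTP API (it listens on port 11434 by default). A minimal single-shot sketch; the model name is just an example, and it assumes `ollama serve` is already running:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> dict:
    """Body for /api/generate; stream=False asks for a single JSON reply
    instead of a stream of chunks."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    data = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(OLLAMA_URL, data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# ask("mistral:7b", "Say hi in one word")  # needs a running local server
```

A shell chatbot can hit the same endpoint with `curl`, which is presumably how the BASH integration above works.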
@tawhidkhondakar731 2 months ago
That's amazing
@iteeruzz2006 1 month ago
The first thing should have been mentioning the minimum requirements for installation and a smooth experience... but here are the minimum requirements: an Intel/AMD CPU supporting AVX512, or DDR5, and 16GB of RAM. I think I'll stick to the online models that can still run on decade-old machines with 4GB of RAM and a proper connection speed, but great video as always
@TheDiamond872 2 months ago
Good to see AI being decentralized. Obviously your computation power will depend on, well, your computation power xD
@darkblaze40 2 months ago
Who says AI nerds don't pull?
@marcochaves9543 1 month ago
You rock!
@AB-cd5gd 2 months ago
Anyone know a solution so I can run AI and create an API for it? Like what kind of hosting do I need, and do you have a cheap one?
@taylormann1038 2 months ago
Found this video a lil over a month ago... Installed WSL on my Windows 11 Lenovo and then installed Ollama. I have a legal chatbot as well as a dolphin unsecured llama... it's pretty neat. I can only run with my CPU... a lil slow but cool 😎
@TectonicTechnomancer 2 months ago
you don't need WSL for this one, it does have a Windows version
@taylormann1038 2 months ago
@TectonicTechnomancer thanks, idk
@freddyhardware840 1 month ago
Now I'm gonna have an argument with my laptop as well :-D
@MahmoudSabry-wr2im 1 month ago
How about data analysis capabilities? Do you recommend different open-source models?
@thebikecrafter8128 2 months ago
I am following your tutorial in the complete video, it's great, but my AMD Radeon RX6550M is not one of the supported GPUs
@Finbar_Monroe 1 month ago
WOW! NetworkChuck keeps getting better and better
@tigerscott2966 1 month ago
why would I do that when I can run better artificial intelligence with a Linux operating system and it will all be LOCAL too.
@Finbar_Monroe 1 month ago
@tigerscott2966 Because, IF you have a smart mind, then you would have followed NWC and his fanbase (as well as me) in doing this. What's the point of commenting if you don't understand his videos?? I wasn't being rude BTW
@tigerscott2966 1 month ago
@Finbar_Monroe But I don't follow anyone else or use social media, period. I just watch the video, make a comment and keep moving. If seasoned tech guys want to go gaga over artificial intelligence, that's their business. I am used to being insulted by tech guys. It's the same story: a little knowledge, a few degrees or certifications and a nice salary always leads to the inflated ego and the azz on the shoulder.
@Finbar_Monroe 1 month ago
@tigerscott2966 well don't watch then if you don't know. Common sense, mate (something you're lacking!!)
@tigerscott2966 1 month ago
@Finbar_Monroe Only a chump would insult a stranger online just because his opinion is different from yours. Dallas is not that big. WE can settle this like gentlemen. After I kick your tail, I can post photos of you begging for coffee!
@skata100 2 months ago
How am I gonna generate images on my laptop, which has no GPU?
@TLOZ1986 2 months ago
I use ollama with llama3
@user-vg5fw2vq7y 1 month ago
I need assistance with setting the host for the Ollama app in Droidify on my Android.
@QuickTechNow 2 months ago
Only 1 to 6B models run well. I have a 16GB RAM model.
@gamereditor59ner22 2 months ago
Thank you! Can I change the name?🤔
@airbus5717 1 month ago
The problem with ollama's default models is that it downloads a lower-quality quantized model
@jayeshsawant9483 17 hours ago
I am trying to run llama3.1:8B on my laptop. I want to save internal space and run this model from an external drive. How can I go about this?
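On the external-drive question above: Ollama reads the `OLLAMA_MODELS` environment variable for its model store location, so pointing it at a folder on the external drive before starting the server moves new pulls there. A sketch, where the mount path is a placeholder for wherever the drive is actually mounted:

```python
import os
import subprocess

# Placeholder path: substitute your external drive's mount point.
MODEL_DIR = "/mnt/external/ollama-models"

# Copy the current environment and redirect Ollama's model store.
env = dict(os.environ, OLLAMA_MODELS=MODEL_DIR)

# Launch the server with the overridden store (commented out so this
# sketch stays runnable without Ollama installed):
# subprocess.Popen(["ollama", "serve"], env=env)
```

Equivalently, from a shell: `OLLAMA_MODELS=/mnt/external/ollama-models ollama serve`. Existing models would need to be re-pulled or moved into the new directory.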
@WeDevin 2 months ago
Hopefully y'all are running 64 GB of RAM.
@natanbk5 1 month ago
They forgot bilandn
@NathanLundmark 2 months ago
nice
@Kyler2K 14 days ago
i did this months ago & now it's having timeout errors when pulling the manifest. any helpers?
@JustAGuyYT1 28 days ago
Got this error: "error llama runner process has terminated signal killed". How do I fix it? I am on a laptop with Linux Mint btw
@bonniesitessolutions7728 23 days ago
ask chatgpt
@sargismartirosyan9946 2 months ago
But they don't run on very low CPU and RAM. Please make a video on how to install other private LLMs that can work on low CPU and RAM plsssss 😊
@MpaYn 2 months ago
use wizard vicuna uncensored
@bikashpandeya69 2 months ago
Can I use this in my app??
@harshalkukade8664 1 month ago
Is image/ppt processing possible?
@RealmoftheBlackShadow 2 months ago
No Internet required?
@optomecanic5320 20 days ago
We download it in the Windows terminal 🤔
@MiniFrenzy970 2 months ago
Already have it lol
@laurojimenez9541 2 months ago
Hey Chuck, what do you think about Microsoft's new feature called Recall that will record everything you do on your computer?
@BtaraDev 1 month ago
Sounds cool. But we all know how it will end.
@phdtvproductions 2 months ago
What about Suno AI!?
@gbmillergb 2 months ago
chatGPT just grabs stuff off the internet.
@AyushPokhariya-bz4ux 2 months ago
Nope, there are data servers from which chatgpt gets its info. And when online, it's connected to those data servers.
@godnessy 1 month ago
You are giving good info, but why say things like "like chatgpt"? You cannot run GPT locally.
@user-hp6qb6ds9w 2 months ago
😎❤😊
@BamBam_101 1 month ago
I CAN MAKE TERMINATOR
@MuhammadRafay-we9nc 2 months ago
After downloading it, my Nokia mobile hanged.
@ricky4898 28 days ago
32 GB RAM & RTX 2070, llama 8 cranks the heat for barely anything. You'll need some serious hardware! 😅
@ΚΟΒΙ 2 months ago
Btw, runs badly on M1, better on an RTX 3060
@cheezydragon6908 2 months ago
my gpu doesn't support ollama 😭
@Macatho 1 month ago
Yeah, but no... To quantize the model down so you can run it with a decent tokens/s speed... it's going to be pretty bad.
@Spunkbubble1638 2 months ago
Fuck me, you are great at telling us about things… that have existed for a LONG LONG TIME
@keerthes 2 months ago
Thanks, it's ollama, not MS Recall
@itsyourboii6153 2 months ago
U can already do this
@nomadshiba 2 months ago
what's your point?
@Dr.Mikasa_A536 1 month ago
Can a 4GB laptop run it?
@offtothenextadventure 2 months ago
What are the disadvantages of this? Security issues?
@DaSoulWizard 2 months ago
Crap AI. Llama was way worse than GPT 3.0
@offtothenextadventure 2 months ago
@DaSoulWizard really? Is it because the data is limited?
@bizarreposeart8916 2 months ago
You need a powerful GPU (or it will be slow as hell)
@rahulmistry5019 2 months ago
don't want to generate traffic for them; they'll make free money.
@bradley5008 2 months ago
I tried using it and it flat-out refused to answer anything I said. By anything I mean everything from the Bible to the climate to Atlas Shrugged
@brettu534 2 months ago
Did you say you learned how from your own video?
@fluffy280 2 months ago
Useless, its generation is very slow without more GPU power
@Matrixred11 2 months ago
It's slow
@shapelessed 2 months ago
It isn't. You just have too little VRAM, which means the model goes to your system RAM, which is awfully slow (and most likely too small, so it also goes into swap)
@AmjadHussain-ge6eq 2 months ago
Hi, I want to learn to hack any phone, with the person's permission
@jayeshrajput9984 3 days ago
This is so stupid to come watch and try... Rather come watch and forget... 😂 Chuck is just an actor, not an expert... 😂
@SriBathran 2 months ago
Does it need a server? What is a GPU?
@shapelessed 2 months ago
Yes, it does. It's called a computer. Not sure, never heard of graphics processing units before in my life.
@youngcitybandit 2 months ago
A server is a computer. Your laptop is effectively a server if you want it to be. If you mean you need to connect to a server, no, because it's able to run offline
@Americaforthemoney 1 month ago
Bro, it's so lame. Like, you need a really good laptop for it; it can take forever
@danvasii9884 2 months ago
can this be done on Windows too?
@ziemniaczki 2 months ago
yes
@taylormann1038 2 months ago
You have to set up WSL and run it in Linux... He has a video on his page... I set mine up about a month ago with my Windows 11 Lenovo...
@danvasii9884 2 months ago
@taylormann1038 Thanks
@impostorsyndrome1350 2 months ago
It can't do NSFW images, so it's sadly useless
@renovacio5847 2 months ago
Like ChatGPT?.. you mean the 175B-parameter one? XD.. maybe a 2B gemma
@rhbrolotek 2 months ago
What do you mean? The 175B which one?
@renovacio5847 2 months ago
@rhbrolotek The ChatGPT which has 175B parameters. Which other model has 175B parameters?
@Crush_Lazy 2 months ago
Can I download it on my phone!?
@nomadshiba 2 months ago
you can connect to your web UI using your phone, but no, you can't run it on your phone
@Infinite_X 2 months ago
Is it possible to run an Ollama model locally in Python??
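To the Python question above: yes. Besides the `ollama` pip package (a thin client for the local server), plain stdlib HTTP against the `/api/chat` endpoint works. A sketch that keeps conversation history across turns; it assumes a local `ollama serve` is running and uses `llama3` as an example model name:

```python
import json
import urllib.request

CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's local chat endpoint

def user_turn(history: list, text: str) -> list:
    """Append a user message in the role/content shape the API expects."""
    history.append({"role": "user", "content": text})
    return history

def send(history: list, model: str = "llama3") -> str:
    """POST the whole conversation, append the assistant's reply to the
    history, and return the reply text."""
    body = json.dumps({"model": model, "messages": history,
                       "stream": False}).encode()
    req = urllib.request.Request(CHAT_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        message = json.loads(resp.read())["message"]
        history.append(message)
        return message["content"]

history = user_turn([], "Why is the sky blue?")
# send(history)  # needs the local server running, so left commented here
```

Keeping the full `messages` list and resending it each turn is what gives the model conversational memory; the server itself is stateless between requests.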
@SsjHokage 2 months ago
It's going to be shit without a quality AI model it's learned from. You're just going to get a lot of "noise" responses.
@starfoxBR77 2 months ago
And without internet, what will be the knowledge base!?
@AdarshSingh-mo1kc 2 months ago
"No such host" error, don't know why. I use Fedora btw
@garryroach2390 1 month ago
Easy way to kill ur CPU
@serraphimoon 2 months ago
But less accurate, because our laptops cannot handle computation that big
@AB-cd5gd 2 months ago
Do you know any server or provider, not too expensive, to run AI correctly?
@king09426 2 months ago
Why tf would anyone want to do that.
@badbeastclub6460 2 months ago
Cuz why tf not
@jonjayb 2 months ago
So you're not giving OpenAI your personal info, or if you're a business, your company info.
@hepatitisjay 2 months ago
To role-play my erotic Sasquatch fantasies with someone who won't judge me.
@croniumInc 2 months ago
to run a model privately, maybe?
@shiriajin 2 months ago
Obama
@Outdoorstuff5446 2 months ago
Ugh, go away
@yolow8126 2 months ago
Don't have a GPU. 😢
@whitecastlept 2 months ago
Yeah, that video is BS because you need to have port 8080 free to run Open WebUI
@user-te5pu9jq7g 2 months ago
So? Just disable whatever service is running on 8080 and set up a new webserver for this
@TectonicTechnomancer 2 months ago
don't use a web UI then, ollama can be used from the terminal.
@goodsoul6675 2 months ago
Your dubbing is off-sync.
@emilwozniak6015 2 months ago
It is reasonable only if you have cheap energy and a strong device. Otherwise, you will pay a huge bill and get slow responses.
@Cho_osen 2 months ago
@TectonicTechnomancer excuse me? it uses a $hit ton of RAM/VRAM.
@rhbrolotek 2 months ago
If it influences your bill, I think you talk to it too much :)
@emilwozniak6015 2 months ago
@rhbrolotek read the fuckin# specs dude, this sh#t consumes a lot of power for a small task.
@stebe12 2 months ago
@TectonicTechnomancer tell us you don't understand without telling us, bro.
@vnm_8945 2 months ago
is this a shit ad?
@nikhilyadav8414 2 months ago
Do they have ChatGPT?
@vendetta.02 2 months ago
No, cus ChatGPT isn't open source.
@nikhilyadav8414 2 months ago
@vendetta.02 so which one is the closest to ChatGPT?
@Liam-10 2 months ago
Is it only for laptops, or PCs too?
@drfassy 2 months ago
It doesn't work for Windows
@PirateSimulator 2 months ago
"No internet required" BRO HOW DO I DOWNLOAD THAN
@shapelessed 2 months ago
then*
@TectonicTechnomancer 2 months ago
this is why shampoo has instructions.
@croniumInc 2 months ago
he means to use it lol
@Pahrump 2 months ago
Is it wokified?