Trying to flirt with a girl at Starbucks by telling her your laptop runs ChatGPT natively: “See? No internet connection! It still works!” Her: what “It works even though I’m not connected to WiFi!” Her: why aren’t you connected? Don’t you want to use the internet? “…” “This took 4 hours to set up properly can you just say it’s cool?”
@@hplaptop7747 don't LLMs use VRAM though, only using system RAM once VRAM is all used up? That's what happens when I run LLMs locally: it'll use all of my 24GB of VRAM and then 20GB+ of my system RAM. VRAM is orders of magnitude faster for inference, so there's a huge speed difference between the two memory types.
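The spill-over this comment describes can be sketched with a rough back-of-the-envelope model. The bandwidth figures below are illustrative assumptions (roughly a modern GPU vs. dual-channel DDR), not measurements, and the "read every weight once per token" rule is a simplification of memory-bound inference:

```python
# Rough sketch of where a model's weights land when VRAM runs out,
# and why the slow pool dominates generation speed.
# All sizes and bandwidths are illustrative assumptions, not measured values.

def split_model(model_gb, vram_gb):
    """Return (GB held in VRAM, GB spilled to system RAM)."""
    in_vram = min(model_gb, vram_gb)
    return in_vram, model_gb - in_vram

def rough_tokens_per_sec(model_gb, vram_gb, vram_bw_gbs=900.0, ram_bw_gbs=60.0):
    """Memory-bound estimate: each generated token reads every weight once,
    so time per token is the sum of read times from each memory pool."""
    in_vram, in_ram = split_model(model_gb, vram_gb)
    seconds_per_token = in_vram / vram_bw_gbs + in_ram / ram_bw_gbs
    return 1.0 / seconds_per_token

# A 45 GB model on a 24 GB card: 21 GB spills to system RAM.
print(split_model(45, 24))  # (24, 21)

# Even though less than half the model is in slow RAM, that half
# accounts for most of the per-token time under these assumptions.
print(round(rough_tokens_per_sec(45, 24), 2))
print(round(rough_tokens_per_sec(20, 24), 2))  # fits entirely in VRAM
```

Under these assumed bandwidths, a model that fits entirely in VRAM comes out more than an order of magnitude faster than one that spills — which matches the speed cliff people see when a local model no longer fits on the card.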
The first thing should have been mentioning the minimum requirements for installation and a smooth experience... but here are the minimum requirements: an Intel/AMD CPU supporting AVX512, or DDR5, and 16GB of RAM. I think I will stick with the online models, which still run on decade-old machines with 4GB of RAM and a decent connection speed. But great video as always.
Found this video a little over a month ago... Installed WSL on my Windows 11 Lenovo and then installed Ollama. I have a legal chatbot as well as an uncensored Dolphin Llama... it's pretty neat. I can only run it on my CPU, so it's a little slow, but cool 😎
@@tigerscott2966 Because... if you have a smart mind, then you would have followed NWC and his fanbase (as well as me) in doing this. What's the point of commenting if you don't understand his videos?? I wasn't being rude BTW
@Finbar_Monroe But I don't follow anyone else or use social media, period. I just watch a video, make a comment, and keep moving. If seasoned tech guys want to go gaga over artificial intelligence, that's their business. I am used to being insulted by tech guys. It's the same story: a little knowledge, a few degrees or certifications, and a nice salary always lead to an inflated ego and a chip on the shoulder.
@@Finbar_Monroe Only a chump would insult a stranger online just because his opinion is different from yours. Dallas is not that big... we can settle this like gentlemen. After I kick your tail, I can post photos of you begging for coffee!
It isn't. You just have too little VRAM, which means the model spills into your system RAM, which is awfully slow (and most likely too small as well, so it also spills into swap).
A server is a computer. Your laptop is effectively a server if you want it to be. If you mean you need to connect to a remote server: no, because it's able to run offline.