So… you basically divide the workload across multiple threads. It's like, instead of one person counting stones on the ground, you divide them up and get multiple people to count them. Am I right?
TrisT That's mostly how it works. It's more like sorting the stones by their color and pattern and counting each variety. The CPU way, you would count each variety separately; if you have 100 different colors and patterns, that would take a long time (even if you could count extremely accurately and fast, which is how the CPU makes up for its lack of parallelism). The GPU way lets many people count them: given 100 people (like the GPU), each person counts one variety, and they all count at the same time.
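The counting analogy above maps onto a real GPU pattern: a parallel histogram, where every thread inspects one item and `atomicAdd` keeps the concurrent tallies correct. A minimal CUDA sketch (the name `countByColor` and the data setup are made up for illustration; requires a CUDA-capable GPU and a toolkit with managed memory):

```cuda
#include <cstdio>

#define NUM_COLORS 100

// Each "stone" is an item with a color index 0..NUM_COLORS-1.
// Every GPU thread examines one stone and bumps the tally for
// that stone's color; atomicAdd keeps concurrent updates safe.
__global__ void countByColor(const int *colors, int n, int *counts)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        atomicAdd(&counts[colors[i]], 1);
}

int main()
{
    const int n = NUM_COLORS * 1000;   // 100,000 stones, 1000 of each color
    int *colors, *counts;
    cudaMallocManaged(&colors, n * sizeof(int));
    cudaMallocManaged(&counts, NUM_COLORS * sizeof(int));
    for (int i = 0; i < n; ++i) colors[i] = i % NUM_COLORS;
    cudaMemset(counts, 0, NUM_COLORS * sizeof(int));

    // Enough 256-thread blocks to give every stone its own thread.
    countByColor<<<(n + 255) / 256, 256>>>(colors, n, counts);
    cudaDeviceSynchronize();

    printf("color 0 count: %d\n", counts[0]);  // 1000 with this setup
    cudaFree(colors);
    cudaFree(counts);
    return 0;
}
```

On the CPU you would walk all 100,000 stones in one loop; here, every stone is examined by its own thread in parallel, which is the point of the analogy.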
worldlinerai So, if I use CUDA for, say, rendering, I could leave the heavy lifting to it while the processor does something else, maybe even another render. Right?
Technically, yes. However, CUDA isn't designed to give you an extra processor to use; it just gives you the option of using a different type of processor to do your work. GPUs have lots of processing cores (100-1000+), which helps a lot with rendering: each core can process 1 pixel, allowing 100+ pixels to be processed at once. CPUs have a small number of cores (2-18 in the Xeons), so only 2-18 pixels can be processed at once. Hyper-Threading can double that number, but 36 is still small compared to 100+. What makes the CPU better than the GPU is that each core is clocked at a faster speed and has many built-in instructions like SSE, allowing data to be processed faster. That's a tremendous benefit to programs that only run on 1 core. In rendering, where multiple cores can be used, the CPU would need to process pixels about 5+x faster to match the GPU's performance.
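The "one core per pixel" idea above is pretty much how CUDA image kernels are written, with the caveat that the scheduler maps many threads onto each physical core rather than a strict one-to-one pairing. A hedged sketch of a per-pixel brightness kernel (`brighten` is an illustrative name, not a real API):

```cuda
// One thread per pixel: a 2D grid of threads tiles the image,
// and each thread adjusts the brightness of exactly one pixel.
__global__ void brighten(unsigned char *img, int width, int height, int delta)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < width && y < height) {              // guard against grid overshoot
        int v = img[y * width + x] + delta;
        img[y * width + x] = v > 255 ? 255 : (v < 0 ? 0 : v);  // clamp to 0..255
    }
}

// Typical launch: 16x16 threads per block, enough blocks to cover the image.
// dim3 block(16, 16);
// dim3 grid((width + 15) / 16, (height + 15) / 16);
// brighten<<<grid, block>>>(d_img, width, height, 40);
```

A CPU version would visit pixels in a double loop; here the loop disappears and the grid dimensions take its place.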
So is CUDA essentially a sort of API that allows a user to more efficiently communicate with NVIDIA cards? Or is it also an actual hardware modification, where CUDA could not be physically possible on previous cards?
@abhishekrai48 CUDA also comes in the form of an API (i.e. using NVIDIA's CUDA library in C) to offload parallel computation tasks to the GPU - but yes, it's both: the API is the software side, but the GPU must be CUDA-compatible (have CUDA cores) to take advantage of it.
Ref comments about CPU versus GPU (parallel) processing: I recall reading an article in Scientific American (back in the '80s or '90s) showing that parallel processing outruns CPU-style processing hands down... Laurie
I wanna record my desktop/games/etc. I have these 3 video-encoding options and I don't know much about them. Intel Quick Sync Video sounds pretty self-explanatory (making quick videos?). I don't know which one is faster for recording videos. Help? 1. Intel Quick Sync Video 2. NVIDIA CUDA 3. NVIDIA NVENC. Which one should I use?
Wow, we are in 2011 and NVIDIA decides to show us how and what CUDA is using pen and paper. I would have been more interested in a visual demonstration, especially CUDA at work compared to a CPU, and not just stick figures and box shapes.
I'm just an avid enthusiast with no need for actual programming. I just use my PC for daily tasks like a typical hobbyist: gaming, encoding, rendering, and converting files. How do I get use out of CUDA? More specifically, how do I turn it on and tell my computer to use CUDA instead of the CPU?
CUDA is only useful for CUDA-optimized programs (since each iteration of a loop needs to be explicitly split into a different thread). Some rendering software can render using CUDA instead of just the CPU... You won't be able to use CUDA instead of your CPU all the time unless you have an OS designed for it (although those kinds of operating systems usually have only a single use, such as computing weather simulations). Anyway, that's how I understand it... I could be wrong.
Thumbs up if you didn't have a clue what the hell he was talking about from 3:03 onwards. All that matters to me is that CUDA rocks for Folding@home!!
That was pretty easy to see what he was getting at. The CPU process steps down the list of data; the GPU process takes the whole row at once and moves across...?? Back to the supermarket checkout idea: CPU means all the customers go through one checkout; GPU means they spread out over several checkouts at once...??????? Regards, Laurie
My CUDA sample can't run, maybe because the CUDA version is not compatible with my GPU type. The official documentation did not specify the COMPUTE CAPABILITY of each GPU. When I look at the GPU settings on my PC, and even Google it, I can't find this information either. Now I need to install every version of CUDA to test. It takes time.
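For what it's worth, you shouldn't have to install every toolkit to find this out: the CUDA runtime API can report each GPU's compute capability directly via `cudaGetDeviceProperties` (the bundled `deviceQuery` sample does essentially this). A small sketch that any recent toolkit should compile:

```cuda
#include <cstdio>

// Print the compute capability (major.minor) of every CUDA device,
// so it can be matched against a toolkit's supported range.
int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        printf("Device %d: %s, compute capability %d.%d\n",
               d, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```

NVIDIA also maintains a "CUDA GPUs" page on its developer site listing the compute capability of each GPU model, which may be quicker than querying it programmatically.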
OK, so is this saying that AddFunction(a, b), once defined, tells CUDA to add the vectors at the same instant, using all available cores, with one i processed on each core?
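Roughly, yes, with one caveat: the launch creates one thread per element i, and the hardware schedules those threads across the available cores (many threads can share a core, so it isn't strictly "one i per core"). A typical vector-add sketch, reusing the comment's AddFunction name for the kernel (this is an illustration, not code from the video):

```cuda
#include <cstdio>

// One thread per element: thread i computes c[i] = a[i] + b[i].
__global__ void AddFunction(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                    // guard: the grid may overshoot n
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1024;
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = i; b[i] = 2.0f * i; }

    // 4 blocks of 256 threads = 1024 threads, one per element.
    AddFunction<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[10] = %.1f\n", c[10]);  // 10 + 20 = 30.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

This is also what the earlier comment meant by "each run of a loop is split into a different thread": the CPU for-loop over i becomes the grid of threads, each handling one i.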
To my primitive level of understanding, the CPU is deep but narrow, and the GPU is broad but shallow. What's better for future computing: broadening the CPU with more threads while maintaining its depth in each individual thread, or deepening the GPU's computations to be CPU-like while maintaining its massively parallel cores?
@youssef0eddoumali There's a major difference between, say, the GTX 590 and the Teslas. The Teslas are built specifically for scientific work, so video editing and gaming would gain nothing from those cards. What the Teslas can achieve for scientific computing, as well as 3D modelling, is also why they are very expensive. But most of us don't need those cards.
Thanks... it was interesting to see an explanation of changing from a CPU type of process to parallelism. From my amateur point of view, I liken parallel processing to the checkout counters in a grocery shop, or a big-store checkout...?? ...Laurie
Guys, I have 2 GTX 590s, and whenever I go into the NVIDIA Control Panel and turn SLI on, my screen goes black and stays black. Could you please thumbs this up so NVIDIA can hopefully see this and help me out?
What he's doing on the board is the exact reason I didn't like school, simply because my dyslexia made my brain switch off from the info that needed to be read, especially when, after they've written all that garble on the board, they actually expect the people reading it to instantly know what the hell it all means.
Incredible. This production - from NVIDIA no less - purports to answer the question "what is CUDA" but fails to even define the acronym. A perfect example illustrating why normal humans believe engineers are incapable of straightforward communication.
@youssef0eddoumali I love CUDA itself, but nothing this guy showed or even tried to explain allowed me to understand anything he said in the video. Which is a shame.
Gert VAN DER PAELT Once we're at the point where AI interprets our intent by reading our notes, it could code much better software than us, much faster, based on our concepts.
@metoxys Well, you really can't compare other GPU makers to NVIDIA here, simply because you mentioned CUDA, and the other GPUs don't have CUDA on them, which for many reasons renders the question and comment useless. However, the 590's price is justified, sadly lol, because of the two cards in one and the 1024 cores (which hardly any software can actually use at this time).
Haha :D Now that would be a peculiar programming language! He was talking about Fortran, an old programming language used for high-performance computing.
@youssef0eddoumali Well, dyslexia, ADD, and ADHD are similar in this respect, so I understand where you are coming from. I'm just shocked that it's all this garble, and not another version with visuals showing CPU vs. CUDA like before, etc. Whoever thought up this video needs to understand that not everybody in life finds it easy to understand stuff presented in this fashion.
Hence the reason I got a 9800GT for its 64 cores. 16 cores on the 8400GS was slooow. If you want some tutorials, Google this: dr dobbs cuda. There are many articles right now, and it falls under HPC. Problems like the Travelling Salesman can be done quicker.
@nvidia lol, seriously, you give that response to the pen-and-paper comment, yet completely ignore the comment about dyslexia and people not having a grand ole time watching this because of how it's shown. I do wonder at times how intelligent intelligent people are, because it seems some blatantly obvious things in life escape some "intelligent" people, and I definitely count NVIDIA as part of that.