
Intel's GPU is not what you think 

Coreteks
Subscribers: 135K
Views: 405K

Sources and Credits:
Intel YouTube channel
Intel Newsroom
Ingebor on reddit
Soon Aun Liaw - IBM St. King Datacenter Render
NVidia YouTube channel
AMD YouTube channel
Intel twitter
Videocardz.com (9th gen leaks)
Additional market share stat: www.statista.com/statistics/7...
All footage rights belong to the respective copyright holders; used for educational purposes. If your content is not credited, please contact me directly.
#intel #arcticsound #intelgraphicscard

Science

Published: 26 Jun 2024

Comments: 1.1K
@Coreteks · 5 years ago
Consider supporting me on Patreon if you like this series of videos; I have more planned for the next few days. Thanks for watching!! www.patreon.com/coreteks
@JohnGeorgeBauerBuis · 5 years ago
Coreteks, I just did. =)
@Coreteks · 5 years ago
John, many thanks! I just set it up today so haven't put any rewards up but I'm cooking something :)
@grischu8277 · 5 years ago
I'll support you when I have a credit card. Tbh, it's more likely for me to have a PayPal account, but whatever. If you continue making videos like these, I will have to support you on Patreon. You definitely deserve it!
@Tjecktjeck · 5 years ago
Coreteks, when was the day when Intel failed in gaming? The situation is actually vice versa - Intel outperforms AMD mainly in gaming.
@Coreteks · 5 years ago
They haven't; their products are actually really good, just crazy expensive. They haven't failed gaming but are failing gamers... Notice how they are segmenting their CPUs. "Want hyperthreading? Well, spend an extra $50 and we'll give it to you!" "Want to upgrade from Skylake or Kaby Lake? Buy a great new motherboard even though the new CPUs could use the old ones"...
@yarox3632 · 5 years ago
📂 NVIDIA
 └📁 RTX 2018
   └📁 Gaming Value
     └⚠️ *This folder is empty*
@igorthelight · 5 years ago
rd NVIDIA/RTX 2080/Gaming Value
@bobbavet · 5 years ago
AMD/Vega64/4k60fps/
@AbkratzLp · 5 years ago
@@bobbavet Not true.
@TheDoh007 · 5 years ago
@Y man this type of comment can be seen everywhere
@yarox3632 · 5 years ago
@TheDoh007 Wow that feels good to hear, never expected to go this far!
@rikschaaf · 5 years ago
0:54 "Can you think of a current technological titan in a similar position? A company that hasn't innovated for years, while its main competitors slowly but steadily steal their market share? A company that has become so complacent, they show very little interest in what their customers actually want?" Eh, Apple?
@beastroll8777 · 5 years ago
Lol thinking the same thing
@ComputerCraftr · 5 years ago
Apple stopped innovating after Steve Jobs died, and they've been transitioning into a luxury Veblen-good producer ever since. The iPhone is like the Rolls-Royce of cell phones today; people keep buying them for the social status rather than innovation now.
@lightbox8019 · 5 years ago
Jeez you guys are dense, Apple is gaining market share and value. It innovates in so many different areas, but you only look at the surface. Remind me who is pioneering 7nm chips right now?
@KentSpain85 · 5 years ago
Sorry, what innovations in the last 5 generations of devices have revolutionized the market? As far as I can see, *good facial recognition hardware* is the only thing in all these years since Mr. Jobs rescued and transformed that company. Ever since, it's been resting on its laurels and hiking up prices like no other company. In fact, due to Apple doing that, devices across the board cost more. Samsung? Google? Yet salaries haven't increased much, but over 5 years the cost of unlocked, new flagship devices has doubled :P
@GundamReviver · 5 years ago
My exact first thought, lol.
@andersonandrighi4539 · 5 years ago
AMD needed that win much more than Intel. After Bulldozer, AMD was about to be shut down. The success of Ryzen is a second wind for a company that has massive debt.
@paianis · 5 years ago
AMD still pays Intel to make x86 chips though.
@paianis · 5 years ago
Not as much.
@creaturedanaaaaa · 5 years ago
Also for some reason, the US hasn't bailed out AMD at all despite the fact that their collapse would give Intel a complete monopoly over personal computer processors.
@fenrisgaming8982 · 5 years ago
@Skye Neither Intel nor AMD is paying royalties to the other; they both settled back in 2009 on a royalty-free deal in which AMD can freely use x86, and Intel can freely use x86-64
@itsmetheherpes1750 · 5 years ago
OK, so are you saying that if I wanna buy a laptop now, Ryzen is as good as Intel? What Ryzen should I take to be as performant as an i7-8850H?
@FreekaPista · 5 years ago
This video seems too "doom and gloom". Intel's $16B budget was primarily allocated to R&D, so dedicating a full $1B to manufacturing should have a major impact. AMD is approaching GPUs in a very similar manner to Intel, as it is focusing on compute performance, and, as you mentioned in your last video, the only way that approach carries over to gaming is through ray tracing capabilities. Ultimately, gamers need to recognize that GPUs have seen a huge increase in uses beyond graphics processing. And most of these other uses, including mining, simulation, and data management, offer a much larger market than PC gaming. For the foreseeable future, I'd expect gaming performance to be a by-product of designs intended for other uses. I think AMD can capitalize on this trend by building higher-performance APUs, and perhaps build dedicated boards with GDDR5/6 built directly into the board. Game consoles have been using this technology for the past couple of generations, and with products like the Subor Z+ coming out, we've seen it applied to gaming PCs as well. Great video by the way - I love your approach of tying a historical case (IBM's failure in the 90s) to Intel's current position. Very few tech channels pull on as many sources of information as you do - and none produce content as polished as yours.
@Coreteks · 5 years ago
Thanks for the kind words, and you are right on the money regarding APUs, more on that later :)
@snetmotnosrorb3946 · 5 years ago
"Doom and gloom"? I didn't perceive it as "they will die", just that they'll lose out in the "regular" computer business. On top of that, ARM is catching up and becoming more and more powerful for more and more applications. And who knows what RISC-V can bring. Intel has a lot of businesses going on though, so even if they lost everything x86 tomorrow they would still have a lot of stuff to lean on.
@RambleStorm · 5 years ago
Luckily Lisa Su pointed out that AMD will have more specialized hardware, which can be a hint that Navi will be a more gaming-oriented architecture, and I'm inclined to believe them since both Sony and Microsoft are important clients to them.
@Austin1990 · 5 years ago
Talking about other uses, I expect to see Nvidia's hardware-accelerated ray tracing soon be used for GPU-based render farms for animation studios. All we need is some major render engines to implement it. There is no way that this will stop at games; the benefit to CGI in general is too large.
@SimilakChild · 5 years ago
Just want to point out, ray tracing / ray casting was a technique John Carmack used in the early 90s. NVIDIA's RTX graphics card line is nothing but a cash grab and delivers barely any significant performance boost over the 1080 Ti. And while mostly everyone says "stay away" from every type of Intel HD Graphics, I think the graphics performance on 7th-gen and 8th-gen CPUs is very good
@introvertplays6162 · 5 years ago
As bitter as this might sound...I think all companies are working on making cloud gaming the future, as a byproduct...and I don't want that!
@user-wx1zo9ef3f · 5 years ago
In the future, game streaming is coming; that means your game will be rendered in server GPU farms and sent back to your screen. For this, a lower network delay is needed.
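(A rough sketch of the latency budget behind that concern - all numbers below are assumed, illustrative placeholders, not measurements.)

    # Python sketch with assumed, illustrative numbers: a frame-latency budget
    # for cloud game streaming vs. a local rig.
    def total_latency_ms(input_ms=5, network_rtt_ms=30, render_ms=16.7,
                         encode_ms=5, decode_ms=5, display_ms=8):
        # Sum of the stages between a button press and the pixel changing on screen.
        return input_ms + network_rtt_ms + render_ms + encode_ms + decode_ms + display_ms

    print(f"Streamed: {total_latency_ms():.1f} ms")  # ~69.7 ms with these guesses
    print(f"Local:    {total_latency_ms(network_rtt_ms=0, encode_ms=0, decode_ms=0):.1f} ms")  # ~29.7 ms

The render time alone is one 60 fps frame (16.7 ms); everything the network and codec add comes on top, which is why the round trip has to shrink before streaming feels local.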
@introvertplays6162 · 5 years ago
Βασίλειος Μπεσλεμές I think we know how this works... but the problem is that I don't want it to be the norm... I always want to have my own rig. I don't want to rent space on which my savegames, etc. are stored. And I always want to be able to mod my games... I don't want a dull "streaming consoley" future.
@funcleandwuncle · 5 years ago
I second that; may as well buy a console for the standard experience
@rodryguezzz · 5 years ago
They are doing that. In the future, games will work like Netflix: we pay a monthly or annual fee and games are streamed from the cloud. All the big companies are already working on it. We already have subscription services like EA Access and Xbox Game Pass, a streaming service (PS Now), and there are countless news stories and rumors about companies working on other streaming services. NVidia is testing GeForce Now, there are rumors about Microsoft's cheaper next-gen Xbox working as a streaming machine, EA already talked about their own streaming service...
@WorldKeepsSpinnin · 5 years ago
lol fuck that tbh
@mr_beezlebub3985 · 5 years ago
Makes sense. PC gamers, while numbering in the millions, are quite a small market when compared to server, data center, or other enterprise customers, and a lot of companies are interested in general computing on graphics processors
@perttierajorma8884 · 5 years ago
Ban consoles and almost everyone is a PC gamer
@mr_beezlebub3985 · 5 years ago
perttierajorma 888 That is the case in a country like China. Or in Brazil, where importing a console inflates the cost so high that it is easier to just get a low-end PC that was made in Brazil and play games on that
@SianaGearz · 5 years ago
Consumer electronics, including mainstream computer components, is a low-margin market. Server is high-margin because you can charge extra for reliability guarantees. As for high-performance computing, you currently have two solutions: NVidia GPUs and Xeon Phi. The latter is not a CPU; it's a few dozen general-purpose, low-power, x86-compatible processors on a PCI Express card. Fundamentally it stands to be a bit more efficient, and it can already outrun GPUs on tasks that involve branching. I'm still puzzled by how they want to unify the offerings. Xeon Phi stems from Larrabee; maybe the new GPU will be Xeon Phi based. In this sense it would be perfectly unified with CPUs, as the same software can run on either.
@zamundaaa776 · 5 years ago
+Siana Gearz Umm, just sayin': AMD's GPUs are the big compute beasts...
@SianaGearz · 5 years ago
They are remarkable, but I haven't seen the AMD FirePro in HPC use personally yet; not sure what's the deal with that. I know they have equipped one supercomputer each in 2014, 2015 and 2016 or so.
@RepsUp100 · 5 years ago
Awesome video, I learned a lot :)
@xanton1895 · 5 years ago
LOL, didn't expect to see you here
@ArthursHD · 5 years ago
Nice video! Not a word on the ARM servers which could cut costs for website hosting, etc.
@AwesomeBlackDude · 5 years ago
Intel's business practices are kinda shady.
@blackknight50277621 · 5 years ago
Everybody was Compute fighting
@pwnmeisterage · 5 years ago
Their bits were fast as lightning
@modernborefare1684 · 5 years ago
CPU temps were a little bit frightening
@pwnmeisterage · 5 years ago
Run them hot with expert timing
@JM-bl3ih · 5 years ago
You all gave me nerd chills 😂😂
@grischu8277 · 5 years ago
We've seen AMD going all-in on GPU compute (they've at least tried). The Fiji cards were the first step, then their Vega GPUs. They are (or at least were, at the time of release) monsters for general computing, and not even Nvidia could counter this properly. Of course Nvidia has Tesla, but that's still basically their mainstream GPUs with a few optimisations and more RAM (Quadro is similar) (considerably simplified). Or I think now it's more like the mainstream GPUs are descended from the server and compute ones (Nvidia's Volta architecture launched way earlier than the RTX cards; yes, ray tracing cores weren't on the original Volta cards, but plenty of tensor cores were). Coming back to AMD: it's just that AMD knows that leaving gamers isn't a good idea, at least for now. We are still a big revenue source for them, especially in the CPU game, and they take every bit of money they can. Not negatively - they've come from very far away. They've developed this scalable CPU architecture that is (sort of) coming to us, but wasn't primarily developed for us. I think they've even mentioned putting more effort into compute, especially on the GPU side. That's how I conclude this. The next decade is going to be very interesting, overall. :P
@tackier52 · 5 years ago
Holy shit you can't spell lmao
@Skelath · 5 years ago
Are you talking about the R9 Fury and Fury X cards that got smashed by the GTX Titan X that was released earlier that year? One thing to note is that Intel has access to all of Nvidia's patents.
@spinkick9270 · 5 years ago
Awesome content. It's good for us original YouTube tech channels. Not just benchmarks. Thanks
@VirusXAX · 5 years ago
And I was thinking: Intel is done... I learned a lot now!
@deadmetalbr · 5 years ago
These three big consumer-facing companies (Intel, AMD, and Nvidia) ALL need a swift kick in the pants, as their offerings over the last decade have been iterative improvements at best. Nvidia's RTX stuff looks cool on paper, but unless they convince Sony, Microsoft, and maybe even Nintendo of RTX's value to gamers going forward, it's not going to catch on in the long term. I miss getting excited about new hardware; now everything comes down to a ~10% performance increase generation-over-generation without much else in terms of architectural difference.
@N00B283 · 5 years ago
Perhaps it's just the result of reaching the limits of manufacturing transistors, so the big improvements you saw from the 90s to the 2010s aren't happening anymore. Perhaps if parallel processing became a thing again you could begin to see big leaps again, but generally I think it comes down to the fact that we are reaching the limits here. As you see, the only way now is to add more cores instead of more transistors on a die.
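(The diminishing returns of "just add more cores" can be made concrete with Amdahl's law; a minimal sketch, where the 90% parallel fraction is an assumed example value.)

    # Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
    def speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    for cores in (2, 4, 8, 16, 64):
        print(f"{cores:>2} cores -> {speedup(0.9, cores):.2f}x")
    # 2 -> 1.82x, 4 -> 3.08x, 8 -> 4.71x, 16 -> 6.40x, 64 -> 8.77x:
    # with 10% serial work, even 64 cores never reach 10x.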
@haaxxx9 · 5 years ago
Yea, I looked at the "new" 20 series from Nvidia and was disappointed. The new RTX feature (currently) is reflections of the world on certain objects. Which is great and all, but it kills half of your frame rate. A huge no-no for me.
@kommander9638 · 5 years ago
RTX could end up like PhysX.
@tskraj3190 · 5 years ago
Nvidia's RTX is a great technology and is a large improvement over the previous GTX series. The RTX cards have real asynchronous compute hardware now, like AMD, and their tensor cores are optimized for faster rasterization. Microsoft's DX12 has DXR specifically for ray tracing, and AMD has Radeon Rays and ProRender ray tracing software. Microsoft and Sony use AMD SoC APUs because they are much more affordable and still perform well in the graphics segment. I promise you next year you will see big performance gains from all of the companies you listed above, but beyond that I can't tell you.
@hydrochloricacid2146 · 5 years ago
AMD and Nvidia have been FAR from complacent. If you look at the situation now vs 2010, it's pretty clear that wasn't the case. AMD went from losing money on a terrible architecture on a terrible process to having an architecture that is arguably better than Intel's in some ways. They have good and stable manufacturing capabilities through 3rd-party fabs; in 2010 AMD had just spun off their manufacturing division because of all the money they were losing. Early this decade, AMD was essentially forced to survive on console revenue, which is an incredibly low-margin segment. Now they're selling multi-thousand-dollar server chips that people actually want to buy. Same goes for Nvidia. Their foray into GPGPU and hardware-accelerated AI has enabled them to go from a semi-profitable graphics chip designer all the way to a legitimate threat to Intel's bottom line. They went from having large, hot and underwhelming chips that wouldn't yield (as in the Fermi series) to their current Pascal- and Turing-based parts, which, while expensive, are absolutely dominating the market. In 2010, Intel was definitely far ahead of anyone else. Today, they are the ones doing the catching up.
@faisalhaider007 · 5 years ago
AMD has left Intel and Nvidia confused and forced them to release polished but innovation-less, ultra-expensive products.
@wuhanlabtech3580 · 5 years ago
How has AMD done anything to Nvidia?? Lol seriously... As for Intel, they have their hands in so much I bet they make more just in military contracts than AMD will ever make in anything... And now AMD is totally reliant on TSMC, so the price advantage will slowly fade away
@Austin1990 · 5 years ago
I do not think Nvidia is shaking. LOL @ed k TSMC has little to nothing to do with the "price advantage".
@faisalhaider007 · 5 years ago
Bro, there was no point in launching RTX; they did it just to make sure to cement their name in history as the first ever to implement ray tracing and other AI-optimized computing. The 10-series could seriously have easily handled things until the start of Q3 next year. But they launched it. As for Intel, well, launching just after or just before AMD's lineup is becoming their motto. It is quite clear that one can now EASILY build a 1080p 144Hz rig with AMD, and that too without selling kidneys... :D
@royk7712 · 5 years ago
It's cheaper to buy a 1070 than a Vega 56, and it's a better card anyway. lul
@royk7712 · 5 years ago
@Austin P Of course TSMC has to do with the price advantage; that's why AMD is making so little profit until now. Their shareholders want to make some money from dividends. Intel, on the other hand, have their own fabs; because of that they have a huge profit, and they're happy about it
@karanvora2674 · 5 years ago
I love your videos; the points you make are not covered by most mainstream YouTube tech channels. Please don't change this genre of yours
@NickonWheelz · 5 years ago
Very informative! Stumbled upon your vids lately. Very impressed. Keep up the great work!
@Capnsensible80 · 5 years ago
There's a lot of misinformation in this video, actually.
@georgesteampowered3910 · 5 years ago
Great atmosphere, great voice and very informative video. Keep up the good work
@eaglefat9398 · 5 years ago
I hope this video blows up like the last one; great info and presentation. IMHO I think Intel might be over-investing and about to be caught off guard. It's not that they don't make some of the best products, because they do, but they are no longer alone in the market, and competitors are popping up every day offering near-equal performance for a much more competitive price - like AMD's EPYC, which might be slightly slower but costs $4,000 compared to Intel's Xeon that costs $10,000
@aformalevent · 5 years ago
Great video! Well researched and clearly very well thought out. It's great to see you releasing videos more regularly. Keep up the awesome work!
@yaro7319 · 5 years ago
Love how you do the videos - really nice picture, audio and info. Please keep it up
@ny0men · 5 years ago
Thanks for the "military nonsense" thing, u earned your like ;)
@lukeskywalker8753 · 5 years ago
soy boy
@techworld3043 · 5 years ago
Awesome video again
@akirakitano4150 · 5 years ago
Thanks again. You're bringing me up to speed to what is happening in the processors market in the most pleasant, entertaining and educating way. Keep up the good work!
@AfroJd · 5 years ago
Your videos are fascinating and really well made! Great work!
@chaoticneutral393 · 5 years ago
"Military nonsense"
@cellardoor9882 · 5 years ago
Well, isn't it? Or are you one of the people claiming that we need military shit in order to "protect ourselves"?
@FlorimondH · 5 years ago
Disliked the video for the stupidity of this very sentence.
@nonenothingnull · 5 years ago
Shouldn't be a problem, but it is.
@pwnmeisterage · 5 years ago
"Military nonsense" eventually trickles down to consumer appliances and gaming technologies, too.
@rock3tcatU233 · 4 years ago
THE PENTAGON WOULD LIKE TO KNOW YOUR LOCATION
@VoldoronGaming · 5 years ago
Since AMD, Nvidia and Intel are moving more towards compute, games will have to be redesigned with compute in mind. AI for enemies could get more sophisticated in games as the technology improves.
@eaglefat9398 · 5 years ago
Ray tracing and path tracing are very compute-heavy, and once GPUs catch up it will make games look much better. Traditional rasterization is nearing its limit because it uses trickery and hacks to make the graphics look realistic, which is why graphics improvements have slowed down in the last few years
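(To see why it's compute-heavy: even the toy case of one ray per pixel against a single sphere means solving a quadratic around two million times per 1080p frame. A minimal sketch - not how any production tracer works.)

    import math

    def ray_sphere_hit(origin, direction, center, radius):
        # Solve |o + t*d - c|^2 = r^2 for the nearest hit distance t.
        oc = [o - c for o, c in zip(origin, center)]
        a = sum(d * d for d in direction)
        b = 2.0 * sum(o * d for o, d in zip(oc, direction))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4.0 * a * c
        return None if disc < 0 else (-b - math.sqrt(disc)) / (2.0 * a)

    # One test: a ray along +z hits a unit sphere centered 5 units away at t = 4.
    print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))
    # 1920 * 1080 pixels = ~2.07 million of these per object, per bounce, per frame.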
@archfxt3619 · 5 years ago
Oh no, harder AI. As if following the damn train isn't hard enough, now they can dodge bullets too. DAMNIT SMOKE
@archfxt3619 · 5 years ago
Mariano true dat
@Austin1990 · 5 years ago
Mariano "I don't need them to be real"... after you talk about wanting cinema-quality graphics. What you really mean is that graphics are more important to you than game mechanics. Opinions on this vary. And AI requirements would be far smaller than ray tracing requirements, so we are not looking at an either/or issue.
@petertremblay3725 · 5 years ago
Developer here. Regarding enemies: we can make them so lethal that not a single gamer would be able to beat them, but in reality we are forced to dumb them down so gamers are not too frustrated from losing all the time. A good enemy AI in the industry is one that looks intelligent but lets the player outperform it without the player noticing.
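(A toy sketch of that kind of handicap - the bot computes a perfect shot, then deliberate error is injected so it merely looks smart. The function name and all numbers are made up for illustration.)

    import random

    def bot_aim_deg(perfect_angle_deg: float, skill: float = 0.4) -> float:
        # skill = 1.0 never misses; lower skill widens the injected error cone.
        max_error_deg = (1.0 - skill) * 15.0  # up to 15 degrees of miss (assumed)
        return perfect_angle_deg + random.uniform(-max_error_deg, max_error_deg)

    random.seed(1)
    print([round(bot_aim_deg(90.0), 1) for _ in range(4)])  # near-misses around 90 degrees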
@itumelengseeletsa6910 · 2 years ago
YouTube just recommended this to me, but can't notify me when there is a new upload.
@user-si6xh1yh9y · 5 years ago
I left a sub and a like. I loved every single bit of it - calmly narrated, detailed, rich. I hope to see more content like this from you in the future.
@Etheoma · 5 years ago
Nope, PC still beats out data centre for Intel's revenue by a mile; even their operating income in PC is still ahead of the data centre. So it's wrong on both counts, BUT!!! The real answer is that AMD, if they can beat Intel, will lower profit margins and decrease sales in the data centre for Intel, which is a double whammy. Furthermore, Intel has more to lose in the data centre market, as last year they were over 99% market share - effectively 100%. This is why Intel is losing their minds: data centres do make up ~45% of their operating income. Take away 30% of those sales, decrease the margins by say 40%, and you are talking about Intel losing over 25% of the company's business. That is not just ground-shaking; that is removing the ground from under their feet. Meanwhile consumers are dumb and it will take AMD 2-3 years to get to 50% total market share, and AMD won't be decreasing Intel's margins much, so while it's a bigger part of their company it is also more secure.
And when I say 50% PC market share I mean laptops, desktops from OEMs, direct CPU sales, etc. While AMD has made massive inroads in the DIY space, they are still lagging behind where it actually matters, and that's laptops and desktops from OEMs, which still make up the large majority of CPUs sold. The DIY guys are for the large part not who I would consider dumb consumers; I am talking about the guy who gets his computer from Dell, who has probably never even heard of AMD apart from maybe hearing they were shit-tier. Those are mainly the people AMD needs to convert, and that just takes time; there is no way around it. And no, I don't actually think they are dumb; they just don't pay attention, because it's something that doesn't really matter to them. They have always bought Intel and Nvidia, and it will be really, really hard to catch their attention long enough for them to realise there is another option and that it is equal to if not better than Intel.
Also, focusing 14nm on the data centre makes a ton of sense, because they can sell a given area of silicon for a much higher price in the data centre than they could for consumers. And while they may lose some desktop sales, it would be much worse to lose a contract with a company, because a company won't come back until AMD makes a major fuck-up, while that consumer will more likely than not come back to Intel if they even make a purchase in the next 2-3 years, by which time Intel expects to be back ahead.
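(The "over 25%" figure checks out under the comment's own assumptions:)

    # Commenter's assumptions: data centre ~45% of Intel's operating income;
    # AMD takes 30% of those sales and forces margins down ~40% on the rest.
    dc_share = 0.45
    remaining_dc_income = 0.70 * 0.60       # 70% of sales kept, at 60% of old margin
    loss = dc_share * (1.0 - remaining_dc_income)
    print(f"{loss:.1%} of total operating income lost")  # 26.1% -> "over 25%"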
@Coreteks · 5 years ago
Thanks for the correction; my point was that Data is growing tremendously and Client is declining. I should have made it clearer that I meant the PC enthusiast market is a tiny portion of their overall revenue compared to Data. For 2018 they will probably announce close to a 50/50 split between Data/Client, but you are correct that Data was lower than Client for 2017. As for most buyers preferring Intel, it makes sense: Intel has a really powerful and recognizable brand, and will continue to do so. I have a video on this topic in the works. Thanks for the comment :)
@Etheoma · 5 years ago
But my point was that you shouldn't be buying a brand name, you should be buying a CPU XD. But I know what you mean; as I said, the average consumer doesn't have the interest to find out which is actually better, so staying with the brand you know makes sense to those people. I did say something to that effect in the third paragraph.
@Coreteks · 5 years ago
Yeah I got what you said, and I'm agreeing :)
@zamundaaa776 · 5 years ago
The 'dumb consumers' that mainly buy pre-built PCs and laptops are mostly just looking at frequency and number of cores, if at all, and at the price. That's where AMD is much better than Intel, so if they keep that up they can gain some market share there. And they have to, as you said. But Intel only focuses on 14nm because their production is very much stuck on it. Silicon is pretty cheap and Intel doesn't really have to sell more of it: if you're halving your transistor size you can either make the processors more powerful (they've been very reluctant with that), sell them cheaper as you can make a little less than 4x the chips from a single wafer of silicon, or of course sell a little less than 4x the number of chips. The 4x estimate of course isn't accurate at all, because 14 or 7nm doesn't really describe the actual whole size of a transistor and is instead kind of a marketing term, but let's just roll with it for simplicity
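(The "a little less than 4x" arithmetic, sketched with an assumed die size. Idealized: it ignores yield, scribe lines, and the fact that node names stopped tracking feature size.)

    import math

    die_area = 100.0                          # mm^2, assumed example die on the old node
    shrunk_area = die_area / 4.0              # an ideal 2x linear shrink quarters the area
    wafer_area = math.pi * (300.0 / 2) ** 2   # standard 300 mm wafer
    print(int(wafer_area // die_area), int(wafer_area // shrunk_area))  # ~706 vs ~2827 dies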
@chadmulligan5629 · 5 years ago
AMD has the console market locked down though. AMD WILL be on top one day, when people realize the true value of AMD components. Intel putting out overpriced crap because "they can" is fixing to burn them.
@pigboybig · 5 years ago
Very interesting. You are like AdoredTV.
@iszotic · 5 years ago
Competition is good for the market :vvvv
@johnbeer4963 · 5 years ago
wow, burn...
@gabrielwhite3890 · 5 years ago
@Mariano One is AdoredTV. But I prefer Coreteks' way of presenting information; there's a much sleeker look to his videos.
@baddog9188 · 5 years ago
That's an insult, not a compliment...
@eliadbu · 5 years ago
He is not; they have totally different perspectives on technology and companies, and on how they analyze the subject.
@jowarnis · 5 years ago
Amazing video, I really enjoyed it, liked and subscribed!
@vassio2249 · 5 years ago
Dude, this video is gold... I am glad that I found it so early... Keep up the good work, mate!
@damientech88 · 5 years ago
Intel and Nvidia will be pushing each other to see who can create Skynet and destroy humanity first.
@tskraj3190 · 5 years ago
Great Video! IBM forgot their roots years ago but I don't think Intel will do the same. I believe Intel is focused on A.I. and supercomputing but I also believe that will trickle down to the consumer segment and provide a more competitive stance in the discrete GPU segment. I do have high hopes for AMD, and I believe the next few months they are going to really shine and provide competitive hardware in both the CPU and GPU segment. As for Nvidia it is a company I lost all respect for due to a few of their anticompetitive practices like APP (Approved Partner Program) and GPP (Geforce Partner Program) and the lawsuits they filed that killed every GPU manufacturer in the market except ATI. And I don't want to get started on the allegedly stolen documents from AMD.
@abram730 · 5 years ago
Quite the opposite for me. GPP was just branding nonsense; that doesn't matter to me. APP was simply a sales agreement, like the Amazon affiliate program. AMD does the same things and you apply a double standard. AMD was always the shady company, acting in criminal ways. They would run libelous PR disinfo campaigns to destroy devs that partnered with Nvidia, to force them into their own partner program. They would release fraudulent slides on their blog and in press releases and distribute those to media from their blogs, or offer instruction on how media could produce their own. There was a disclaimer that AMD didn't stand behind the claims of their employees, as legal cover. For example, the Crysis 2 DX11 patch was the first I noticed of it: fake and deceptive wireframes, lies about the water being rendered under the ground, and lies that tessellation was the cause of AMD's poor performance. Failure to optimize for GPU writebacks was the issue. The water was tiled: a small block was simulated with DirectCompute and then tiled out. The wireframes were fraudulently presented due to this, as the drawn wireframe was also tiled out. Water not visible was culled, as CryEngine does use occlusion culling; however, because the water tile was computed on the GPU, it needed to be written back, and in DX11 the driver handles I/O, so the driver needed to optimize for that. Crysis 2 removed geometric pop-in using virtual dicing, yet AMD's comments had no connection to how tessellation was being used. Tessellation was used to walk geometric detail up from 2 triangles by subdividing them as you moved closer. When you hit 64x tessellation, that was pushed into an inflated buffer and tessellation reset to 1x and began walking up again. AMD in essence was demanding that the game count to 100 without using numbers greater than 2. They put blatant lies to their customers as a demagogue, to punish developers that chose their competitor. That is mafia-level criminal activity. Join AMD's partner program or else they will destroy you with lies. That is a protection racket.
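(A rough sketch of the "walking up" scheme described above - purely illustrative, not Crysis 2's actual code; the quadratic triangle growth is an assumed simplification.)

    def patch_triangles(base_tris: int, factor: int) -> int:
        # Assumed model: triangle count grows with the square of the tessellation factor.
        return base_tris * factor * factor

    factor = 1
    while factor <= 64:                  # 64x is the DX11 hardware tessellation cap
        print(f"{factor:>2}x -> {patch_triangles(2, factor)} triangles")
        factor *= 2
    # Per the comment: at 64x the result is pushed to an inflated buffer and the
    # factor resets to 1x, so detail keeps climbing as the camera approaches.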
@tskraj3190 · 5 years ago
@@abram730 I don't know too much about tessellation, but I do know the HD 6900 series and up do an amazing job at it. Nvidia's APP ("Approved Partnership Program") meant that vendors who wanted to continue making GeForce cards after the GTX 400 series had to solely and only manufacture Nvidia cards. Nvidia gave XFX the boot because XFX started manufacturing AMD cards. When Nvidia threatened other vendors with what they did to XFX, Asus, MSI, and Gigabyte intentionally rebelled and the Approved Partnership Program was disbanded. And as for GPP, the GeForce Partnership Program is not really dead: vendors who changed their naming scheme for Radeon cards are still going to use those new names on Radeon cards. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-H0L3OTZ13Os.html
@snetmotnosrorb3946 · 5 years ago
Abram Would you mind backing up your bold statements? I've been into computers since Windows 3.11 and closely monitored the development and know most about all shenanigans going on, but those allegations are completely new to me.
@tskraj3190 · 5 years ago
@@snetmotnosrorb3946 Which ones? All of the web pages about Nvidia giving XFX the boot and about Asus, MSI and Gigabyte rebelling against the APP have been removed, but I did find one that doesn't go into as much detail but gives some backing. www.fudzilla.com/news/graphics/20453-xfx-officially-stops-doing-nvidia When it happened I ended up switching to Radeon in 2010. I only bought XFX or EVGA because I got burned buying BFG.
@snetmotnosrorb3946 · 5 years ago
@Timothy Isenhart I know about all that stuff. I was targeting the other guy above you, Abram Carroll.
@Fridgemusa · 5 years ago
Great video mate, very interesting times ahead :)
@dubtfoul9514 · 5 years ago
Great video! Nice work and analysis, good job man
@andrecostin1288 · 5 years ago
Great video
@mr.brokendreams9721 · 5 years ago
AMD has carried the computer market on their own for some time and kept Intel and Nvidia on their toes and fingers
@wuhanlabtech3580 · 5 years ago
They have done nothing but give up being a manufacturer and add more cores... With TSMC holding AMD's future in their hands now, AMD will slowly lose price control also
@user-wx1zo9ef3f · 5 years ago
@@wuhanlabtech3580 What do you mean? AMD's CPU prices are gonna drop even lower because of the competition. Extra demand for Intel: factories can't keep up. Extra demand for AMD and Nvidia: companies will fight to make contracts for extra production (lower production price)
@abram730 · 5 years ago
Discovery in a lawsuit showed that Nvidia was trying to get ATI on board with a computing push against Intel back before AMD bought them. They lost the lawsuit because Nvidia gave ATI a heads-up about the 8800 Ultra, so they could have a competing product. Back then it was Nvidia + AMD vs. ATI + Intel. AMD's GPU business harms their CPU business because they still make GPUs that only perform best when using an Intel CPU. This is because they only accept a single stream of instructions and only have single-threaded drivers; Intel's higher per-thread IPC is required.
@Austin1990 · 5 years ago
Mr.brokendreams AMD has not carried anything, ever. They have forced price drops and core-count increases from Intel and Nvidia, but that is not the same as carrying the computer market.
@TheDog7118 · 5 years ago
Just remember how much AMD sucked back in 2016, lol. It will happen again.
@Painted_J · 5 years ago
You explain complicated things so well that even I can understand; you earned a sub
@RichardOpokuEngineer · 5 years ago
Anyone notice at 2:19 that the graph is just one plot inverted against the other? A mirror of itself?
@Jaba-hs6vd · 5 years ago
I think that none of the three companies cares much for the gaming market; all of them are investing heavily in the server market - it's where the huge money is. But yes, today's GPUs are enormously flexible, fast, and powerful. They can do many things, they are good at many things; even in CPU workloads GPUs are doing better with brute force
@chadmulligan5629 · 5 years ago
Every current console runs AMD APUs. AMD loves us gamers.
@chadmulligan5629 · 5 years ago
Smaller market, plus they sell for less. I mean, come on, AMD clearly appreciates the gaming market. Won't be long before AMD is synonymous with gaming.
@Austin1990 · 5 years ago
Chad Mulligan Meh, Vega was a clear departure. Perhaps Navi will change that. But competing against the RTX cards, where Nvidia is pushing revolutionary gaming-focused tech, will be difficult from a mind-share perspective.
@zarkaztick8973 · 5 years ago
*This video of Intel's GPU is not what you think* - that would be a better title. This, with the GPU thumbnail, was kinda clickbait. And like a fish, I took the bait and ended up watching the whole vid.
@jonathanoxlade4252 · 5 years ago
Actually it's not; it was a dedicated CPU built onto a PCIe x16 card. But Intel are working on a proper GPU, though
@Nintega2K · 5 years ago
Do you plan on making a video in the future on what Intel, AMD, & Nvidia's new focus on compute means for gaming in the future, both PC & console?
@Idontgivechainsaw · 5 years ago
Hmm. Just found this channel. It's great. Subbed. Keep it up :)
@dissquished3415 · 5 years ago
Deep Learning.. Hmm? Create Astroboy
@ethanramage9624 · 5 years ago
I bought the R3 1200 to afford a GTX 1060 3GB... If I'd gone with Intel's i3-8100 I would only have been able to afford a GTX 1050...
@Coreteks · 5 years ago
A lot of people in that boat
@madweather1865 · 5 years ago
Nice, game on bro
@thefirestate · 5 years ago
Pairing the R3 with a 1060 is just an incredibly better decision compared to the i3 with a 1050. Good decisions, bro. I recently went with the same CPU but paired it with a 1070, and it's rocking at my monitor's 1440p!
@combatantezoteric2965 · 5 years ago
Can it do 60 fps in most games, including AC Origins and other CPU-demanding games? My overclocked i5 3550 struggles in most games now...
@cclacss · 5 years ago
I bought a used i7 4790K instead and went with the 1070; I got the whole system for 600 dollars
@rajtheneo · 5 years ago
Very nice and informative video. The narration is top notch along with the voice.
@TheGodlessGuitarist · 5 years ago
Hi Coreteks, really interesting video. It would be great if you could include your sources in the description. This would make your content more valuable to research depts in many companies. I would have included your video as a reference in an internal project report on AI technology.
@subscribersnovideoschall-ln5kv
Intel's GPU in a nutshell: Intel HD in a CARD. Love your vids as always
@SireDragonChester · 5 years ago
Don't forget that Spectre and Meltdown probably hurt Intel a lot too. And people are just tired of Intel overall. AMD answered that call, so people jumped ship for better price and performance. Intel has been dominating for the last 10+ years; it was only a matter of time before they start to lose that lead. I'm going to be switching to AMD Ryzen sometime soon - time to upgrade my old i7 3770K. And my first PC was a Tandy 1000 SL2, lol. FYI, RTX cards are just a refreshed version of the Titan V with a few minor changes. Not worth the price atm; here in Canada the RTX 2080 Ti is going for $1699.99 Canadian. Lol, pass. I'll stick with my GTX 1080 Ti for now and wait to see what AMD brings out next, and maybe Intel.
@user-zu1ix3yq2w · 5 years ago
Sire DragonChester it was Ryzen.
@snetmotnosrorb3946 · 5 years ago
I believe the 7nm products coming next year will be very... delightful.
@SireDragonChester · 5 years ago
Snetmot Nosrorb Agreed, can't wait to see what AMD brings out. I'm on the fence whether to get the Ryzen 2700 or wait. My old 3770K at 3.5 GHz is still gaming fine with a GTX 1080 Ti, but it's getting to be 5+ years old, so I've been tempted to upgrade. I also like the direction AMD has been going. Tired of Nvidia's anti-consumer ways, and it'll be interesting to see if Intel does bring out a decent gaming GPU at GTX 1080 level or better.
@dra6o0n · 5 years ago
The steel and aluminum tariffs affect Canada. On Oct 1st a 10% tariff on GPUs imported to the USA gets added on; Jan 1st will add another 15%. You will see the price of computer parts from the USA soar through the roof. And the USMCA deal will disallow Canada from trading with other nations on some things.
@Timeriz · 5 years ago
Wait... really, can you get an RTX 2080 Ti for only $1700 Canadian? That's insane; here the cheapest one I could find is $1966 Canadian
@savavesavagelif5846 · 5 years ago
I really enjoyed your content; this was spectacularly put together. Honestly, the computing age is baffling right now. Have you seen the UDOO Bolt? Omg, I cannot wait... I've been learning to code and such... because of things like the Snapdragon cx and the Ryzen 2200G and Ryzen 2400G APUs and their counterparts. A great video would be the rise of the single-board computer... Raspberry Pi, Tinker Board, Arduino, UDOO
@kimh9337 · 5 years ago
I like your videos: informative, with speculations/predictions and proper research backing them up. Keep to this line of content, as YouTube is already swimming with 'common review' channels. Not that I mind the review channels at all, but going that route would make you "another one of those", whereas technical news with accurate predictions (hopefully) and technical insights are still rare gems to stumble upon.
@Capnsensible80 · 5 years ago
One thing AMD has always excelled at is value. They don't compete at the high-end GPU game, but they have great performance per dollar in the low/mid range CPU/GPU department.
@vgchat · 5 years ago
Intel's GPU is not what you think... Oh yes it is, I've already heard about it before.
@u1richh · 5 years ago
It's Larrabee, isn't it?
@gb34a · 5 years ago
The gaming market is one of the fastest-growing markets nowadays. It's twice as big as the self-driving car market. Stupid video.
@piyushvaidya5086 · 4 years ago
Autistic talk. Do not look at the gaming market as a whole; the gaming market involves both game-creator companies and their customers. Talk about the personal computing industry vs the B2B computing industry. It is quite obvious who pays more, and who is willing to pay more. There is no use in having a customer base of millions if each contributes only $250 on average. The server-side computing industry is going to boom like the internet did - and that's a very generous prediction. Do not be salty about that. Willingly or not, you will be paying for cloud gaming services within a decade or two.
@gb34a · 4 years ago
@@piyushvaidya5086 Gaming market means the customers who buy games; just look up some statistical methodology in market research about the topic. When I said twice as big I didn't mean twice the number of customers (obviously a B2B buyer pays more as an individual), I meant twice the revenue. "The server-side computing industry is going to boom like the internet did" - do you have some research data or a forecast on this, or are you just pulling it out of your ass? Do you know that the gaming industry is also booming? It's the biggest entertainment market nowadays. Did the research you based your assumptions on mention that the growth rate of the server-side computing industry is greater than the growth rate of the gaming market as well?
@gb34a · 4 years ago
@@piyushvaidya5086 Btw, my original comment compared the gaming market to the self-driving car market. Even if the server-side computing industry will be bigger in the future, do you really think the gaming market is not a priority for Intel in the present, when the R&D cost of a GPU is huge? Do you really think they would just avoid today's biggest market when companies want the most out of their investments? Did you know that about 50% of Nvidia's total revenue comes from selling gaming products? This is why I still think this video is stupid as hell.
@rcarkk · 5 years ago
You make very good videos. Keep up the good work.
@oliverjunge8671 · 5 years ago
I think you are right about those Intel GPUs. I wonder if they will consist of x86 style cores, like they planned with their first venture into GPUs that never came to fruition as consumer graphics cards. I could think of some interesting usage scenarios for cards like that, but I guess they will stick this time to simpler shader units.
@Republic3D · 5 years ago
Why are military applications "nonsense"? If there's one thing that drives technology forward, it's military applications. The strides made in technological development during first WWI, then WWII, then the Cold War are simply immense. Even the internet started with a military purpose. The 20th century saw the greatest leap in technology in history, and much of the reason was competition between superpowers - not only purely military, but also the space race. Even today this is true in the fields of robotics, aviation and quantum computing. Other than a few universities, quantum computers are found at corporations like Lockheed Martin Skunk Works. So why are military applications nonsense? Other than that, great video.
@MuffinTastic · 5 years ago
He's simply saying the applications themselves are nonsense; he's not dismissing the demand for innovation the nonsense creates.
@Republic3D · 5 years ago
@@MuffinTastic They're not nonsense though. They make a lot of sense.
@Republic3D · 5 years ago
@7deathdragon Even though it's not directly shared with the "common folk" right now, it eventually finds its way there. And you're right that it's not as much as it used to, but it's still happening. These days it goes both ways as well. The military sometimes adapts commercial products because it's so much cheaper. I was mostly ranting about him calling it "nonsense", which I strongly disagree with.
@cube2fox · 5 years ago
The only reason some non-military applications came from military applications is that they were a byproduct and huge amounts of money went into military funding. This is an extremely inefficient way to invest in non-military technology. It's like making food by giving crops to farm animals to produce milk or meat: it's much, much more efficient to make bread directly from the crops. Then you get 10-20 times as much for the same resources. Saying "investing in the military for civilian purposes is useful" is like saying you should fight world hunger with meat and cheese because many people actually eat them. You can get useful technology out of incredibly inefficient processes when you invest enough money in those processes.
@Republic3D · 5 years ago
Trurl, you're partly wrong. Civilian applications from military research are not just a byproduct. Often the very same product finds its way to the civilian side. One example is ABS, anti-lock brakes. Another is GPS. A third is rocketry and space flight. Regarding the funding, I think we need to distinguish times of conflict from peaceful times. During WW1, WW2, the Korean War and the Vietnam War, research made incredible strides forward. The benefits from the relatively small amount of the war budget that was allocated to R&D were amazing. If you look at the Cold War overall, I think you have a better point. But all research needs an incentive. The last 15 years or so, we've made progress - but only in limited areas. The only incentives we have now are that people want a cooler gadget or a faster computer.
@mightythor136 · 5 years ago
📂 NVIDIA
 └📁 RTX 2019
   └📁 any good cheap thing
     └⚠️ This folder is empty
@peterbezak5204 · 5 years ago
Wow, you sound really deadpan! Great video btw!
@Jajalaatmaar · 5 years ago
I really like the trippy music you put under it. Gives kind of an Eve vibe.
@bitcoin-tradinglive6446 · 5 years ago
Except the prices of Intel CPUs have not dramatically changed: not even 24 hours after this video was posted, you can still buy their i7-8700K for simply $30 more than their i7-7700K. You simply went to page two of Amazon's sellers and scrolled through how much individuals were trying to resell it for (e.g. 3:29). The current prices of the new generation chips can be found here: goo.gl/Un2hxu . We don't understand what you mean by their supply being low when not much has changed in price, meaning that supply and demand have only slightly adjusted due to the shortage, which wouldn't dramatically move it away from the market's equilibrium price. The chart you referenced to show how AMD is beating Intel in sales does not reveal the X and Y of the graph. When you twist your facts to the point where they're so convoluted that we don't know whether what you're saying is true or false, that discredits your argument. I can't necessarily say that what you're saying is true; it's more that you are creating problems that don't exist.
@vh9network · 5 years ago
Why doesn't Intel just acquire Nvidia already? Going it alone just seems silly to me.
@FlergerBergitydersh · 5 years ago
Because Nvidia doesn't want to be bought and is happy as a separate company?
@WinterCharmVT · 5 years ago
Because Nvidia is already ahead and has plenty of money. Why would they sell? An acquisition requires both parties to agree
@vh9network · 5 years ago
Because money talks, folks. You both seem to think of Nvidia as a person and not a company. All it takes is for the people at the top of Nvidia and Intel to see eye to eye on profits for both and agree to a merger and boom, you have a merger.
@Capnsensible80 · 5 years ago
Except that's not gonna happen, because it wouldn't be financially beneficial to nVidia at all, so there are no profits for them to "see eye to eye" on.
@johnmarks227 · 5 years ago
That it wouldn't be approved in the U.S. might be a factor also.
@Ms3DiT · 5 years ago
The mathematics driving your graphs is blowing my mind. Was that a mirror image?
@minthos4045 · 5 years ago
Well put together! Thanks, mate!
@gertjanvandermeij4265 · 5 years ago
Intel is DONE! The giant has fallen! Greed and god syndrome killed them. Xeon ruled...... so did the dinosaurs! ;-) 2019 = all AMD! R.I.P. Intel!
@gabrielwhite3890 · 5 years ago
Nvidia is still draining pockets. But it is hard to think... My children will never have known what Intel was?
@abram730 · 5 years ago
Nvidia always wanted to charge Intel money, going back to ATI. Nvidia is still innovating; Intel just keeps stamping out the same thing with ~3% improvements. Nvidia is still hitting double digits.
@gabrielwhite3890 · 5 years ago
@@abram730 Have you heard of the GeForce4 NV17?
@abram730 · 5 years ago
I had a GeForce4 Ti 4600 (NV25) back in the day, why? You talking about Nvidia releasing the GeForce4 MX, which was basically a GeForce2 MX with better memory? Nvidia was shady back then and did lots of rebranding. Nvidia was rebranding NV80 for the longest time. They started turning around with Fermi and put the pedal to the metal. Now AMD is doing the rebranding and a lot of shady things that don't get reported. I also used AMD CPUs exclusively in the hundreds of PCs I built back then. It was usually AMD + Nvidia and Intel + ATI back then. Things would be different now if AMD's CEO had been willing to step aside and merge with Nvidia back then. AMD and ATI didn't fit. It was internal war, as a lot of their very talented people were very pro-Nvidia. War as in calling for mass firings at ATI in company-wide emails, and leaking ATI's roadmap to Nvidia. It still isn't fixed, as the graphics division still cuts their throat on the CPU side.
@gabrielwhite3890 · 5 years ago
@@abram730 Abram Carroll They have not gotten better; they simply have eyes watching them. I would not pay 90% more for 45% more performance between generations. They cannot kill AMD at the moment because they need a meaningful patent to infringe, and AMD has enough money to survive the strike-back that Nvidia wants to pull off. 3dfx and S3 are simply remains picked up by Nvidia, and they are absolutely shameless in throwing companies under the bus. SGS-Thomson was the original manufacturer of Nvidia's graphics cards, but as soon as they jumped to TSMC they called them undesirable when the Kyro 2 was announced - when they had been using them since the Riva! Not only that, but they advertise tessellation, which cripples Radeon cards but only pulls Nvidia graphics cards back a little. They put out arbitrary numbers like gigarays, but when you actually look at the numbers presented you realise that the PowerVR 6XT GR6500 has similar performance, which does not explain the inflated prices. Moving to Pascal: the 10-series Founders Edition had a higher MSRP, which had an effect on AIB cards - it made their prices even higher - and I thought it was just a coincidence, but Nvidia was just testing the waters. When the 1080 Ti was launched, the Founders Edition was priced quite reasonably, and AIB card prices did not jump. So when Nvidia priced their Founders Edition $100 higher, I knew what they were trying to do with this pricing, and lo and behold, the 2080 Ti prices for AIB cards are astronomical. Nvidia does not do much with their drivers, but many AMD fanboys think otherwise, so I don't have much to say on that front. But they know very well that they do not have much leverage with a competitor around, so they have to quickly make a profit; even a bit (in terms of a massive company) will do. AMD has nothing to compete with: their cards are better at rendering graphics, but they do not have the upper hand in things such as N-body calculations and other physics. AMD this quarter sold twice as many CPUs as Intel and did slightly better in revenue, which establishes that they now have a commanding lead as Intel shortages raise prices. Yet they seem to be silent - but when they are silent they emerge; Vega was a flop because AMD could not keep their mouths shut. Nvidia knows very well how to respond to anything AMD has to offer: more CUDA cores. Stream processors just simply cannot cut it. AMD might be toast in the graphics division, and Nvidia will make incremental improvements slowly. The difference between Intel in 2011 and Nvidia today is that in 2011 AMD had nothing, but today Nvidia can still be hit with a new architecture. Vega was just not good enough - not terrible; it needs improvement, not an overhaul. So for now, Nvidia is keeping all eyes open.
@calical26 · 5 years ago
Soon GPUs will replace CPUs
@deadhell304 · 5 years ago
Maybe, but not likely inside consumer computers. They excel at different tasks that the other is unable to efficiently do.
@tskraj3190 · 5 years ago
I predict that motherboards in the future will be one solid SoC that handles both processing and graphics workloads but will still have DRAM slots, expansion slots and SATA ports.
@user-wx1zo9ef3f · 5 years ago
CPUs use general-purpose cores. GPU cores are massively parallel, but they can only compute geometry. GPUs can't do anything other than rendering. The GPU is controlled by and directly connected to the CPU.....
@sirgalahad4861 · 5 years ago
Wow better content than bigger channels and you only have 5k subs! I subbed.
@JohnVance · 5 years ago
I like your historical computer renders, very cool
@Brisou394 · 5 years ago
What do you think about the new gaming Intel processor coming shortly?
@alberthakvoort8473 · 5 years ago
This gave me a whole new perspective from which to look at all the stuff going on with the 9th gen and those Xeon refresh CPUs. Thx bro
@user-sj3fp2xq2m · 5 years ago
I very much liked the overall tone of the video. Good job.
@MissMan666 · 5 years ago
I liked this, will be coming back. How about making something on Zen 2?
@Xalantor · 5 years ago
Interesting video and the calm voice is a nice bonus.
@ACSBranding · 5 years ago
Thoroughly enjoyed this one. Great job.
@FyyMyy · 5 years ago
Awesome Video! Keep up the good work. 💯
@anotherplum · 5 years ago
You are a born superstar at documentaries. I love this channel!
@vladmods · 5 years ago
Excellent video, man. Keep up the good work!
@finbarmanley2208 · 5 years ago
The cradle of the nest is the first motion of the universe, so when you draw the shape of five layers to roll around 3°4°
@aashaytambi3268 · 5 years ago
Damn this is good! You should keep it up
@JamesMichaelDoyle · 5 years ago
Smooth voice you got there, bud, and an accent that is equally pleasing too.
@crazyoldhippieguy · 5 years ago
26-02-2019. Thank you, right again.
@macmac436 · 5 years ago
I really love this format, please make more!
@ahem1190 · 5 years ago
Ty, I had no idea Intel was heading into the future with new GPU cards; should be fun to see gaming rigs change in the next few years
@e1337prodigy · 5 years ago
I really like your videos. Keep it up.
@GattMiffin · 5 years ago
Does anybody know what device the person at the computer at 0:08 is using?
@__dudewitagun__4607 · 5 years ago
Interesting! We will see what the future brings....but your words make a lot of sense
@zamundaaa776 · 5 years ago
For AI and autonomous cars, I see something like the new neural chips IBM is creating being a *big* part of the future. AFAIK the current prototypes run on less than half a watt and have the compute power to do image recognition on 100 cameras at 60 fps. Doing such things on GPUs is just too slow and mainly inefficient in the long run, although they'll of course still be pretty important for parallel computing
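(Taking the comment's figures at face value, the implied energy budget per recognized frame is tiny:)

    # Claimed figures from the comment: under 0.5 W while recognizing 100 feeds at 60 fps.
    watts = 0.5
    frames_per_second = 100 * 60
    microjoules_per_frame = watts / frames_per_second * 1e6
    print(f"~{microjoules_per_frame:.0f} uJ per frame")  # ~83 uJ, far below GPU budgets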
@alexamorphical3947 · 5 years ago
Very informative video! cheers.
@1982masood · 5 years ago
Hey guys, just asking: is it not possible to have a PCI card which can hold CPUs, so the more cards we add, the more cores and GHz of speed we get?
@markd8799 · 5 years ago
So you took a line graph and flipped it to show the mirror of AMD versus Intel?
@EricFortin · 5 years ago
Thank you! I didn't know Intel was so big in the autonomous car market.
@jacobwilkinson7970 · 5 years ago
I saw a video by Linus Tech Tips talking about a failed Intel graphics card based on software and not hardware. I wonder what direction they are heading now
@feafajic6392 · 5 years ago
Great work and very educational video.
@emirmasinovic · 3 years ago
Informative. We have to remember what the companies plan to do, not what they say.
@AlejandroRodolfoMendez · 5 years ago
Nice video. I am subscribed now. Keep up the good work.
@deansmith4752 · 5 years ago
The creation of high-density fabs is the road upon which these advances will be made; sub-5nm would be the shift which will create the highway they need
@tejashnayak6090 · 5 years ago
The background music is like some sci-fi movie
@RandornCanis · 5 years ago
To further that point, and announced earlier this month: AMD CEO Lisa Su will present the 2019 CES opening keynote - a role traditionally held by Intel. This video closes with Brian Krzanich presenting the 2018 opening keynote in what would be his last; Brian resigned as Intel CEO in June amidst personal legal troubles.
@ratkentheinfinity9841 · 5 years ago
This video is very well made. The narrator is very talented throughout the whole episode, and I felt like I was watching one of the BBC's documentaries. Keep this great work going.
@heart4011 · 5 years ago
A few months ago they made a Reddit post asking gamers what they could add to their GPUs. A lot of people stated that integer scaling is the thing they needed, and a few weeks later they came out saying that integer scaling would be a hardware implementation in their GPUs. I think they still have gaming in mind, but they might be focusing on compute as well. I'm getting hyped for the $699.99 beast
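(Integer scaling itself is a simple operation - each source pixel becomes an n x n block with no filtering, which is why pixel art stays sharp. A minimal sketch:)

    def integer_scale(image, n):
        # Nearest-neighbour upscale by an integer factor: repeat each pixel n times
        # horizontally, then repeat each row n times vertically.
        out = []
        for row in image:
            wide_row = [px for px in row for _ in range(n)]
            out.extend(wide_row[:] for _ in range(n))
        return out

    art = [[1, 0],
           [0, 1]]
    for row in integer_scale(art, 2):
        print(row)  # crisp 2x: [1, 1, 0, 0] twice, then [0, 0, 1, 1] twice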