@@gruuli no problem mate. If you're ever missing something, make it. It might be useful to someone else. I personally didn't expect more than 10 likes so this made my day! Such a small thing touched many.
Came for the Ryzen and Radeon desktop and laptop announcements, stayed for their advancements in healthcare and aerospace. Staggered by how much more than just a gaming company AMD really is.
They are starting to become dominant in HPC and cloud. They will become even stronger when they release their cloud-optimized Epyc processors later this year.
Honestly, I think in a few years AMD will be renowned as an enterprise company rather than a consumer company. Lots of emphasis on embedded and data center. Keep up the good work AMD.
that's probably because that's where the money is. consumer hardware is a minority of their sales, despite the common belief that PC gaming is huge
@@seibertron500 Yup. Higher margins. And I honestly don't blame them. They are great at consumer CPUs and mediocre at consumer GPUs, but they are extraordinary with enterprise CPUs, accelerators, and adaptive SoCs. Gamers want AMD to compete with Nvidia just so they can buy RTX cards cheaper. Such disrespect to AMD, imo, and it has resulted in mediocre RX 7000 series GPUs.
The higher margins in enterprise can help fund the R&D for consumer products. Without AMD's enterprise success, the consumer market leader would always have unchallengeable economies of scale and vertical integration. They've been releasing mobile chips every year, able to put both CPU/GPU and now the Xilinx AI tech into one package. These chips have generally been the best performing in the 15W class, but without a disruptive advantage, OEMs tend to stick with the market leader. It's hard to break through in market share in the mass consumer markets. But leveraging tech across many areas is necessary; niches in computing always become evolutionary dead ends.
@@andrewszombie They probably will continue consumer products, but they shouldn't put the majority of their effort into it when consumers only use AMD as a means to buy cheaper Intel and Nvidia products. Margins in enterprise are way higher, and AMD excels at HPC and enterprise partnerships.
Dr. Lisa Su is such a good and comfortable presenter, who clearly actually understands what the heck she is saying. Way better than most of the folks up there on the stage with her, some of whom clearly had no idea what the tech jargon they were spouting meant. Kudos to her for her great leadership at AMD.
This is the prediction ChatGPT gave me for Lisa Su's speech:

Hello everyone, it is my great pleasure to be here with you today to discuss the exciting developments happening at AMD. As you all know, our company has a long history of pushing the boundaries of technology and driving innovation in the industry. And I'm happy to say that we continue to do so today.

Over the past year, we have made significant progress in a number of areas. One of the biggest areas of focus for us has been the field of artificial intelligence and machine learning. We have made significant investments in this area and have seen tremendous results. Our new line of processors, designed specifically for AI and machine learning applications, has received rave reviews and is being used by some of the most advanced companies and research institutions in the world.

But our focus on AI and machine learning is just one part of the story. We have also made significant strides in the field of graphics and gaming. Our latest graphics cards have set new standards for performance and are being used by gamers and content creators around the world. In addition to these areas, we have also made significant investments in the field of data centers and cloud computing. Our new server processors are helping companies of all sizes handle the increasing demands of data-intensive workloads.

As we look to the future, we remain committed to driving innovation and pushing the boundaries of what is possible with technology. We are constantly seeking out new partnerships and opportunities to collaborate with other industry leaders. We believe that by working together, we can achieve even greater things and make a real difference in the world.

Thank you all for being here today and for your continued support of AMD. I look forward to the exciting developments that are sure to come in the years ahead.
More people should be talking about MI300: a server SoC with CPU, GPU and AI-dedicated hardware all in the same package. That seems pretty huge, and something that neither Intel nor Nvidia can pull off right now.
@@ThePotato_ The DGX doesn't have Nvidia CPUs. They use CPUs from AMD and Intel in it. Intel does have plans for products similar to MI300 that they want to unite through oneAPI, but they are not quite there yet. I expect them to release a competing product in 2024. Their massive investment in chiplets, shown with Intel MAX, is an obvious hint that they are moving towards heterogeneous compute modules.
@@LeonardTavast Also, Nvidia has not only the Grace chip and the Hopper chip, but the Grace Hopper superchip as well, which combines regular CPU power plus ML power. I'd say that's along the lines of an MI300, but I'm not sure which one is more powerful. Honestly, I'd love to have a Ryzen 9 7950X3D workstation with enough PCIe slots to contain an MI300, a Grace Hopper, a future Intel part with similar capability (and, for good measure, an Alveo V70). Then I could benchmark 'em all and figure out what mix of different vendors' technologies is a good fit for which types of applications in AI and HPC.
AMD's support for AI outside of huge servers is lacking. People doing varied/new AI work are better off on Nvidia. Sometimes it doesn't matter who is faster/better; it's a painfully slow process to change the trend.
Keep in mind AMD isn't just all about PC gaming and CPUs; they also have a huge presence in the enterprise/data center industry. Some of the stuff they might be planning would come at later press conferences this year, one at Computex and one in Q3/Q4. I think we'll see FSR 3.0 and a possible Ryzen 8000 teaser later this year.
Yeah, consumer CPUs are really just an afterthought. Youtubers will say "AMD is in trouble" when really AMD is like a whole generation ahead of Intel in the data centre, and Intel won't have a chance to come back until 2025.
I'm really happy you will be lowering the TDP in the 3D cache series and have started to follow the efficiency path again; it was, after all, one of the main selling points for me: being more efficient than the competition. Pursuing the biggest number in an exponentially increasing problem is not a thing for me. I'm sold, I'm getting the 7900X3D. Congrats and happy new year to you all
Not really. It SOUNDED great, but they did not SHOW us anything except a Cinebench test. The closest we got was the Dragonfly laptop, but they did not even bring it out. None of the awesome tech discussed was demonstrated. AMD has had better presentations before. After the RDNA 3 launch event, I just don’t respect the WORDS in AMD presentations anymore. And, the lack of SHOWING in this presentation makes me question how much AMD actually has here.
@@Austin1990 These are generally for investors and content creators, but are shown to the public. Most of the information you'd like to see presented has already been shown to those that this is geared towards. It's broadcasted to the public for, well, publicity.
I'm most interested in how the RDNA3 iGPU performs relative to the current RDNA2 680m in the 6800/6900 APUs. If we can finally hit 1650/1650 Super performance without a dedicated GPU I will absolutely buy a laptop.
One thing you'll notice is that the 7000 improvements are focused entirely on the CPU-side, as they increased the CPU cache. I think we'll get a GPU improvement on the Ryzen 8000 Strix APUs. I expect around 10-25% improvement with the extra cache.
Hey @@geeshta you got that half-right. I went back and checked my notes, which (after some squinting at the video) proved correct. True, at 43:15 we see that the HX suffix chips meant for the beefy (about $1500 list or greater) gaming laptops will only have RDNA 2 iGPUs. However, note at 23:40 where Lisa talks about the 7040 series CPU chips. These will have RDNA 3 iGPUs (plus the built-in AI engine). She didn't explain this or even really note it much if any.

But I think I can guess why. In the beefy big-buck gaming laptops, virtually all models will incorporate a discrete GPU as well, either an Nvidia one or an AMD one (in AMD Advantage gaming laptops, for instance). It's a waste of chip real estate to put an RDNA 3 iGPU on these. At most they will route the dGPU output via these to the laptop screen (unless using an external monitor or an engaged MUX switch). Why make the customer pay for half of an RDNA 3 that they won't really use? Instead, let them put their money into the dGPU.

But on the cheaper laptops where the buyer has no option for a dGPU, the buyer is going to say "heck yeah I want an RDNA 3 iGPU!" Even if it's a chopped-down one, it just has to be better than an Intel Xe graphics iGPU. I think this is a wickedly smart move by AMD.
Great presentation, super well executed CES event, kudos to AMD for their innovation and always trying to go further. I am very excited by their server and industry progress and integration into so many different aspects of our lives. I am also super excited about their X3D offerings; hopefully I will be able to replace my Zen 3 5900X platform with a Zen 4 7000X3D, it looks to be a monster of a choice right now.
Awesome presentation, but I think AMD should also focus on RISC-V computing, as that is a far more obvious future than quantum computing. x86 will get phased out eventually and ARM is already a crowded market. Given that RISC-V is significantly better than ARM and a lot better than x86, I would have loved some news that they were working on it.
@@canberrano1widowmaker He's saying it's very unlikely AMD will outperform Intel, as is typically the case for AMD vs Intel. Personally I prefer Intel because their CPUs perform significantly better, especially in games.
AMD finally found their way back to their original name: being a leading company in advanced micro devices, with their aim at healthcare systems. Or should we call it advanced nano devices today? AND any objections? Love what you do for our species, even though you always need to satisfy your stakeholders. Just keep Jerry Sanders' motto alive!
@@DragonOfTheMortalKombat I'm a midrange GPU user. I don't care about the high end, they're OP; that's why I'm targeting the RX 7800 series. And if you're telling me to check Nvidia's "4080 12GB," nope.
@@DragonOfTheMortalKombat you're crazy if you think RX 7000 GPUs are finished. AMD can give Nvidia a real headache if they announce the 7800 XT for $699, but they missed the opportunity.
@@DragonOfTheMortalKombat Missed opportunity at launch? yes. However, prices can be dropped by either Nvidia or AMD. The question is - who will go first?
@@hknp the company that is making technological progress in every aspect of human development has risen to the top since Lisa took the helm. i don't care if she's putting the milk in first and then the cereal. it's kinda because of her. so appreciate her for that. dumbass.
Exactly. I hated hearing that. I don't want Windows to be reinvented, I just want it to work the way I'm used to and not drain performance. That's why I'm not moving to Win11.
@@Scisca1a2a And that's why you should know that change is inevitable, and eventually newer hardware won't even be supported on older operating systems.
I have to admit that the latest AMD mobile processor, the Ryzen 9 7945HX, is a true masterpiece. If it is combined with 256 GB of RAM, it will be magic. I will wait until this dream comes true and will buy one beautiful laptop like that.
25:39 I am confused: she says "Apple's best" but the screen graphic shows the M1 Pro... which I'm assuming is an M1 MacBook Pro? And not the M1 Max chip... which is kind of misleading... isn't the M2 already out as well?
As CTO of a telecom manufacturer in 2000, we designed a unique chip yet to be beaten, since the edge is where it's at: a home gateway based on what we're bringing to the forefront, a powerful edge and in-transport compute that reduces the computational significance of the cloud. We use AMD at some points, but AWS, Azure and the rest will be simply commodities with marginal use in the new distributed, personalized experience.
If AMD adds a simplified processor, like a microcontroller, that purely handles I/O for peripherals like USB, SATA, video-out backends, etc. to their chiplet library, motherboards would essentially be chip sockets and simplified peripheral interfaces. Nice :)
As far as I know, there is no competition. I hardly see any tech discussed elsewhere; it's only in AMD and Nvidia videos that real tech talks happen. The others put out some crap that only they can understand. Not bothered.
Are there some mistakes in the 3D V-Cache slides? Have you understated the cache sizes for the higher core count models? Assuming the stacked cache is the same size as the 5800X3D's stacked cache (64MB), and assuming the CPU chiplet caches are the same as on the non-3D models (1MB L2 per core and 32MB L3 per chiplet): the 7800X3D's total cache is all of those combined. But for the 7900X3D and 7950X3D, the numbers only match stacked cache + L2 cache and don't include the CPU-chiplet L3 cache. Personally, these don't matter, as I cannot afford the upgrade; my 3900X serves me well enough. The 7950X would be the minimum reasonable upgrade from it, and the total cost of the upgrade is out of my reach currently.
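For anyone checking the arithmetic in that comment, here's a quick sketch of the totals under the commenter's assumptions (64MB stacked die, 1MB L2 per core, 32MB L3 per chiplet). These are the comment's assumptions, not confirmed official specs:

```python
# Total cache math under the assumptions stated in the comment above:
#   stacked 3D V-Cache die = 64 MB (same as the 5800X3D's)
#   L2 = 1 MB per core, L3 = 32 MB per CPU chiplet (as on non-3D Zen 4)

def total_cache_mb(cores: int, chiplets: int, stacked_mb: int = 64) -> int:
    """Sum L2 + chiplet L3 + stacked V-Cache, all in MB."""
    l2 = cores * 1         # 1 MB L2 per core
    l3 = chiplets * 32     # 32 MB L3 per chiplet
    return l2 + l3 + stacked_mb

# 7800X3D: 8 cores on 1 chiplet  -> 8 + 32 + 64  = 104 MB
print(total_cache_mb(8, 1))
# 7950X3D: 16 cores on 2 chiplets -> 16 + 64 + 64 = 144 MB
print(total_cache_mb(16, 2))
```

If the slide numbers for the 12- and 16-core parts differ from these totals, that would explain the commenter's confusion about which caches were being counted.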
There is only one 3D stacked cache die in the 7900X3D and 7950X3D, plus a regular die; that is why the cache numbers look weird. Sorry to see AMD is gimping their top offering again. They really don't like winning, do they.
@@cajampa That just makes operating system support critical for performance, and it limits its usefulness in scenarios where you benefit from more than 8 cores, since the latency to the other chiplet is similar to main memory latency. On the other hand, it could potentially keep the gaming benefits in systems that are also used for higher core-count loads, without having to pay double the 3D V-Cache premium. But it would mean the 12-core model has 6 cores that are fast for gaming and 6 cores that are slower for gaming, which could be a real issue. Maybe they really tested things and found that the latency between chiplets is such an issue that 3D V-Cache doesn't help if the threads accessing the data are not guaranteed to be on the same chiplet. In that case, they made a good call: force people to upgrade their operating system to get any real benefit, and limit the price increase for the higher core-count models.
You couldn't be bothered to trim off the first 13 minutes of "starting soon" ???? 13:15 Or even better, the part where Dr. Su actually starts speaking 19:18
Really fascinating to see these advances in datacenter and other enterprise applications. Also really excited about the new 3d vcache chips, now I can finally go full top of the line AMD with the 7950x3D and the RX7900 XTX 😍 Also thanks a lot for the awesome Linux drivers, I wanted to switch to an AMD GPU for over a year because Nvidia is too stupid to write drivers correctly (there were some hilariously stupid mistakes in it)
Did Microsoft just leak Windows 12 at AMD's CES 2023 keynote? The next OS will borrow from Chrome OS and be a hybrid of local and cloud computing. AI will be deeply integrated into Windows, and having both local and cloud AI processing is important.
I love how AMD refuses to adopt the big.LITTLE BS, especially in the desktop lineup. Windows and VMware Workstation still fail to understand every aspect of these heterogeneous architectures. I mean, think about it: whose job is it to select between a P-core and an E-core on a system where the OS is running in Ring 0 and the virtualization layer is commanded to use every core of the host in its VMs? It's a nightmare, and it means my next system will be, again, an AMD one!
Let's face it, folks: Apple went to heterogeneous big.LITTLE BS because it already had that in its phone chips (where it makes sense), it's macOS so they don't have to be Windows PC compatible, and it's a way to advertise more cores. So it kinda makes sense for them.

Intel went to heterogeneous big.LITTLE BS because it fell way behind AMD in raw technology, and the only way for it to stay in the game until it kinda caught up a bit was to be able to advertise more cores, taking advantage of the fact that the average customer doesn't know the difference between a big.LITTLE and a hole in the ground. And they knew they could beat Microsoft into submission such as to pump out Windows 11, which kinda/sorta makes a decision as to whether to run an app on a big core or a LITTLE core. Seems like the only reason to go to Win11 right now is if you have a big.LITTLE Intel CPU. Intel is selling you three-years-ago's stale leftover cores just to try to keep you in the Intel camp.

For what I primarily do, I want a homogeneous big.big architecture. Sadly, AMD will probably be pushed to a big.LITTLE architecture, and in their defense, for some application mixes it does make sense if you can figure out how to do it right. But my guess is that Apple does it better than Windows 11 does. Also, to the extent that big.LITTLE might sometimes be useful, shouldn't we also eventually want a verybig.kindaBig.SortaMiddling.prettySMALL.VERYLITTLE heterogeneous architecture?

Also, I wish Intel would come out with a LITTLE.LITTLE cheap CPU with truckloads of cores. AMD will probably do so with its Zen 4c architecture, or at least I hope it will, so why shouldn't Intel do the same?
Wow. I kind of missed an announcement regarding 3D V-Cache for the Threadripper lineup. I was hoping 3D V-Cache would come to their top-tier "Threadripper" CPUs as well. Sure, the top model 5995WX with 64 cores was introduced last time, and maybe even a 96-core CPU, but sadly I can't find it on Amazon even now. Sure, new markets were introduced which are going to change the world and surely are more important, but what about regular customers who want the best? Ryzen 7 and Ryzen 9 are great CPUs, but what about those who want 64 cores or even more? I really hoped for the next Threadripper with, what I wished for would be, 128 cores, 5+ GHz and a 3D V-Cache chiplet design. Maybe there will be a separate event covering those kinds of CPUs. Still hoping! As for the coverage of scientific products for businesses, this video is really great and covers new technology! Gaming? For the average consumer, yes, perfect. But for the high-end enthusiast: "still waiting for the next Threadripper generation announcement." ChatGPT: It's the hype of late 2022 and early 2023! I didn't know it took that much processing power to establish such a system. I thought a small group had developed it, and now I realize how much processing power is needed for such an advanced system. Great job AMD! Future technology: great insight into what's currently being developed. An awkward moment was 31:30; that man was out of speech. But all in all it was a great presentation, just missing out on the Threadripper CPUs, unfortunately. I wish this or next year is going to be the year of the first 128-core CPU, probably a 3D chiplet design as well. My next gaming and VM machine is going to have a Threadripper CPU.
I feel like there was a little too much "telling us how fast the new CPUs are" and a little too little "showing us how fast the new CPUs are". 25:28 was great. More of this, with more CPUs (both cheap ones and expensive ones).
7900 XTX vapor chamber the only problem? Maybe not; a lot of people have managed to solve the problem by changing the cable. With HDMI or a good-quality DisplayPort cable, no problem, but the GPU still sits around 90 degrees.
Really good keynote, and excited for the new tech. Although the aerospace part was cringe, apart from that it was almost worth waking up at 3:30. It would have been 100% worth it IF we'd gotten X3D prices.
Well, Lisa Su said it would be below competitors' prices, but yes, it was indeed vague. Yeah, the last part with the aerospace lady was very weird. All I heard was, "bla bla bla first person of color on the moon, bla bla bla if women had walked on the moon 60 years ago, everyone would have a different horizon today..." whatever that means, and completely irrelevant to this keynote.
Thats great to know. Wonder if other companies can ever dream something like that because M is dedicated to madhavi, so they can neither go to moon nor Mars. That's why to feel a sense of achievement , people are made to cross, roam around madhavi , thats the best thing they can do
No worries about the competition, they are stuck with some person anyway. I just saw some disgusting comments in their videos; their founders must be crying sitting in hell or heaven.
This was so interesting! I wish we could have SEEN more of it. From the Dragonfly laptop to the AR in medical, I would have loved to see a demonstration of some of the technology instead of just hearing about it. We only saw a Cinebench run.
I feel they are flooding the market with CPUs: they just released the 7000 series and now they're doing the 3D. Let's be realistic, the 3D series is going to sell like hotcakes and leave the rest of the lineup in the dust. I believe it's coming out too soon, but it has to come out because the 7000 series is selling so poorly. The 3D is finally going to allow people to make the jump, or want to jump, to AM5. It's being held back by DDR5 memory, which is still a bit too expensive but finally dropping, and motherboard prices need to drop too. I understand the motherboards and the RAM have nothing to do with AMD, but I believe that is what is stopping people from upgrading, because you're asking people to jump ship for only a slight increase in speed. That was until the 3D. It will definitely be the best-selling chip, catering to both sides: gamers and creators. All hail V-Cache 👍
In the next decade or more, CPUs will be insanely fast and hot! My Ryzen 9 7900X already runs at 95°C whenever I use it for work. This trend can't continue forever; you can't make a CPU run at 150°C and call it normal. AMD should seriously consider a direct-die cooling package for their CPUs. It's like a regular CPU but with the die exposed; the IHS only needs to cover the electronics and PCB outside the main silicon die.
@@TheHighborn and it also consumes power like a GPU. Yes, I know that, and it can't continue like that. AMD should think about better cooling solutions and better efficiency, or make liquid cooling mandatory in 10 years.
@@kaptenhiu5623 I'm not sure what your CPU is ('cos you wrote 9700, I assume you're not on Bulldozer), but if you have a high-end chip, you need high-end cooling. That's how things have always worked.
That 52 billion kWh of electricity savings really didn't compute with the audience... nor with many of the comments below. That is a huge amount, especially considering energy costs, which are rising drastically while fossil fuel companies make extortionate profits. We need a lot more energy-efficient components and devices with massively longer recharge-to-recharge times, as that ultimately reduces the ecosystem cost of electricity generation. Not to mention longer recharge intervals mean longer-lasting batteries, as their recharge lifecycle takes longer to exhaust, thus reducing e-waste as well as waste of critical rare earth materials and elements. Edit: Wait, wait, wait.... whaaaaaaaat, 3D stacking of CPUs and GPUs?!?! 😱🤯😱🤯😱🤯 and 128GB HBM3!? 😱
7600M is all you could muster? I mean, go smell the roses or something. I’ve been waiting for a proper successor to the 6700M, and all we get is more 8 GB of VRAM machines that had BETTER sell for dirt. I can’t believe N32 wasn’t announced, and hope that it will be very soon with very AFFORDABLE pricing to make up for the past month’s blunders.
Uhh... why in the f**k would they launch Navi 32 right now when Navi 31 still has a metric F**K-TON of problems??? 🤨 They need to ACTUALLY FIX their chiplet architecture as it currently exists before rolling out even MORE chiplet GPU products! Navi 31's performance is ALLLLLLL over the place, and the drivers still have a bunch of stuff that's broken or at least not working ideally. They NEED a better review cycle than the last one... Aka, it makes WAAAAAAY more sense to push Navi 32 back until it's ACTUALLY ready, and launch the 6nm monolithic parts (Navi 33 & eventually 34) right now/in the very near future.
AMD is beating Intel in CPUs in terms of value, efficiency and technological advancement, while also beating Nvidia in delivering good-value GPUs that don't disappoint. Team red forever