The USA and Canada are not the whole world. In Europe, prices are 25-50% higher; this CPU can easily cost $3000 here because of taxes and retailer margins.
Wait a second, Canada doesn't cover the whole world? That's too damn bad because the world would be a much better place if it did... Legal weed for everyone!
pumpuppthevolume it's because their price is still affordable for high-end PC users. But this? $1999 for an 18C/36T processor? It's way too much for them to handle; not everyone can buy this thing so easily.
The upshot is the new chip is TWICE THE PRICE for a 10% performance improvement over Threadripper. That doesn't make sense for anybody. If you were running a render farm, you could spend the same money and buy twice as many Threadripper systems.
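For what it's worth, the back-of-the-envelope value math here is easy to check. A quick sketch in Python, using the thread's own figures (the ~10% uplift and the $999 vs $1999 prices are assumptions from these comments, not measured benchmarks):

```python
# Rough perf-per-dollar comparison using the comment's numbers:
# i9-7980XE ~= 1.10x a Threadripper 1950X's performance at ~2x the price.
tr_price, tr_perf = 999.0, 1.00    # 1950X as the performance baseline
i9_price, i9_perf = 1999.0, 1.10   # ~10% faster, twice the cost

tr_value = tr_perf / tr_price      # performance per dollar
i9_value = i9_perf / i9_price

# The render-farm angle: two 1950X boxes for the price of one i9 CPU
# (ignoring the cost of the second motherboard, RAM, PSU, etc.)
farm_ratio = (2 * tr_perf) / i9_perf

print(f"1950X perf/$: {tr_value:.6f}")
print(f"7980XE perf/$: {i9_value:.6f}")
print(f"Two 1950X boxes vs one i9 box: {farm_ratio:.2f}x throughput")
```

Under those assumptions the 1950X delivers nearly twice the performance per dollar, and the two-box farm does roughly 1.8x the work, which is the comment's point.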
JayzTwoCents got it to 5GHz on water, probably with a delid as well, but it depends on the radiator you use. I'd use a separate loop for each component, so the CPU can have its own pump and radiator, like a 560mm Monsta rad, for better cooling. SLI is dead anyway, so might as well overclock the CPU to its max potential with a rad that can handle 300 watts of heat output.
When you don't have complete control of the PC CPU market anymore, yet still think a $1000+ chip will be bought even when there's a roughly equal chip for a chunk less.
Actually, I am disappointed in many reviewers that they haven't noticed there is a cheaply priced 24-core at $1000, or a slightly more expensive 32-core EPYC CPU. Or maybe they just intentionally ignore that because it's server vs HEDT…
Icywolfe If you buy a 32 core CPU but don't know which OSes and applications will actually utilise it then you'd simply be stupid. FWIW, both Windows Server 2016 (or lower, eg 2012) and the just-announced Windows 10 Pro for Workstations support 32 cores.
I think it's understandable to assume that not everybody will have the very rare full-TR-IHS base cooler, and the circular one covers the quad dies at least. However, in the spirit of accurate testing, I think Linus should have used the Noctua NH-U14S air coolers instead, since Noctua makes both the normal one and the TR4 edition [TR4-SP3], which would make the experiment much fairer.
Exactly what I wanted to point out, which Linus didn't in his "unbiased" review. Not a single mention of that fact, just rambling about "more cores = more performance" (duh!) and i7. What about power draw, killer price/performance ratio of Threadripper, no ECC support for these pseudo-enterprise Intels...? Nope. Intel, Intel, Intel. At this point, I'm getting tired of Linus' bias against AMD, which is totally unjustified when it comes to CPUs.
"Not the best" doesn't really convey the fact that the 7980XE costs 100% more for only 10-15% more performance, does it? "Not the best" is more than an understatement; it's blatantly ignoring the horrible value of the new Intels.
It's funny how Intel has been optimizing power efficiency for years now. Me: I at least expect a really power-saving 18-core. AMD: First-gen TR is more efficient than Intel's. Me: What theee ff
The people they hire are deadbeats because working for LMG requires literally zero college. You get what you can take. Sometimes even homeless people.
@Jan Ryan and you think deadbeats and homeless people care about dental hygiene and would own an electric toothbrush? Get a clue, dude... some people are crazy about hygiene... my sister carries a toothbrush in her purse. It's probably so he can brush his teeth after lunch, like you're supposed to, though most people don't.
I suppose the sensible buyer would save money by going to AMD, getting near-identical performance, and buying a top-of-the-line GPU with the money saved by not choosing Intel. Though Intel is still king, AMD can't be ignored this time around.
moldoveanu8 Yeah, people often look at the performance of the CPU alone when buying. They could buy a significantly cheaper AMD instead of Intel while losing only a few frames per second, then use the money they saved on other components, like a GPU, that give way higher frame rates.
That is where I was going with that. I have nothing against Intel... I have an older Xeon for editing and an OC'd i5 for gaming. But I would love to save space by getting similar awesome performance for editing and games in one package with Ryzen or Threadripper, for a bargain price. Intel is just too expensive for top-tier gear every time; I can't afford that i9 for a bit more performance when AMD offers plenty at a bargain. And I do love an underdog, so next time I can support them, I will. Just not with their Vega GPU... that was a letdown lol.
Ryzen 7 1800x is not faster than an i7-7700k in most games lol But even if it was, you could improve your argument by using an overclocked 1700 instead, since 1700 OC and 1800x are almost identical in performance.
I don't know. Many workstation applications still greatly depend on single-thread performance. This is especially true for things like 3D animation and CAD. And, in many cases, using more than 12 cores would be very difficult. The i9-7900X seems like a good competitor to the Threadripper 1950X in this respect. But, since Threadripper supports ECC on supporting motherboards, like from ASRock, it is a true workstation platform where X299 is just a high end consumer platform. Who would build a +$3000 "workstation" that does not even have ECC memory?
Redhead Knight If you overclocked a Xeon x5675 on the old lga 1366 platform high enough you can beat an r5-1600 while still being able to use ECC, and you gain a third channel of ddr3 ram so it'll actually beat a dual-channel ddr4 setup as well, something to think about and cheaper than threadripper/Ryzen/x299 overall...
David Johnson Overclocking, especially such old parts, defeats the stability and reliability aspects of a _workstation._ Then, there is the issue of I/O: USB 2.0 and PCIe 2.0. Lastly, where are you getting that it beats a Ryzen 5 1600? With both at stock, the Ryzen 5 1600 is much faster. When both are OC'ed, the Ryzen is still faster. cpu.userbenchmark.com/Compare/Intel-Xeon-X5650-vs-AMD-Ryzen-5-1600/m355vs3919 Moreover, the Ryzen 5 1600 can run ECC RAM as well on a board like the ASRock X370 Taichi. So, a Ryzen workstation does not require a Ryzen Threadripper. It depends how much power you need. Of course, such a Ryzen system would be much more expensive than an old X58 system, especially with ECC DDR3 so cheap. But, a Ryzen workstation is new hardware, has modern I/O, and doesn't need overclocking for good performance.
Austin P You do realize that the 5820k is capable of beating the r5-1600 when overclocked, and the x5675@4.7 GHz will match the 5820k stock which isn't far off from an r5-1600 stock. Add to that between the right cooling and lapping process you can sustain reliable operation even while overclocking as evidenced by the low temps the LGA 1366 Xeons can maintain particularly the 95 watt parts. Furthermore it's reasonable to conclude since both the x5675 and likewise 5820k both overclock far higher than Ryzen as a whole can...well you can see how the r5-1600 is at a severe disadvantage to the old lga 1366 Xeon. The third memory channel that LGA 1366 provides also hurts the case for the "new" Ryzen that only has dual-channel. Honestly the only viable upgrade is threadripper which truly shines even against Intel's current best offerings. For context my x5675@4.17 GHz scored a passmark of 12,752 which beats an i7-6700k, and at 4.5 GHz my old xeon is scoring well over 13,500 putting it squarely in the r5-1600 score range. The fact I still have additional overclocking headroom as I can still safely get up to 5 GHz means that there's no way even a 4.2 GHz overclocked r5-1600 can surpass my 7 year old 6-core Xeon...
Me too. Mine has been under water for years. Along with my r9 290x (but that's getting a little limiting for gaming now). Lately, I haven't been gaming much, but have been using the machine for software development. Compiles and runs unit tests in a fraction of the time my co-worker's ultrabooks and Yogas take. And doesn't overheat like their machines while driving 3 monitors.
anghellic Correct me if I'm wrong, but Intel are losing now, here, today, in their HEDT lineup, for the reasons stated above. That's before EPYC in the server markets as well. Intel are only just holding on to the single-threaded performance lead. How long do you think they will hold onto that?
Batman It is overkill for gaming builds, but its features allow a Threadripper build to be a server, including ECC RAM usually reserved for Intel Xeon builds. If not a server, it can handle production or multi-threaded simulations much better. You'd still need a workstation graphics card if you want precise CAD work.
Also worth noting that Threadripper still offers 20 more PCIe lanes over the Core i9, so, if you need expandability over pure performance it's still a better deal (and with that sweet price)
So in other words, guys, the Threadripper 1950X is still the best high-end CPU to buy atm. At $999 vs the $2000 CPU here, Threadripper is better value and keeps up very well given the price gap. You could buy a 1950X, a 1080 Ti, and be halfway to a ROG mobo....... seems like a no-brainer.
Noor Sholichin 6th gen i7 with RX 480 before the miner shortage. My build is practically an odd couple. My wishlist build is AMD TR with NVIDIA GTX, making it a yellow team build.
It sometimes happens when you overclock Intel's processors (Team Blue). They get too hot and burst, releasing blue CPU liquid. AMD's processor's liquid is Red or Orange (newer models). ( ͡° ͜ʖ ͡°)
Jinxi Cheng These aren't for people who care about price as much as we do. The people who buy the i9 are companies that trust the manufacturer over AMD (which has only just now become competitive) and are willing to pay that much for the increase.
CZ PC, those people who don't care about price are serious users who would also expect ECC memory and more PCIe lanes.... AMD has them, but Intel says nope..
CZ PC Companies probably wouldn't even buy X299 since it doesn't support ECC memory; I'm talking about big companies, not a personal content creator. Also, since the i9 costs twice as much, a company will probably figure it can get its work done faster with 10 PCs powered by Threadripper CPUs rather than 5 PCs powered by i9 CPUs.
Niko Nikolov That's just wrong for a content creator; you should use RAID for data redundancy without sacrificing storage. Unless your project files are trash, i.e. disposable.
I have a feeling some people are going to comment that a teraflop of computing power is barely anything because "GPUs have several teraflops of compute power!!111", despite knowing that modern GPUs have, at the very least, about 750 stream processors. (And that's just about the power of an RX 460, which has already been replaced.)
SlayerBRK Though Intel didn't back off from charging extra for full RAID functionality, or even let their high end get some ECC memory support for more sensitive tasks...
Threadripper is still a better buy. It uses less power and is insanely cheaper, which means you can invest money in other places. That power draw is going to cost you a lot in electricity. X299 is a waste of time.
So basically you're spending an extra $1200 for an approximately 8% speed increase over the i7-7700K before overclocking. Even for video editors that is not worth it, especially since in 4K video editing Adobe only gets 1% faster per core after 6 cores, and only 3% faster per core on 1080p video.
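As a rough illustration of the diminishing returns described above, here is a toy scaling model in Python (the "near-linear up to 6 cores, then +1% per core" behavior is the commenter's claim about Adobe at 4K, not measured data):

```python
def premiere_speedup_4k(cores: int) -> float:
    """Relative 4K render speed vs one core, per the comment's claim:
    near-linear scaling up to 6 cores, then only ~1% (of the 6-core
    baseline) per additional core."""
    if cores <= 6:
        return float(cores)
    return 6.0 + 0.01 * 6.0 * (cores - 6)

# How much does piling on cores help under this model?
for c in (6, 8, 10, 18):
    print(f"{c} cores -> {premiere_speedup_4k(c):.2f}x")
```

Under this model an 18-core chip renders only about 12% faster than a 6-core one, which is why the comment says the extra $1200 isn't worth it for editors.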
Preliminary tests show that it does, except Anandtech made the mistake of doing database testing with databases small enough to fit into the L3 cache of both the EPYC and Xeon CPUs.
The problem is that Epyc has lower clock speeds than you would want in a workstation. Workstation applications are still largely dependent on single-thread performance, like 3D animation and CAD. In addition, they need many cores for a few highly threaded tasks or for multitasking. These workloads sit between a consumer PC and a server. The question is how many cores could be utilized effectively. It depends upon the person, but I am guessing that loading more than 12 cores would be very rare. The i9-7900X may be a sweet spot for most workstations. Rendering workstations would likely benefit more from using Radeon Pro SSG or Radeon Pro WX 9100 graphics cards for rendering rather than the CPU, but this brings up the issue of limited I/O on X299. This makes Threadripper or lower core-count Epyc CPUs look tempting, but animators really want the extra single-thread performance. This is especially unfortunate for Epyc. Now, if you could overclock Epyc...
This is a HEDT platform comparison -- no crazy expensive ECC RAM necessary here. Different markets. That is an interesting point though -- AMD's *cheaper* but *faster server part can beat Intel's *pricier* *slower* high-end desktop part.
If you need one CPU with as many cores as possible, you could go for the 32-core EPYC. I think it's only 2 or 3 grand for the version that only supports one CPU per motherboard. Forget the slight loss in clock speed and slightly lower single-core performance: YOU HAVE 32 CORES. EPYC is epic.
Whenever a new CPU is released, Intel sends the processor to Linus in advance. BUT there is a period where they can't post any video about it or even talk about it, called the embargo period. That's why, when you watch some WAN Show episodes, specifically the one with JayzTwoCents where they talked about this, you'll hear them saying "We're just waiting for the embargo to be lifted."

When an embargo is lifted, Intel usually asks them to upload on a ludicrous deadline, like "it has to be uploaded before 6 AM the next day" or "it has to be uploaded by midnight." That way, when they release the product, everyone has access to a video as soon as possible. The video, like this one, would often contain everything consumers need to know.

Lastly, it's not only the ludicrous deadlines that Linus and other reviewers have hated about embargoes, as they mentioned on the WAN Show before. Some manufacturers also use the embargo to restrict your opinion about the product. It varies from one company to another, but (sadly) some use it to more or less force YouTubers to put out a controlled review. They don't want negative reviews, especially since embargoes are lifted the night before or on the morning of a product's release; the last thing a company wants is a bunch of negative reviews before the product is even out. Linus got mad before because, I think it was Intel? or AMD? who told everyone about a last-minute change to the embargo video. Of course, since it's a tightly scheduled video, editors like Taran had to stay late at the office just to update all the info while still making it on time.
What pisses me off are comments from people who are just fanboys, will never spend money on a HEDT CPU, and have complete ignorance of reality, making ridiculous comments. Intel has lost their mind with this pricing. No one will pay 80% to 100% more for the Intel 7960X and 7980XE in comparison to the AMD 1950X. Cost/performance is a critical factor in every business scenario; all the other things are bs. Even for rendering, it makes more sense to build a second 1950X system with cheaper parts for some extra cash, expanding to 32 cores and using it as a render node. I hope Intel changes their strategy, because with that pricing they will lose all the content creators.
If a hobbit is 3 and a half feet tall, a Pygmy is about 85% the size of a normal human and a premature human baby is around 80% of a full term baby, how big would a premature Pygmy hobbit baby be? I just need to know...
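Taking the comment's three figures at face value (and ignoring that a newborn is not adult-sized to begin with), the answer is a one-line multiplication:

```python
hobbit_height_ft = 3.5     # a hobbit is 3.5 feet tall (given)
pygmy_factor = 0.85        # ~85% of a normal human (given)
premature_factor = 0.80    # ~80% of a full-term baby (given)

baby_height_ft = hobbit_height_ft * pygmy_factor * premature_factor
print(f"{baby_height_ft:.2f} ft")  # 2.38 ft
```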
So the latest, strongest consumer graphics cards are about 6 times more powerful in raw compute than the latest, strongest consumer CPUs, with the major difference that graphics cards do vector computing while CPUs do scalar computing.
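A peak-FLOPs sketch shows where a ratio like that comes from. The unit counts, clocks, and FLOPs-per-cycle below are illustrative 2017-era assumptions (an 18-core AVX-512 CPU vs a ~3584-shader GPU), not official specs:

```python
def peak_tflops(units: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak FP32 throughput in TFLOPs."""
    return units * clock_ghz * flops_per_cycle / 1000.0

# CPU: 18 cores, ~1.7 GHz sustained all-core AVX-512 clock (assumed),
# 2 FMA units * 16 FP32 lanes * 2 ops (multiply+add) = 64 FLOPs/cycle.
cpu = peak_tflops(18, 1.7, 64)

# GPU: ~3584 shaders at ~1.6 GHz, 2 FLOPs/cycle each (FMA).
gpu = peak_tflops(3584, 1.6, 2)

print(f"CPU ~{cpu:.1f} TFLOPs, GPU ~{gpu:.1f} TFLOPs, ratio ~{gpu / cpu:.1f}x")
```

The scalar-vs-vector point is the key caveat: the GPU only hits that peak on wide, uniform workloads, while typical CPU code rarely uses the full vector width at all.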
Why did Linus say the money is not much of an issue, then talk about overclocking it to turn it into a beast? Overclocking is sometimes more hassle than it's worth. It's way overpriced for a tiny gain: not a 20% gain, more like 10-15%. I love Intel, but I gotta say AMD is doing better here, since they released theirs before Intel. I have a feeling AMD is holding back.
I do 3D animation, which involves rendering. I'd rather have two Threadripper 1950Xs for $2000 than one $2000 Core i9-7960X.......... Looking at Cinebench, things would be a lot better going that route. The price would be about the same for two systems versus a single i9 render box.
Frankly, there isn't much of an argument for these. If they were a few hundred more than Threadripper I could defend Intel, but this is just silly; you can't charge whatever you want when you have direct, serious competition. I guess if you just wanna throw twice as much money at a nowhere-near-proportional increase in performance, by all means, but it's not worth recommending. Perhaps next gen they won't be so damn lazy and will finally give us real effort. Let's be real though: most businesses DO care about the price; it's the techs buying the machines who don't. So these things will likely still sell, because the unfortunate business owner has a "tech guy" who doesn't give a crap about the business's bottom line.
Nicolas Madera They have two additional, theoretically functional 8-core dies, but those are most likely not even connected to the other dies or the mainboard. So Nuclear Cactus's idea is unlikely to the point it could be called impossible, but, from what is known today (as far as I know), it can't be completely ruled out as physically possible.
The technology isn't there yet. You can't get more than a few gigabytes of CPU cores at a time unless you download some extra ram which would take several years on a standard broadband dial up connection.
Slayer2316 Isn't Ryzen's IPC right on par with Broadwell-E? Granted, Broadwell could hit 4.3 on all 10 cores overclocked at best, drawing 240W. Threadripper is pretty damn close in single-core perf, considering two cores can boost to 4.2GHz with a 4GHz all-core boost, and it draws 268W at full load with 6 more cores.