I hope some shortsighted executive doesn't kill it because it hurts their bonus. Catching up to their competitors will take time, and it will take enough stakeholders inside Intel to keep it alive until it starts gaining some market share.
I love Linus and his team for striking a delicate balance between product review integrity and accepting sponsorship money, but honestly, as hard as they have been on Arc, this was LTT pulling their punches. The reason Alex would even spend days with Intel engineers debugging this IS BECAUSE of the boatload of money Intel spent on Extreme Tech Upgrades and on flying Linus to Israel to visit its latest chip fab. That kind of access will make you think twice before giving Arc the dunking it deserves. Let's put it another way: if Apple had botched the M1 launch to this degree, I'm sure the slam dunking on them would be waaaaaay more savage.
@@TimBoundy and give up on the YouTube ad revenue? Nah, I think they mostly hit the right balance. I just saw waaaaaay too many comments giving them too much credit for their (mostly honest) review and wanted to pump the brakes on the whole thing.
That was still a very forgiving review IMO. To put the Ryzen comparison in context: when AMD released Ryzen with all its issues, they were a company on the brink of bankruptcy. Intel, on the other hand, has tons of cash to burn. Heck, they could have sent units to beta testers all around the world to help identify the bugs so they could iron them out before release.
They have seeded units to less public figures, and they have their own QA; they have been aware that the situation is, to put it mildly, shit for at least half a year now. Testing was possibly pushed back by production issues and delays in silicon, and then you can't just grow a driver team overnight: there aren't hundreds of jobless driver specialists floating about, so it will take them time regardless of their financial might to shake this one out. There just isn't a way to rush things. But yeah, this will be a product line to avoid for who knows how long to come; could be months, could be years. Corporate bean counters will still end up buying them, though, when business divisions request a machine with a GPU. Intel has a sort of push there, connections; and the people who end up frustrated at their jobs because of them are not our problem.
Very interested to see how it performs on Linux. Unlike the sorry state of Intel/AMD windows drivers, the Mesa graphics stack was always very performant for me.
This somehow reminds me of the moment when I bought my current laptop. It has the 11th generation of Intel CPU and it was the first that featured the integrated Iris Xe Graphics. It was horrible. I couldn't play a video on yt for more than 30 minutes because it simply crashed every single time. Fortunately the problem is gone but I don't believe that things like that should happen
I got an HP laptop that could swap between the dGPU and integrated graphics back when that was a newish thing, and the feature was broken for like a full year because it was automatic and it didn't detect half my games even when I added them manually to the detection list. I sure did love playing Minecraft on super low settings on my expensive laptop.
Is it possible to get a #short of Anthony doing something with an Arc GPU in Linux? This may sound like absolute insanity, but Intel's drivers were usually quite solid in that space for a long time, and Linux may actually be the user friendly way to use an Arc GPU
If Intel doesn't improve on this shambolic showing, then there go the hopes of everyone who wanted more competition than the usual Green-Red duopoly.
This is so incredibly disappointing to see, but at the same time, I'm still hopeful. I desperately wanted ARC to be good to help break the duopoly. I'll give it time though, like you said, Ryzen needed time as well!
I don't expect much. Iris Xe drivers are STILL complete garbage, and so is their Command Center. They had years to fix this ahead of release and did nothing. And here the infamous Intel laziness can't be excused like it can on their CPUs, because they're not ahead in any way in GPUs.
It will, just a matter of time. Intel is too huge, too advanced, too vested to let it fail. It will compete with Nvidia and make AMD obsolete entirely.
Failing at early drivers is obvious; even AMD not so long ago had stability issues. Now everyone who bought one of their RDNA 1 or 2 GPUs has way more performance than when they got them!
Intel should perhaps have launched this 1st generation as a loss leader: make it the bang-for-buck choice even without the performance or stability, then move prices closer to parity when the next gen arrives and things are on a more solid footing.
The truly hilarious part is: had they said their first generation was aiming for stability and extremely low performance, as either a replacement for Iris or maybe something to augment really low-powered laptops for education/business these would have done so, so much better. All it would take is a little transparency
Just to double down and back that up: had they just been honest about this first launch card and given us realistic expectations for its performance, I bet there are enthusiasts who would've looked for a reason to buy one for their mother/father/sister/uncle/brother just to get their hands on it.
I'm not sure how much budget Intel has to take a risk like that, what with all the reinvestment into more production capacity and Intel only relatively recently losing its spot as the dominant CPU supplier.
The problem with that theory is that their integrated GPUs sucked so much and scaled so badly that they needed to rebuild the architecture from zero. If you look up the EU count increases over the years versus the performance you actually got out of them, you don't have to wonder why.
I'm glad they're failing. The biggest half of the duopoly in CPU space for the past 2-3 decades successfully entering GPU space is a loss for everyone in computing space. The answer to too little competition isn't to allow one of the biggest companies in computing to begin branching into another, related market, where they can start trying to crush more competition. All the big tech corporations should be broken up and forced to compete with each other.
They don't have the experience building actual powerful GPUs and their drivers, so they can't keep pace with AMD/nVidia, both in product quality and in iteration time. Maybe after a few iterations they will keep up, and I really hope they do because the GPU space needs some more competition.
Right now it seems they have more driver/software issues than hardware issues. Sure, their hardware isn't the best, but if this low-spec, unpolished Arc performed like this, the hardware isn't that bad. The higher models should give users a decent ability to play games and rock other tasks. Intel just needs to improve their drivers, fix the bugs, etc.
I don't think most people were asking Intel to produce a 3080 or 6800XT equivalent for the masses, but not even having a 3050 performance card working properly is underwhelming to say the least...
Windows rolling back drivers is not an Intel exclusive problem. For my AMD integrated GPU, it by default installs drivers from 09/2020. It's probably just Windows being Windows.
Seems to be a big issue with laptops. My Dell G5S is always trying to install old AMD GPU drivers via Dell Update and my Surface Book 3 is always trying to install old Intel & Nvidia drivers via Windows Update. I wish they'd just check the currently installed version before trying to install an 'update'.
I've had to keep Windows Update manually disabled forever for this exact reason. It literally installs drivers that completely disable Vulkan on my laptop.
Software hiccups were rather expected considering Intel's track record on GPU drivers. The price is a big issue, though. I understand they want to make the R&D money back, but pricing it way above the 3050 is insane.
@@randomnobody660 They mean the mobile version, some OEMs have released refreshes that replace the RTX 3050 with an Arc 370m at the same or higher prices rather than sell it as a cheaper option, though it is not clear if Intel is at fault for such moves.
I feel bad for the engineers here. It feels like they were pushed too hard by the marketing people to launch the product before it was ready. Don't forget: it's really hard to make a good GPU.
I'm impressed they were able to get this out the door. I'm still curious though, given Intel's been making integrated graphics for years, why was ARC such a rough launch? Seems like they should have been able to leverage their existing software architecture but, given how crotchety software development gets over years, it probably just wasn't easy to adapt for a new product. They should be able to sort it out eventually, but it'll probably be another year.
I don't think it was marketing's fault, since Intel's new CEO is ACTUALLY AN ENGINEER! He should have had total control over this situation and told the marketing team they're not releasing anything that isn't ready. I think the responsibility rests entirely on his shoulders here: he must have known it was riddled with problems, and he made the final call to green-light the launch anyway because he felt they had to for various reasons.
Although none of this is surprising, what they should've done first was reduce their prices across the board. If you need as many early adopters in the door as possible, you need to incentivize them by offering a price no one else can compete with. Done THIS way, every consumer will logically feel at a loss, because you offered less for a higher price, and they'll come away seeing it as an overpriced and underperforming purchase. It's even worse when the product provides such an inconsistent experience, because that removes any natural desire to compare it against the alternatives.
If they reduced the price now, more people would buy it, more would get frustrated, and they'd lose their love for Intel. So it's actually a smart move to keep the price high while they see how Arc performs and improve it in software; then they can reduce the price for the people who will really love it.
You are right, Intel should give this shyt away like they used to. They used to give Dell, Gateway, and HP their CPUs for free just so those OEMs wouldn't use AMD. They're going to need to give these away, plus about another 2 years of driver improvements...
@@Scolar69 It worked out fine for AMD. Ryzen started out quite buggy and finicky but cheaper than Intel and it ended up becoming quite successful. If people were going to lose their love for Intel over something they've done, there have been far better reasons in the past.
@@Okusar Ryzen was nowhere near this bad. There were a few BIOS flashes we had to do, and the rest was mostly the Windows scheduler needing fixes for Ryzen to work well once they put out a couple of BIOS upgrades. I was a day 1 Zen 1 adopter, and it was nothing like what's going on with Intel GPUs right now. The laptops are locking up and crashing in the middle of not even being used. There's a huge difference between Arc and Zen 1. I agree to disagree, no offense.
Honestly, they had to ship something; this is EXACTLY what I was expecting. That said, what matters to me is encoding and power efficiency, so I'm still super hopeful. Really, I think Intel is 2 or 3 years from being a true competitor to Nvidia in the media space.
I might be way off from what happens in reality, but to me, any outfit outside of AMD/Nvidia that brought anything to the table to 'compete' in terms of a GPU or on-die system wouldn't just have to match performance, it would have to be almost bug-free on consumer release... In general, anyone thinking of upgrading would automatically be worried about anything outside of AMD/Nvidia: will it be supported? Will it have compatible drivers? Will it compete? Honestly, if I looked at any 3 GPU products, say from AMD, Nvidia, and whatever third company, even if the third product was cheaper and immediately seemed to perform and hold up just as reliably in every way, the mere thought of it being a newcomer to the market would put me off, even at significantly cheaper prices. To conclude, I'm glad there's a new entry to the field. I'm also glad that it's Intel, because that means a company with the pedigree and finances to push on and support/update/upgrade the product, and to produce new models going forward. Whether or not Intel brings out a GPU that even matches Nvidia/AMD, just having a third hat in the ring will be good for consumers, and maybe one or more companies will then follow.
@@droplifter3435 Intel is very much a company used to operating on the strength of its brand. If you're thinking about the generally uninformed consumer, they wouldn't really think twice about an Intel logo on a GPU product. They would have to be told that its bad. It's been easy to acquire an AMD GPU for some time now, but they have been sitting on shelves. I've actually genuinely wanted an all AMD laptop, but they've been basically unavailable. So this Intel mobile dGPU will probably still sell plenty by default, but I don't expect them to be moving many desktop units anytime soon.
Top that off with the fact that when Ryzen first came out, it beat EVERYTHING Intel had in multicore performance. Imagine if Arc had come out and beaten everything in raw performance. It's really a poor comparison.
Honestly I have the A370M right now in my ASUS Zenbook that I just bought and I think it performs super well. No glitches or other problems described in this video. I guess things have been ironed out since.
So Intel messed up the software side of things and now they really have to get this back on track fast before everyone associates their GPUs with bad performance and constant crashes. Getting beat by their own iGPUs in gaming performance is also a big oof, even if the focus of this isn't gaming.
To be fair, it was a Ryzen iGPU, but that's still an oof. At this point, why bother buying Arc when you can skip the dGPU, spend more on a good Ryzen chip, and get more CPU performance with the same tier of GPU?
PSA: Windows disables/rolls back GPU drivers as a security feature (or something of the sort; big oof). You can disable this behavior by going to: Control Panel > System & Security > System > Advanced System Settings (right side tab) > Hardware > Device Installation Settings, then choosing "No (your device might not work as expected)." I NEVER recommend letting Windows Update handle specific device drivers; go to the manufacturer's support page for best stability.
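For anyone who prefers scripting it, the same Device Installation Settings toggle is backed by a registry value. This is a sketch only (assumes an elevated command prompt, and that your Windows build still honors this key; back up before editing the registry):

```shell
:: Registry-level equivalent of choosing "No (your device might not work
:: as expected)" in Device Installation Settings.
:: 0 = do NOT fetch driver "updates" from Windows Update, 1 = default.
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching" /v SearchOrderConfig /t REG_DWORD /d 0 /f

:: Verify the value was written:
reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching" /v SearchOrderConfig
```

After this, you install GPU drivers yourself from the vendor's site and Windows Update should stop rolling them back.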
I'm just excited for a second or third generation. Having less and less time to game, I'm mostly concerned about image and video rendering, and soon enough I may also do heavier compute stuff for my Physics degree
It just goes to show you how complicated all of this is. Intel's software team was already larger than the entirety of AMD as a company, and that was not only pre-Arc but pre-hybrid-core days. Intel is fighting on multiple new fronts here; it's not just GPUs, it's hybrid architectures, it's opening their platform for third parties to use their fabs. I'm impressed they got to a point where they could even ship something, because I don't think it is humanly possible to expand at the rate they need to to get this all done. It will take some time, but I believe Intel can pull this off.
Imagine how AMD felt after acquiring ATI. Suddenly you go from focusing solely on CPUs and now you've got GPUs to deal with. Then you realize, oh, shit, we provide GPUs for consoles too, so now we have to keep trying to maintain relevance in those markets. Then they end up making APUs for the PS4 and XBone, along with their own CPUs, APUs, and GPUs. The entire time that has been going on, they're struggling to reach performance parity with their competitors because they haven't really had that since the 4000 series GPUs and the Athlon 64 X2. Intel may be spread quite thin, but they're a big enough company to cope with that. AMD is markedly smaller and they still managed to surprise their competitors with their current and upcoming product lines to the point that Intel had to release a supremely well-binned SKU. nVidia still maintains dominance for the most part, but I'm guessing that they hadn't intended to release the RTX4000 series in quite the way that they will be. They had probably hoped to be able to hold onto the launch of their 4000 series for at least another two quarters, but AMD is pushing their schedule forward a bit. It's interesting to see the effect that AMD has on their competitors in the market, particularly given how comparatively small it is as a company.
@@srilemobitelsrile8809 of course it's smaller, but that doesn't make the software stack any simpler. The 3090 is much larger than a 3050, but it's still essentially the same driver and auxiliary software.
Intel has always had video driver issues, going back as far as I can remember. A fair number of us said as much when we learned Intel was trying this again. Drivers are what doomed them last time, and they'd better step it up or this will be another failure. Their integrated graphics drivers have always been merely adequate, so I'm not sure why anybody is surprised by this.
@@leovang3425 That is never the case unless the iGPU either just straight up doesn't have enough performance or doesn't have necessary features. I'm talking about games like Fallout 3 or the original Witcher. Both games will run on an iGPU, they just refuse to do so out of the box and require a bypass
@@CanIHasThisName I've had a similar one to LTT's: Windows 10 kept installing a non-working driver on a Windows Vista-era laptop. And yeah, I know we're going beyond "supported" age here, but the laptop runs fine, if only Windows Update wouldn't roll back the damn driver and make the laptop screen go black every time...
Their Linux drivers are much better, though. Not that most users are going to use Linux, but they already have better, more stable drivers there. Why not try to get that working on Windows?
i remember my first AMD (at that time it was still ATI) graphics card, it was a 4MB ATI 3D Rage Pro and it was the worst decision i ever made regarding pc hardware... literally *everyone* else i knew back then opted for the 4MB Matrox Mystique card and left me in the dust while paying less. ATI released a "turbo" driver *one* year later, that increased the speed about 30%. it was still far behind the mystique's capabilities but it showed how unoptimized this card really was at release.
I had mine much later, with good performance. I gave ATI a chance with the Radeon 9600XT. Performance was fine, but it was so buggy and unstable that I replaced it with a GeForce FX 5900 XT, which was more expensive, acted as a small space heater, *and* was slower.
The Rage drivers originally weren't even able to render all polygons in ZD WinBench 98, and the game compatibility was similarly spotty, and it dragged on for years upon years until they figured it out. It's an unfortunately aptly named lineup of cards, for the feeling you may experience towards them. We used them in a GPU distance throwing competition at an Nvidia sponsored event some years later. (The competition was not endorsed by Nvidia, they weren't even aware until some time after)
@@1pcfred Yeah. I was wondering about that. Also, doesn't he mean 4GB? If we're talking only 4MB of GPU RAM then these must be from 30+ years ago. EDIT: OK, I see now that the 4MB in the name isn't representative of the RAM. These actually have 512MB of RAM.
@@redpillsatori3020 I didn't get as far as 4 MB or 512 MB but even 512 MB would be extremely low VRAM today. I would think 4 GB would be a minimum amount anymore.
Intel XE driver issues were the warning shot. If anyone tried to game on XE, like I did on my Framework Laptop, you've likely experienced issues. If Intel wants to do high-end graphics, they need to step up their driver game IMMENSELY.
Absolutely. They have been neglecting their drivers for years; the new-architecture excuse doesn't stand up. Intel had more than enough time to work on their drivers. Xe is still pitiful after almost 2 years, yet they market it as if it's something totally working and revolutionary. It's absurd how their mobile Xe is so powerful on paper, yet it barely manages to hold its ground against Vega thanks to how messy their drivers are.
@@luisortega8085 Mesa is great; better performance in my test (modded MC 1.7.10). It performed better than my friend's 5700 XT (he was on Windows), 80-110 fps vs 50-70. We're still trying to figure it out.
The software isn't that new, they've been dealing with GPU drivers for decades. People forget that Intel is the market share leader in GPUs due to their iGPUs.
I thought years back that an engineer at Intel said that they didn't want to make dedicated GPUs because getting drivers right was a nightmare. Perhaps it was someone else saying something about it, I'm not able to find any articles, so it might be something that no one ever said and I'm just imagining things. Hopefully they're able to iron things out with their drivers somewhat quickly, but it's going to be a long time before they reach maturity.
Thanks for the encoder information, most gaming channels ignore encoding. I would love to see the Intel GPU with hardware encoding using FFmpeg since so much GUI software is literally just an interface to FFmpeg.
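Since so many GUI tools are FFmpeg front ends, a command-line comparison is the most direct test. A hedged sketch (assumes an FFmpeg build compiled with Intel Quick Sync Video support, and a driver stack that exposes the GPU to it; encoder names and presets may vary by build):

```shell
# Hardware-accelerated H.264 encode via Intel Quick Sync (QSV):
ffmpeg -hwaccel qsv -i input.mp4 -c:v h264_qsv -preset medium -b:v 6M output_hw.mp4

# Equivalent software (CPU) encode with libx264, for a quality-per-bitrate comparison:
ffmpeg -i input.mp4 -c:v libx264 -preset medium -b:v 6M output_sw.mp4

# Check which hardware encoders your build actually has:
ffmpeg -encoders 2>/dev/null | grep qsv
```

Encoding the same clip both ways at the same bitrate and comparing the results is a quick way to see the hardware-vs-software quality gap the replies below discuss.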
But on the flip side, all hardware encoders end up with inferior quality per bitrate compared to software, so they're only really useful for intermediate encodes, not final renders, if you care about quality.
@@alexatkin It annoys me to no end that that never gets mentioned. Sure, the quality is often good enough, but the quality IS still different, and that needs to be mentioned. YouTubers tend to talk about it like it's just the same encoding but faster because it's accelerated.
@@kendokaaa Also, as they mentioned in passing, it has limitations on resolutions and usually omits some specific features of the codec. That's especially relevant since the whole point of newer codecs is to save space, so why use the latest hardware codec if it's the same quality as well-done software-encoded H.264? (Just an example; I don't know if it's quite that dramatic.)
@@kushagraN Whether you use software encoding or hardware-accelerated encoding, your video quality will still be butchered if you upload it to YouTube, because YouTube re-encodes everything with VP9, AV1, and AVC (H.264). You can only tell the difference if you encode that video for yourself. Once you upload it to YouTube, the quality will degrade no matter how beautiful your original video was.
--- Video Idea --- Could we get an updated-for-2022 version of your "3D Modeling & Design - Do you REALLY need a Xeon and Quadro??" video? A cheap computer for 3D CAD modeling.
Damn, I was actually really looking forward to Intel’s GPUs. Hopefully they can step it up with future hardware. As Alex pointed out, we need more viable competition in the GPU space.
The surprising part of the news that Intel's GPU software/firmware has issues? That it actually ran as well as it did. No, seriously. As someone who wrote 3D visualization software for decades, the chaos of unreproducible, seemingly nonsense bug reports that in no way improved with code hardening came by far and away mostly from customers using Intel graphics hardware. The rest were Matrox graphics hardware in the early years, and VMs (VirtualBox especially) in the later years. Intel + graphics = Bad Things. And now they're making discrete GPUs out of that mess? They've failed for decades; why Git Gud now?
Honestly, I think Intel's plan to focus on other GPU-bound tasks than gaming is pretty smart, especially for mobile devices. So the Handbrake numbers give me at least a little bit of hope that Intel GPUs might really be a valid alternative in the future for video editing or rendering.
Excited about that AV1 hardware encoder. I hope it can do film grain synthesis, or at least allow tuning arnr strength and max frames. If it doesn't support HDR or 10-bit, it's useless garbage.
The fact that it is remotely capable of mobile gaming is already a decent sign of things to come. Now they just need to get their actual power figures to work out the way they intended them. A lot of people who want a thin&light laptop with dedicated graphics aren't necessarily planning to game heavily on it in the first place, but they do want it to offer them rock-steady video playback, OS UI graphics performance, and acceleration for any GPU-accelerated tasks that they might need to do on the go. The fact that it can game to any reasonably competent degree is a real bonus. Hopefully the higher tiered Arc GPUs will be comparable to, like, maybe a 4060? That'd be a good target for them. If they can get that kind of performance at a similar power draw in gaming, but beat it out in encoding and other GPU accelerated tasks then it stands a decent chance to claim relevance. Unfortunately for us impatient viewers, only time will tell.
Windows Update rolling back Intel GPU drivers is a stupid Windows Update bug that's been around for like 5 years already, and it's extremely infuriating. It does the same with integrated AMD drivers.
Re the 1st-gen Ryzen comparison, sure it was buggy and undercooked, but those older Ryzen chips were still astonishingly competitive, and completely smoked Intel's then current CPU lineup in multithreaded applications. The problem here is that, well, Arc just isn't competitive in any way, shape or form - the fact it was being beaten in multiple tests by AMD's *integrated* chips was damning. Intel need some kind of strong selling point, and currently they don't have one. I feel like this is going to go the way of Larrabee / Xeon Phi - Intel will realise they don't have a viable mass-market product, re-brand as a niche co-processor they can sell at high margins to specialist markets (AI acceleration, supercomputing, video encode/decode) and then quietly kill it off when their market segmentation bet doesn't pay off either.
Intel also badly needs more people to help find bugs in Arc. This is a new branch they're expanding into. I can only expect Intel to make massive fixes to all the bugs.
I don't buy the "can't test for all the bugs" aspect. This is Intel. This is a GPU that has obviously put productivity ahead of gaming, and they failed to test whether HANDBRAKE even ran? My takeaway: they have been chasing problems and creating new ones as they fix others, a loop of issues, so they just decided to send it and hope for the best. Heck, maybe the community will actually figure out some fixes for them. Regarding the Ryzen analogy... correct me if I'm wrong, but even though Ryzen had some bugs, it also offered much better bang for the buck than Intel's CPUs at the time. This Intel GPU seems to be worse, more expensive, more power hungry, and also buggy to the point of being unusable. Comparing it to Ryzen is almost an insult to Ryzen. I think Linus is trying to be as nice to Intel as possible here; they did just let him tour their Israel tech center labs. I'd rather you burn that bridge than play overly nice to such a large and profitable company that shouldn't be pushing basic R&D onto their customers.
I don't think giving Intel more shit would have made the video better. They're new to the market, and some problems are to be expected. I found it incredible that they outperformed NVIDIA and AMD, who have been doing this for decades, in some tasks. Most of the problems were in software and drivers. Just wait three weeks until Handbrake support lands in stable and the newest drivers ship via Windows Update. In this respect, I think the situation is even better than Ryzen's. It's a completely new kind of product for Intel, and it works great (when it's stable).
Personally I don't trust LTT's opinions on Intel. Data? Sure. The opinion pieces they always throw in at the end? Might as well just close the video there.
Intel, with their endless pockets, couldn't have dropped the ball harder with this release. It's going to leave a sour taste in many consumers' mouths, especially those who aren't as tech savvy as enthusiasts. I hope they can turn it around, because competition is what we need right now.
I was intrigued by Arc purely because of its existence; I was under no illusions as to its performance relative to Nvidia and AMD. Still, the fact that it underperforms to this extent is a surprise and a disappointment. Intel will likely keep plugging away at it, though; the opportunities are too tantalizing to ignore.
It's not a surprise. This card will have an MSRP of $150 and is marketed below the 3050 to which Linus compared it to. As he said, it's the "best" LOW end ARC card. The actual high end cards will be launched in 2-3 months.
At 9:24 you say, "So it's clear that Intel needs more time..." A year has passed. How about a fresh look? The latest Intel Arc software update seems to have made real progress, so I would like to hear what you have to say about the current situation.
In terms of CPUs, what REALLY gets me is not speed or cores but ever-decreasing power needs. I always imagine an entire university or company full of computers that combined use fewer watts than a normal home user's PC from 2010 💪🙏
People need to stop the whole "Early Adopter" trend until companies go back to *AKCHUALLEEE BETA TESTING* all their hardware and software. Shame some people (looking at you, gaming industry) shoved that pay-to-beta model out the door and put us in this situation in the first place. Edit: I was a QA tester at Dell for ~ 8 yrs, so yes I know how it goes. The stuff Linus mentioned should have never been let out the door.
Any updates on whether the bugs/issues are improved on the latest HP Spectre x360 models with 13th gen i7-1360P CPU and Arc A370M? Or still recommending to avoid?
Fuck... I'm not trying to game, I just want a two-in-one laptop/tablet thingy that's powerful enough to handle my creative needs (3D and whatnot). And the ones that initially caught my eye have this piece of garbage in them.
@@ICDedPeplArisen They got a free license way back when they reviewed the program (they tried to upscale a low-res video to 4K, IIRC), so future reviews should include it when relevant.
It'll get better, sure, but Intel has a habit of abandoning their iGPU drivers. Most are stuck with an obsolete OpenGL version, and many lack Vulkan support despite the hardware being capable enough, and similar enough to later units, that they could backport the changes. But they just don't. So I think everyone who buys one of these runs a high risk of being stuck with a bit of a nasty brick for good.
At least when Ryzen launched, they had better price-to-performance and more cores to offer. I don't think Intel can humble themselves enough to be compelling... too much ego.
Yep, I think this should have launched where the laptops are very clearly at max $1200. This is a budget offering in the current market no matter how much Intel wants to price it higher.
It seems history has repeated itself once again. I'm old enough to remember Intel's first foray into discrete graphics, the dumpster fire known as the i740. That one was so bad that it made Intel focus on their line of iGPUs. I really do hope that Intel succeeds this time in bringing a formidable third player to the table of gaming GPUs.
Good video, LMG crew. Despite delays on delays, Intel still hasn't fully ironed out the bugs in their software. But as was noted, when the software _does_ work, it works well. The hardware's there, it just needs better software. Calling it now: We're gonna be making FineWine™ memes about Intel in a year.
@Benjamin Oechsli You're sadly mistaken. Intel is going to have a rough few years, and that's par for the course. The industry ebbs and flows, with peaks and troughs in which smaller companies either get bought or go bankrupt. Intel is no small company, so they will be just fine, and some will say (myself included) that this is what happens when Intel stifles innovation for years and charges obscene prices. There is nothing like a healthy wallop of competition to bring prices down to a more affordable level. After all, years ago an 8-core, 16-thread CPU would have cost an insane amount of money and would have been a HEDT part on Intel's platform, since AMD couldn't muster any competition. I paid £193.97 in 2019 for my 2700X.

Lack of competition from AMD shouldn't have meant stifled innovation, minuscule improvements, and one generation of CPU support per socket. Intel could have stuck with one socket for many generations, but they chose to milk consumers instead. So yeah, for that Intel deserves to fall into a pit of broken drivers and be annihilated for a few years, as only then would they be humbled for a bit. The key phrase is 'a bit', as Intel, like any corporation, is profit-driven. Having said all that, Intel does need to provide competition in the GPU space, as both AMD and Nvidia are profit-driven entities too. So, in this mess, AMD is the lesser of three evils, as only they are currently pushing innovation. Personally, as a fan of tech, I will always support innovation and competition, especially if it means increasing performance per watt, as we need to be using less energy, not more.
Arguably the hardware isn't the buggy part here, so eventually even these garbage units will work, if Intel doesn't just stop updating them for no reason at all (they frequently do with their iGPUs). But who needs a semi-brick in the interim?
Wow, I've just realized that I'm the target audience for the higher tiers of this product, assuming that it'd work consistently on Linux. Keep the videos coming! If it's this dodgy on Windows, Linux will be the same or worse, but I want to see when the situation changes.
Since the 4000 series will probably be a huge disappointment due to the rumored unjustifiable power draw increase, I'm still rooting for Intel, and I don't think they should be in any hurry to try to get it out before Nvidia does. Get it right first, and when they do, flood the market with cards at a loss and they're in business.
@@AGuy-vq9qp Unfortunately, performance scaling is not linear with power draw. The gain going from, say, 100W to 200W is significant; doubling the allowed power draw in that range can nearly double performance. Don't expect the same going from 200W to 400W, or from 400W to 800W. Yes, you will gain performance, but in the end you have to ask yourself whether it's really worth it: few pros for ever-increasing cons. You end up with a power guzzler in your computer (potentially tripping your breakers every now and then), filling up quite an amount of space in the case (triple-slot cards, maybe even quad-slot?), and heating up your room.
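One toy way to see the diminishing returns described above is a saturating performance model. Both the curve shape and the constant `k` here are purely illustrative assumptions, not measured GPU data; real scaling depends on each part's voltage/frequency curve:

```python
# Illustrative saturating model: performance approaches a ceiling as watts grow.
# k = 150 is a made-up constant chosen only to show the trend.
def perf(watts: float, k: float = 150.0) -> float:
    """Relative performance under a toy diminishing-returns model."""
    return watts / (watts + k)

# Each doubling of power buys a smaller multiplicative gain than the last.
for lo, hi in [(100, 200), (200, 400), (400, 800)]:
    print(f"{lo}W -> {hi}W: x{perf(hi) / perf(lo):.2f}")
```

Under these assumed numbers, the gain per doubling shrinks from roughly 1.43x (100W to 200W) to about 1.16x (400W to 800W), which is the "ever-increasing cons for fewer pros" effect in miniature.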
Isn't it just better to be rooting for AMD? I mean, the 6000 series (especially the 6600) is great at performance per watt and has great overclock and undervolt headroom, and the 7000 series seems to be better at it by 50%.
@@xPandamon They probably allocated as many resources as the board of directors would allow. Since their dGPU business is unproven and not guaranteed to be successful, the board probably refused to give them the full support they needed to make this a smooth release. If they can iron out these issues soon and take even a couple of Nvidia's Linux/AI clients (the Linux drivers are in a far better state at this point), they'll be able to justify the funds they need to maintain a full driver stack properly.
Honestly, the video rendering numbers are pretty solid. If they can get this working as integrated graphics for thin-and-light stuff, with a CPU filling the 1165G7 role, that would be pretty sick for creators.
Luckily, driver issues are fixable after launch. As an entry-level mobile workstation card, the performance seems right on the money. The issues are the bugs, and maybe the price.
Last week I bought a Lenovo Yoga 7i for college and it has that very same chip. Was kinda disappointed at the heat it generates without even stressing it too much, and at the random lag in the system. If I recall correctly, the laptop is supposed to be optimized for deep learning and AI, but I doubt it will be able to do that, currently.
I feel like Intel could tackle bugs and PR at once if they seeded out PCs with their graphic cards to content creators to report bugs and put the hardware through as many paces as possible. Seems like a win-win.
Idk, Linus seems to point out all the bugs on his Framework laptop that he's invested in. I think creators could use it as an auxiliary production device, catalog the bugs, and show when Intel fixes them, or when they ignore them. Seems like a win-win, since Intel could use some consumer trust, especially for Arc, and seeing them fix bugs for creators would boost at least my opinion of them (creators are likely to run into more bugs than anyone). Maybe I'm alone lol
Who thought taking the team that tried to run Radeon graphics into the ground to build new GPUs would be a good idea? Raja is a flop, Intel Arc is a flop. D.O.A.
I'm glad Intel is diving into the GPU market. Even though this is disappointing, give them time, and let's hope they'll be able to compete with Nvidia and AMD at some point.
Hi! Any comments on how the Spectre x360 with Intel Arc is performing now, ~4 months later? Have the new drivers and updates from the last weeks/months improved anything?
But you know, some people do use computers for things other than gaming and video creation. A big part of making a product successful is building volume. It can be a great laptop for a lot of people even if it isn't good at specific things. For instance, if it supports GVT-g on Linux, then it might be the absolute best laptop on the market for some people, particularly people like me who have to run Windows in a VM from time to time. Having hardware-accelerated guest graphics is a huge thing on Windows. You'd pay thousands and thousands of dollars to get that from AMD or Nvidia, and you're not getting it in a laptop.
With any other company this would be expected with an entirely new line of products, but for Intel I hoped better. I hope it reaches beta before 5 and 7 launch. I'm still so excited for what it will be, though. I'll be following it, but I doubt I'd consider it before Druid
The big difference between Ryzen and this is that Intel has the money to hire more software devs to help solve these issues, and haven't done so. AMD didn't have the money to solve those firmware and memory issues quickly, at first, but they did have more people on the issues. Intel's driver team is woefully underfunded and understaffed, and management refuses to increase the budget or hiring to solve the issue. It's been this way for many years.
Some of these low-level challenges are literally one-man jobs, and hiring more people in those cases won't make the work faster or better. You just need a John Carmack type of big-brain dude for the task. It's very difficult stuff, and throwing more money at it can yield zero improvement in these cases. Two pilots don't make a plane fly 2x faster.
In some cases, Intel just has to open-source their drivers and software, and many of these issues would go away (simply thanks to community efforts). Their product support is beyond terrible. I've been forever waiting for any kind of GPU driver for the GMA 3600 to be released. The original drivers were only released for Win7 x86, and there is literally no way to port them or make them work anywhere else! Not to mention they are extremely unstable and cause consistent BSODs when performing specific tasks. These BSODs are not random; they happen at the exact same tasks and times. These are things that should have been solved and never were. No amount of complaining in their forums ever resulted in a solution; they simply abandon their drivers mid-development, resulting in exactly the user experience we saw in the video.
Intel’s Arc chip architecture definitely looks like an AI-accelerator first, 3D-renderer 2nd. So for gaming, it’ll eventually be passable, but for video editing, computer vision, procedural architecture, dynamic field sim in engineering, big data processing, … and about 100 field/use-case combinations, this may end up becoming the best bang for the buck, with big corps buying 1,000,000s of these.
Really disappointing... I'm looking at which Yoga 7i setup to buy, for audio and video editing rather than for games. There's a 14" or 16" 1260P with integrated graphics and a 16" 12500H with an A370M, all for the same price. I thought the latter was the perfect combination: on paper the 12500H should perform better with its higher power budget, and while I don't expect a lot of GPU boost, the additional 4GB of VRAM should help. Then again, the H-series chip probably runs too hot in a 2-in-1, and if it's throttled by thermals and GPU driver problems, I might as well stick with the more efficient package. 🤔
At the end of the day, will Intel put out updates that fix the Arc A370M? To put it another way, is my Asus Zenbook Flip Arc A370 2-in-1 a brick, or will updates help it in the future? Should I just return the damn thing?
@@chuckyfox9284 I highly doubt they started to design the M1 only when they announced their break off from Intel. It was likely a year or two before that. But yes, they are just built different.
When I hear that Intel is focusing on rendering, I get flashbacks to the recent video hosted by Anthony, where he praised Apple silicon for its rendering power (and forgot to mention that it sucked for any other 3D application, because it is basically a freaking ASIC). I think that is what Intel is trying to catch up to: make a Mac Studio/MacBook competitor for all those stupid YouTube hipsters... Judged by the architecture, this GPU is not for gaming! Just like Apple silicon.
Indeed my thoughts as well. The M1 is an encoding beast. On PC people buy 3090s for encoding video, which means that most of the GPU is just sitting and doing nothing. I found it strange that nobody is making encoding acceleration cards. I think Intel is making a smart move. Too bad their software department is dropping the ball.
@@Myvoetisseer Intel did something similar with their Xeon Phi CPUs (they came out of the same pool, so to speak, aka Intel's dropped Larrabee GPU project). But you are totally right. I think nobody except Apple made such a card (the one for the Mac Pro that put Final Cut on steroids) because the requirements of editing software are a mess and keep changing all the time. Premiere, for example, went from only supporting up to 6 cores to supporting many cores, and ended up favoring clock speeds... Compared to games, it's just a risky gamble to develop such a niche product. Apple, on the other hand, with its ASICs, acts more like an Xbox or PlayStation: a fixed hardware set that software can easily be optimized for. Always their strong point (right after stealing ideas and shady marketing). So you can't blame Intel for trying, but I'm not sure it will pay off, even if the driver team gets their sh*t together. On the other hand, Quick Sync IS a great feature for editing and such. I guess only time will tell.
I'm searching for a new laptop at the moment. I need it to be power efficient because I'll run it on solar power in my camper van. I want to edit videos on it and maybe play a game every once in a while. So this could maybe be the absolute perfect thing for me, right?
As someone who has daily-driven both AMD and Nvidia cards, I can honestly say that neither of them is flawless either. On top of that, this is Intel's FIRST generation of dGPUs in, iirc, over 2 decades, so people need to be a little patient. I also hope Intel looks at and listens to the number of people saying "Keep going!" or "I'm waiting for the 2nd or 3rd generation", because they have a lot of potential, and if they try, I'm sure they can be right up there with AMD and Nvidia! I just don't want them to make this first generation and then stop entirely, only because they had some issues that left it walking the line of being a flop.
Walking the line of being a flop? Walking the line? They had to have a personal hotline to Intel to make it work for one of the benchmarks. If this isn't a flop, what in the goddamn blue sky is? What does a company need to do to be a flop?
I'm not worried. Getting this right the first time would have been a techno-miracle. This is just a road bump. And let's not forget it's the drivers for these chips that really nail down the details anyway.
@Maxsonyte Apple uses ARM, a well-established architecture which they've been using for mobile chips for years, not to mention they don't have a dGPU that needs to support thousands of games.
Bear in mind that Windows Update screwing you on driver updates you've manually installed from Intel even happens on *Surface* devices. And that's been the case for years.
@n30n lmao. They've got better engineers and more money than AMD. They built everything on their own. They can demolish AMD without even moving to an advanced process node.
@@theexclusive9227 nah not really XDDD. Their 12900K can only compete because of DDR5 and double the power limit. Meanwhile AMD is keeping up; Intel will lose hard, so just wait till the next generation of AMD Ryzen CPUs and you will see how fast this changes.
Yeah, this is about how I figured it would go, but at least Intel is able to take the hit for the most part and come back better next time, learning what did and didn't work out. Can't wait to see how good they can become in the dGPU market.
When you hired the guy that created Vega to lead your team. Perhaps AMD merely allowed you to poach Raja Koduri because it looks like you lost that one.
From what I know, Intel has always struggled with driver issues for their integrated graphics. Guess that is being carried over to their dedicated GPUs as well.
Makes me sad how both my Predator Helios 300 and Spectre x360 from 2019 are thought of as better value than their more modern counterparts. I mean, I can sell my Helios 300 for around $1,000 on Amazon; that's nearly $100 more than what I bought it for.