I stuck an Arc A770 in my old CAD computer after my Quadro M2000 failed. For $300 it has been one heck of a GPU. I’ve opened up files of entire jet engine assemblies and it runs them just as smoothly as a 3090 Ti. Definitely not disappointed in it, and I’m probably going to get another one whenever they release the next series of Arc GPUs.
@@MU-we8hz it’s pretty good with Adobe Premiere Pro but a little choppy with DaVinci Resolve. I’d buy it if your computer is a 12th gen or newer Intel. The AMD RX 7600 is pretty good for how cheap it is.
From what I understand, native resizable BAR support was implemented in 10th gen Intel, so to fully realize the capabilities of Arc GPUs you need a 10th gen or later. I've been running an A770 for a minute now, but the issues with Linux gaming are becoming a significant challenge. I'm changing my rig right now, and my wife is getting my A770 because it's ideal for the kind of creative work she does. I'm passing stuff down: my son ends up with my brother's 9700K system and a 3060, and my brother gets my current locked 13400, so everyone should be happy. I generally love my A770; when it's compatible with a game in Linux, it's absolutely beautiful.
I'm not a creator, but I love the content. I paired my Arc A770 16gb LE card with an Intel i7-12700k. It should last me a good long time or until the next Arc GPU arrives. Peace
What you said about the Nvidia range is so important for video editors and I hope a lot of people catch it: for video editing, the 4070 is a decent choice but all the 4000 series cards below it are really not good for video editing. The A770 is a better choice for that.
Great to see this after the A750 video. :) The Blender performance could just as easily reflect the state of Blender's support for the A770 as Intel's software support for Blender, but either way it’s not a reflection of the performance capabilities of the card. While particle systems are often CPU limited as opposed to GPU limited, you’ll still find that in the 3D mode of, for instance, Particle Illusion Standalone (part of the Boris FX suite) the A770 not only performs better but often renders scenes faster than the 4070 (possibly due to differences in CPU overhead between the respective drivers). Of course, once we see better support it could easily be a similar situation to what happened going from early Premiere Pro 23 versions to the 24 beta in terms of performance gains. I just hope we don’t have to wait as long for that as we did for decent AMD support in Premiere Pro. :) Enjoying all the GPU content. Sent you an e-mail about a possible future collaboration.
Hey man. I want to use this Intel Arc A770 primarily for DaVinci Resolve Fusion, especially Fusion template editing. I use the render cache for template editing, but I'm not sure if higher VRAM helps render cache performance, especially in Fusion. Should I go for the 4060 or the A770? I even have doubts about whether Fusion template editing relies on the CPU over the GPU. I'm not sure. Please help me out, man.
@@awesomestuff2496 For Fusion, the best cards under $1,000 are the AMD 7000 series. They perform a lot better than similarly priced Nvidia cards from both the RTX 3000 and RTX 4000 lines until you get up to the 4080. Do not buy the 4060 series; if you need to save money, get a 3060 instead. The 4060s are a downgrade from the previous generation for video editing. Here are my PugetBench Fusion scores for some of the cards when they are used on their own (no Intel iGPU helping) on my 13900K system:
- A770 in DaVinci Resolve Studio 18.5.1: 289
- RTX 3060 in DRS 18.6: 465
- RTX 4070 in DRS 18.5.1: 510
- 6800 XT in DRS 18.5.1: 642
- 7800 XT in DRS 18.5.1: 705
- 7900 XTX in DRS 18.5.1: 772
Note that the RX 7800 XT is 38% faster than even the more expensive RTX 4070. So in your case I would be comparing the Arc card to whatever RX 7000 card is in your budget, not the 4060.
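A quick sanity check of the 38% claim against the scores quoted above (a sketch; the score values come from the comment, the code is just the arithmetic):

```python
# PugetBench Fusion scores quoted above (higher is better)
scores = {
    "A770": 289,
    "RTX 3060": 465,
    "RTX 4070": 510,
    "6800 XT": 642,
    "7800 XT": 705,
    "7900 XTX": 772,
}

# Relative speedup of the 7800 XT over the RTX 4070
pct = (scores["7800 XT"] / scores["RTX 4070"] - 1) * 100
print(f"{pct:.0f}%")  # ~38%
```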
Finally, a video that feeds my confirmation bias 😂 But seriously... I just got the Sparkle A770 OC for content creation purposes. It was 400 Euro, by far the cheapest card with 16GB of VRAM. I'm glad to see that it plays well with Resolve. Thanks for the detailed review.
I'm a DaVinci guy that's starting to dabble in Blender too. I have a "heavyweight" system on the hardware side and definitely have the room, power supply, airflow case (Meshify 2 XL), and PCIe slot to plug an A770 in. I'm just mulling over how to make them play nice with each other, like using my 3080 Ti when I'm in Blender and the 770 when I'm in Resolve.
Good! We NEED Intel in this game to pressure Nvidia and AMD. Nvidia already seems to have made a mental shift to the AI industry, and if they do that, it opens a door for Intel to get a stronger foothold with gamers and creators. Intel, play your future cards right and you could see creators "shifting" more in your direction.
After they launch, I would love it if you could review a Meteor Lake based laptop in December. IMO it might make for the best creator laptop available, with its insane efficiency, performance, and essentially an A380 for the iGPU, which brings all the Arc goodness to thin and light notebooks.
Good to see more competition in the gpu space. Nvidia needs to be brought down a peg. I'd love for you to revisit Blender performance once 4.0 drops since it includes HIP-RT support for AMD hardware ray tracing (and Embree for Intel). It's in 3.6 already but it's experimental and has some bugs. I'm in the market for an upgrade from my gtx 1070 ti and want good Blender/gaming performance. But I'd rather not pay Nvidia's exorbitant prices.
DOSBox doesn't work with the Intel Arc cards anymore since about 4 months ago with driver 4676, and Intel doesn't seem to be interested in fixing that. Keep that in mind before you get an Arc GPU.
It's nice to see Intel GPUs performing better than a higher priced competitor card. I kinda want an Intel card just to fill in the empty space on my motherboard lol. Great for AV1 encoding when I return to streaming.
So this might just come down to "it's up to you", but recently I ended up picking up an RTX 4070 thinking my RTX 3060 was broken. That ended up not being the case, but after seeing both the A750 and A770 reviews, for someone who primarily works in Photoshop, Premiere, and occasionally DaVinci, would it be worth saving a bit of money and returning the 4070 for one of the Intel cards, or is the performance uplift of the 4070 worth keeping it around? And for anyone wondering, I was planning to build another computer, so I was looking for another card anyway.
This had me hyped from the start, because we all know that Intel is very good and optimized at GPU encoding. It's just that until the release of the Arc GPUs, it was all limited to weak integrated graphics, so these Arc GPUs are a real breakthrough. I hope Intel's discrete laptop GPUs will soon expand beyond just the Iris Xe Max and the A370M, which are obsolete compared to the red and green teams.
I'm mainly working in Premiere Pro for editing and use an RTX3060ti. CPU is a 14600K. Lots of work with OBS and Animaze for creating avatars too. Is it worth changing to this card, or could I keep both and use each card for different types of job depending on the integration?
Does it work with Lightroom Classic Denoise AI? That's a critical point for photographers - lots are upgrading GPUs after seeing the wonderful Denoise results. Upgrading CPU doesn't help, but the GPU makes a huge difference - Denoise took 5-6 min per 20-mpx Canon R6 CR3 RAW photo with an Intel Nuc's integrated Iris graphics. With the 3060 it takes 10-12 sec. It's what persuaded me to build a budget PC instead of getting a Mac, since it's been 6+ months since Apple and Adobe began trying to figure out how to implement the M2's neural engine for Lightroom.
Please make a comparison video: Intel Arc A770 16GB VRAM vs Nvidia RTX 4060 Ti 16GB VRAM. Which is the best 16GB VRAM GPU in DaVinci Resolve? Best timeline performance with 4K 60FPS 10-bit 4:2:2 H.265? Best AV1 encoder?
I love your videos bro but do they have to be 25FPS? On high refresh rate/low pixel response time monitors they look horrendous. It's not a movie, just shoot in 30FPS please my head hurts.
I actually bought this in July because I wanted the most VRAM for the price, and I couldn't find anything better than 16GB for basically $330, which compared to other GPUs is hundreds in savings.
So is the Arc A750 worth the gamble? And worth pairing with an i5 12600K CPU? I'm a graphic designer using Photoshop and Illustrator, but I'll be dabbling in some motion (Ae) and potentially some 3D (Blender), as well as some light gaming. I'm very interested in seeing where Intel goes with the Deep Link technology… loving this channel btw 👌🏾👌🏾
I got an RTX 3060 12GB for $180 locally. Depending on the workload, sometimes Intel wins and sometimes Nvidia wins, so they trade blows. Nvidia has lower power consumption. Performance I would call a tie, but the Arc costs over double. So how is that a great deal? Am I missing something in the picture?
Rubbish! I don't know exactly how case and GPU fans work (electromagnet or permanent magnet). If they don't have a magnet, nothing will happen. If they do, then yes, they can generate a voltage. But fans are controlled by a switching voltage chip, and those only work in one direction; it's not like a transformer. Also, the fan electronics are separated from the other electronics, as fans generate electrical noise, which is not something you want interfering with the rest of the circuitry. The only way to generate a damaging voltage is to use high-pressure air, spinning the fan beyond its limits. And even if it were possible to fry the switching voltage control chip, the voltage could go no further. So the worst case scenario is that your fans can no longer spin. But I'd say 99.9% for sure this would never happen, and only with high-pressure air.
Great video 🙌 Would it be a good idea, at this price point, to equip two A770s and a 13700K instead of a 4070 Ti? Intended for DaVinci Resolve. I wanna build a creator PC this Black Friday, and I'm thrilled that my options have expanded beyond a 3060 or a higher-end 4070! 🙏 Let me know, anyone!
I've had an Arc A770 Limited Edition since January. Happy with the card, however I am currently sending it in for RMA. After a new driver release I keep getting artifacts and video blackouts... Went through all the stages of Intel support, and it seems there is an issue with the card...
But how about gaming? Intel claimed they fixed the performance issues with the latest drivers; is this real, chat? Can I go for the latest games with an A770 + Ryzen 5 3600 combo?
So I have been using a sparkle a770 for video editing for the last few months it’s been fantastic in my PP timeline, super smooth. But I’ve had a persistent issue rendering videos with weird glitches in them. I’ve had to resort to software rendering my client videos. I haven’t been able to find anyone with similar issues or solutions, so I bought a replacement to see if it was the card. Still have the issue. At this point, I’m getting a 4070 as soon as I can swing it. Anyone else with this issue?
I have a dedicated streaming/design pc. And I really want to upgrade to this card from a 1660ti 6gb. But I'm looking into avermedia's new HDMI 2.1 capture cards and they are so Nvidia biased.
I don't know about their new capture card, but the one I have works with PC and Mac and does 1080p 60fps game capture. There's no preference on GPU from what I've read, in my case anyway.
The Acer Predator software doesn't work. I have this card, and it really shines when you put about 200MHz on the clock. I tried the Predator software and it did nothing for me either, so I went to the Arc control center, increased the max wattage, and added 110mV to the core. 2.63GHz, and it made a disproportionately bigger leap in performance. At least in gaming for me; idk how workstation loads will use it.
Just calling them "cuda" even with the amount of time you spend saying that they are not the same is not only misleading, it is cringe-inducing. Having it on the chart like that is horrible as some people just look at videos like this for the charts. I didn't finish the video after that, and I suggest other viewers shouldn't finish the video either. It completely destroys any pretense that this is a knowledgeable or authoritative source.
I don't see this being beneficial for editing... at all. I mean, why not get a 7800 XT and beat them all for the same or less money? Arc is great, more competition etc., but comparing it to the worst card on the market says nothing. The 4060 is the worst in everything, including value to performance. I like your channel, I love the Z690 Aero D (2.5 and 10GbE, Thunderbolt 4, etc.; you helped me pick it), but this comparison doesn't really say the Arc A770 is good for creators, even though it is a good card. Thanks.
The irony about these nonsensical stupid marketing names for GPUs, is that once purchased ... no one will ever use the name again. They will just refer to the GPU model from the OEM!!!
Will you be revisiting these cards if/when they get more support in 3D software? Or when they improve performance through driver updates in general? Also, can we expect an A380 review as well? Even though it's a low-end card, creators on a budget might appreciate the review.
WTF is samples per minute? What samples are you talking about? How big is the chunk size? How many light bounces did you set up? Didn't the different "samples per minute" numbers in different scenes on the same GPU clue you in that it's not a viable metric of comparison? Did you enable OptiX on the Nvidia card? Why invent a never-before-heard metric when easy-to-understand, comparable time-to-render already exists? Also, why are there no AMD cards in the comparison, despite Blender Cycles being one of the very few GPU render engines that works on all modern GPU architectures, and AMD having cards that are much closer in terms of VRAM for the same price?
I got the A750 model in April, thinking it would be good for my R7 5700X CPU! I didn't know the Xe cores had to be activated by an iGPU! So I bought an 11600K Z590 mobo setup this month! Gave it to my nephew; my mother got the 5700X PC! Now I can't wait to buy a Z790 mobo next month and get it ready for a 14400 CPU. The 14600K will probably be about a month's rent for me, so not this time!!! For now I'm using an R5 3600, which is pretty slow for video editing!
The Predator card gets really hot and ran into the 90 deg C range when I had it. I swapped to the ASRock card and it never goes above 75 deg C playing games or running benchmarks. Plus, the Predator is $399.99 and the ASRock is $329.99 currently at MicroCenter. I got mine when it was $299.99. Best bang for the buck!
Around 4:15 ... The Intel Arc A770's microarchitecture is Xe HPG; ACM-G10 and DG2-512 are the codenames. GA106 (Ampere) has nothing to do with Intel Arc. And TSMC N6 is NOT 6nm but a 7nm-class process!
I use Shotcut as a video editor (what I do is very basic). To render, it takes advantage of the processor's integrated QSV encoder, but I'm almost at a 1:1 ratio, which for me is very slow. Since I also use OBS, I would like to know if the jump to a low-end Intel card is really worth the investment, or if I should play it safe on codec support and get a low-end Nvidia.
Sorry to be that kind of person, but if A is 80% faster than B, B isn't 80% slower than A. For B to be 80% slower than A, A would have to be 400% faster than B.
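The asymmetry between "faster" and "slower" percentages can be sketched in a couple of lines of arithmetic (hypothetical throughput numbers):

```python
# Hypothetical throughputs: "A is 80% faster than B" means A = 1.8 * B.
b = 1.0
a = 1.8 * b

# How much slower is B than A? The gap measured against A's throughput:
slower = (a - b) / a * 100   # ~44.4%, not 80%

# For B to be 80% slower than A, B = 0.2 * A, i.e. A = 5 * B:
a2 = 5.0 * b
faster = (a2 - b) / b * 100  # 400% faster

print(round(slower, 1), round(faster, 1))
```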
I recently did a test with it, running programs I use like Cinema 4D, Unreal Engine 5, and EmberGen, and it did OK for the most part. It wouldn't run EmberGen at all, but with Unreal I got decent results. For a few hundred bucks it ain't bad. I'll stick with my 4090 for now though 🤙🏿🤙🏿
I recently got a 1440p monitor and had to pick a new graphics card for gaming. To be honest, my budget wasn't enough to go for an RTX 4070 or even an RX 6700 XT / RX 7700 XT. My options were the RX 7600, RTX 4060, and RTX 3060, which is... not a decent choice for QHD gaming. By chance, I managed to find an A770 with 16GB for a lower/equal price compared to the options I had. The VRAM, as well as its pure performance after the driver fixes, made it a decent choice for me. It can even carry RT better than AMD, though I don't care about RT. The only things I've lost are Nvidia Reflex and DLSS. 😂😂
I would like to try using an NVIDIA card and an AMD card with an Arc A380 and see how they would work together. The A380 cards are small, and some don't need any extra power. Creating a rig using both would be awesome for content creation and streaming. Using an Intel CPU with onboard graphics gives you some advantage. I would like to see AMD APUs and such tested to see if you can get some kind of boost as well. I know that 6GB of memory is not a lot, but with not much for the card to do, the AV1 encoding and other features should be helpful alongside older cards like your RTX 3070 that don't have any of those features.
I am happy intel is competing, but they really need something stronger. If Battlemage dropped at the start of 2024 they would likely be making some good sales (if drivers were alright). But it seems they are still fighting to deliver.
Blasting this bad boy (LE variant) with an i5 12400f and it's just pure magic. And for blender or editing videos it's good enough :) loving it, TEAM BLUE.
For Ryzen users! Can an Nvidia or AMD primary GPU plus any Intel GPU as a second GPU give you better performance in DaVinci and Premiere when working with 4:2:2?
I love my A770, but there are 2 serious caveats for creators. These two "known issues" have been on the release notes at least since Feb 2023: "Topaz Video AI* may experience errors when using some models for video enhancement." "Adobe After Effects* may experience an application crash during render operations." I haven't used Adobe, but I own Topaz Video AI, and it hasn't worked since February. 9/10 workloads fail with an FFmpeg error. It's a shame, because it was one of Intel's promotional applications showcasing Arc's Deep Link technology.