The HD 4600 was also one of the fastest IGPs of 2014, so this is a best-case scenario. If you were using the HD 2500 or older motherboard chipset graphics, a GT 710 would be night-and-day faster.
The GT 710 is crippled by its 64-bit DDR3 memory interface at 14.4 GB/s. The GDDR5 model of the GT 710 shows a massive performance uplift with 40 GB/s, even on the same 64-bit interface. At least the IGP gets access to dual-channel DDR3-1600 at around 25 GB/s, despite the GPU core being much slower.
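For anyone wanting to sanity-check those figures, peak bandwidth is just bus width times effective transfer rate; a quick sketch (the transfer rates used here are the commonly quoted ones for these parts, so treat them as approximate):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective transfer rate (MT/s)
def bandwidth_gbps(bus_width_bits: int, transfer_mts: int) -> float:
    return bus_width_bits / 8 * transfer_mts / 1000  # GB/s

print(bandwidth_gbps(64, 1800))    # GT 710 DDR3:            14.4 GB/s
print(bandwidth_gbps(64, 5000))    # GT 710 GDDR5:           ~40 GB/s
print(bandwidth_gbps(128, 1600))   # dual-channel DDR3-1600: 25.6 GB/s
```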
I agree here. The DDR3 has to be the bottleneck. With more shaders and more ROPs, there is no way the 710 loses unless there is a bottleneck somewhere else.
My T440p has a GT 730M (GK208) with 64-bit DDR3 and it does way better than the integrated HD 4600 (i7-4800MQ). From what I've tried in GTA V, the HD 4600 does 720p at 20 fps while the GT 730M does 720p at 60 fps with the exact same settings. 1080p at 30 fps is also doable on the GT 730M. Though of course the GT 730M has 2x the shading units and TMUs of the GT 710.
intel hd 4600:
- takes no case space
- doesn't use a pci-e slot
- doesn't make extra heat
- uses less power
- has intel quicksync, a faster video encoder than nvidia's and amd's

gt 710:
- exists
Used to have a GT 710 (2GB DDR3) that I used with an i3 4160 (HD 4400). I did similar tests myself about 7-8 years ago and they were just on par, or weirdly the HD 4400 performed better, more so with an overclock on the iGPU. But I stuck with the GT 710 because I needed HDMI and my sh!tty Gigabyte board at the time (which died later) didn't have anything but VGA.
The VRAM setting in the BIOS doesn't mean what you think it means with integrated graphics. If you set it to "1GB" then it will always allocate at least 1GB, but since it's shared it will use what it needs, up to 2GB or 4GB depending on the BIOS limit. For example: if you set it to the lowest option, say "128MB", and run a benchmark, the result will be the same as with "1GB", because it will use more VRAM than 128MB. Wendell from Level1Techs once did a video where he explained it much better.
Now I wish I could see the AMD HD 7560D and HD 7540D against the Intel HD 4600 and GT 710; those were some nice mid-range iGPUs included in some of AMD's 2012 APUs. The HD 7540D was close to 300 GFLOPS and the HD 7560D was around 380-ish GFLOPS. And then in 2013 they made some slight improvements with the HD 8570D and 8470D.
Huh, I never noticed how smooth sub-30fps can look and feel depending on the hardware configuration. I'm still using the HD 4600 and tried GTA IV (and in general I don't like GTA at all, don't engage me on that), and since I'm lazy-eyeing my screen all the time, the oddity flew right over my head.
Pretty bad considering that it was released after Haswell, and even worse that it is still possible to buy it new. With the allocated RAM, all you are doing is reserving it so that the iGPU always has some available (more important the lower the total amount of RAM). It always uses more if needed.
BIOS shared graphics works by telling the system how much system memory is to be reserved for the GPU. A larger amount is not a good thing: walling off memory for the GPU prevents the system from using it as part of the system RAM pool. It is held exclusively for the GPU; even if the GPU doesn't need it, the system cannot use it. However, the reverse is not true. The GPU will always be able to use more system memory once it exhausts its allocated memory and finds it still needs more. It is therefore better to assign a minimal amount to the GPU and just let Windows handle memory management between system and GPU rather than walling it off.
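To make that concrete, here's a minimal toy model of the reserve-then-spill behavior described above; the sizes and names are invented for illustration, not any vendor's actual allocator:

```python
# Hypothetical sketch of how a BIOS reservation and dynamically shared iGPU
# memory interact; sizes in MB, all values made up for illustration.

TOTAL_RAM = 8192          # physical system RAM
BIOS_RESERVED = 1024      # "VRAM" set in BIOS: walled off, CPU can never touch it
SHARED_LIMIT = 4096       # driver cap on total memory the iGPU may use

def igpu_memory(igpu_demand_mb: int) -> dict:
    """Return where the iGPU's working set lives for a given demand."""
    # The iGPU always holds the BIOS reservation, used or not.
    from_reservation = min(igpu_demand_mb, BIOS_RESERVED)
    # Anything beyond the reservation spills into the normal RAM pool,
    # up to the driver's shared-memory cap.
    spill = min(max(igpu_demand_mb - BIOS_RESERVED, 0),
                SHARED_LIMIT - BIOS_RESERVED)
    return {
        "igpu_uses": from_reservation + spill,
        # The CPU loses the full reservation plus any spill; never less
        # than the reservation, even if the iGPU is idle.
        "ram_left_for_cpu": TOTAL_RAM - BIOS_RESERVED - spill,
    }

print(igpu_memory(128))    # reservation mostly idle, CPU still loses 1024 MB
print(igpu_memory(2048))   # 1024 MB spills on top of the reservation
```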
I had a Dell OptiPlex 9020 desktop that had, I believe, an i5-4670, and it came with a Radeon 8490 or something (an upgrade in the original configuration) which was actually slower than the HD 4600. It made no sense to even add it for dual-GPU capability; the board had multiple video ports already.
I recommend overclocking the 710, as the 700 series cards really gain performance when overclocked. I remember I had a GT 730 a while back and got about 20% more performance. I'll try the spare 710 I use for testing mobos and see how high I can run it.
I really liked this video, though I'm confused about something: how were you using it as an encoder for Jellyfin if the GT 710 doesn't have an encoder? The GT 710 doesn't have an NVENC chip.
NVIDIA has the advantage of driver optimization, and this goes back as far as 2005: the GeForce 6100 IGP only had 2 pixel pipelines and 1 ROP/TMU, while the Intel GMA 950 had a 4/1/4/2 core config with dual shaders per pipeline, but the NVIDIA IGP was still faster than the Intel crap, and more stable than the ATI Xpress 200.
Integrated graphics don't have dedicated VRAM and borrow resources from regular system RAM. The reason the GT 710 scores higher in some games is that there is less system RAM available with the integrated Intel graphics once it allocates some of that RAM as VRAM. If the system has 8GB of RAM and you use the GT 710, the system can still utilize all 8GB of system RAM, as the GT 710 has its own VRAM. But the Intel integrated graphics will take up to 2GB of that RAM for VRAM, leaving just 6GB of system RAM left over.
For these Haswell processors, if you need to add more monitors, you can use a DisplayPort MST hub (if the motherboard has DisplayPort) or use a discrete GPU that is actually faster than the iGPU and has better features. It doesn't make sense to pair a discrete GPU or display adapter with a CPU whose iGPU is faster than that particular card.
With all due respect, who said anything about pairing? Yes, if you install a GPU you'd use ITS ports. If absolutely needed you can do a pass-through. This test was just about comparing the performance of each.
About your confusion with the VRAM: when an integrated GPU says it uses "1GB", it's not guaranteed that it uses only 1GB, it's just a driver thing... it can use way more than that, and the extra will show up as system RAM usage (too bad you didn't show that stat). The best way to see this is with an AMD APU like the 7840U, which always reports using ~500MB of VRAM when the VRAM allocation is set to "auto" while system RAM usage shows way higher.
Hello! Bro, Windows can share some amount of system memory with any GPU, up to double the VRAM. It's nothing really fancy, it's just pre-allocated memory that the system itself doesn't use, giving the GPU a better fighting chance. This behavior began with Vista. My notebook had a 7600 Go with 256MB of VRAM, but Windows Vista and 7 would show 512MB. However, for obvious reasons, if a game made use of that shared memory, performance would drop considerably! Cheers!
It's the video memory that allows the OC to be effective. My HD 5450 with 1GB of DDR2 got pretty much nowhere with a sizable +190 on the core because the memory didn't OC well, only +60. With the GT 710 it's the same: no point even trying on the DDR3 versions, any DDR video card is 30-50% worse than the GDDR version. Still, Intel showed good performance with dual-channel RAM. It would be more interesting with single-channel RAM; would it make a difference?
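As a rough sanity check on why the memory clock caps the gain, assuming typical HD 5450 reference clocks (~650 MHz core, 800 MT/s effective DDR2) and treating the +60 as a bump to the real memory clock:

```python
# Core OC vs. bandwidth OC on an (assumed) HD 5450; clocks are reference-spec
# guesses, not measured values.
core_base, core_oc = 650, 650 + 190
mem_base, mem_oc = 800, 800 + 2 * 60   # DDR: +60 MHz real = +120 MT/s effective

print(f"core uplift:      {core_oc / core_base - 1:.0%}")   # ~29%
print(f"bandwidth uplift: {mem_oc / mem_base - 1:.0%}")     # ~15%
# If the card is bandwidth-bound, frame rates track the ~15%, not the ~29%.
```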
Hey Jim, I have a possible explanation for the memory allocation thing: shared RAM spills over from VRAM into the standard RAM pool when you go over the cap, the same way dGPUs do when they run out of VRAM, but without any performance hit since it's the same speed as the allocated VRAM, as long as you have the capacity. It's likely that the games and applications showing 2GB are requesting more than 1GB, getting it, and reporting 2GB.
Actually the 710 doesn't even support NVENC; you need at least a 750 to have NVENC support. Something funny is that the 1030 is just a bit less powerful than a 750, yet it doesn't have it either, which makes me believe they simply cut NVENC off every GT card.
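If you want to verify what a given card actually exposes, one rough way (assuming an ffmpeg build with the NVENC encoders compiled in) is to attempt a one-frame test encode and see whether it errors out:

```python
# Probe for working NVENC by encoding a single synthetic frame with ffmpeg.
# Assumes ffmpeg is on PATH and was built with NVENC support; a card without
# a working NVENC block should make this fail.
import subprocess

def has_nvenc() -> bool:
    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-f", "lavfi", "-i", "testsrc=size=640x360",
         "-frames:v", "1", "-c:v", "h264_nvenc", "-f", "null", "-"],
        capture_output=True, text=True)
    return result.returncode == 0

print("NVENC usable:", has_nvenc())
```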
Oh thank god I didn't waste my money on a 710. I have an old DDR3 PC with Intel 4600 graphics and I planned to buy a 710, but now I see it's pointless, just wasting money on less performance.
Even the GT 710 is worse than an Intel 4th-gen iGPU? (Most of the time; and even when your FPS is higher, it comes at the cost of input lag... I mean... no, just no.) Damn. And while I shouldn't be so against them (GeForce 210, GT 710, etc.), the marketing of these things was total BS; they should just sell them as Quadros, not as GeForce cards. An average Joe probably won't know that the GT 710 is a very poor GPU for games. They'll probably believe that if it's newer, it's better. (Well, maybe if they had a GeForce 210 before, then maybe. Surprised it at least beats a 3rd-gen iGPU.) If you have a 4th-gen+ PC with an iGPU... just go with it, at least that thing will handle games better. Don't bother with the GT 710. Also... not to mention how painful going from a 7600 GT to a GeForce 210 was... When my grandpa switched to a 3rd-gen i3, the iGPU performed way better than the GeForce 210, and maybe worse than the 7600 GT... But I could finally play CoD 4: MW (2007) beyond low texture settings without feeling like the game was running on a potato, after the 7600 GT's death. (Literally, it stopped working at all.)
Unfortunately, the RX Flop series/Flop VII competed against the Titan V/RTX. I mean, why else would AMD have named them so differently from the normal RX 4/500 series?
FWIW, I did a video comparing a 710 vs a 240, and the bar graphs at the end included the HD 4600. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-UvX2rnSwaeY.html