@@RandomGaminginHD Can I ask how you find these kinds of PCs for this price? I can't find anything under €200 and I don't know how to find "broken PCs". I have a 745 and I wanted a PC under 50 euros.
@@TimDragonDeckers Yes it does, actually. Windows typically hides 480p in its internal settings, but many games don't hide the 640x480 option. If you want to run Windows in 480p (or whatever resolution) for whatever reason, you can use the GPU's control panel to force a true 480p output and then either let your monitor upscale it, run it as a non-upscaled image on a CRT, or use an mCable/mClassic to upscale it like emulation — unless you have a modern display with similar upscaling built in, though I'm not sure how many displays ship with a custom SMAA-based upscaler (my 1440p monitor doesn't, so the mClassic is definitely useful for making it work).
Not quite — the '80s typically used 240p for games and 480i for television content, unless you live in a PAL region, where 576i was used for television.
People like to say that the graphics look like Vaseline is smeared on the screen. Usually they're exaggerating, but with Days Gone... it's the first time that would have been an accurate description.
Payday is very cool and runs great until you're 10 minutes into an alarmed scenario and have thousands of dead bodies chugging your frames 🤣 Such a good incentive for the silent approach lol
Well, I don't know when you played Payday, but I got it running on my GT 705 as well. I had to change a few graphical settings inside the game's files, but it did hit 60-ish FPS with rare, but still present, frame drops.
Payday 2 even scales above 100% with multi-GPU. It's really strange seeing 1 CPU thread used out of 32 but 2 or 4 GPUs producing impressive results. (Unfortunately I don't have a CPU that could clock at the 10-15 GHz that would be needed to see whether 4x scaled to twice the frame rate of 2x GPU, but it did have better fps.) My test build was a 4.2 GHz Threadripper, chosen for its 64 PCIe lanes, which are pretty much required for 4x GPU.

For other scaling references: in Crysis 3 at 4K Ultra my 4x Fury setup does just over 100 fps (2x is about 60 fps, and 1x was 30-35 fps), compared to a choppy 45 fps on my Titan Xp at max fan speed for maximum boost. (I actually downclocked the Fury cards below their rating in that 4x test; for max-power runs I can get them about 15% faster at 1.7 kW of draw, but that's really not needed unless you have a 120 Hz 4K monitor for Crysis and don't want it falling below that.)

Payday 2 is very strange: the FX architecture even did fantastically, and 2x GPU with an FX-8350 is a good idea. Oddly, I ran tests at 6 GHz on an FX-8350 (phase cooled, peak core temp of negative something) and compared 2 cores active versus 8 threads at 5 GHz. Windows hogs so many resources that the extra 1 GHz was still slower — apparently Windows needs about 2 threads minimum all to itself just to keep the game from getting choppy — and 5.5 GHz 4c/4t tied with 5 GHz fully unlocked.** **Note the architecture and wording: in these cases I'm specifying which module clusters are active. If you're really in depth and follow the architectural layout chart, you can see even more detail in how the tests shared pipelines, but the upshot is that even cutting resources such as the L3 cache and particular compute pipelines had minimal impact, to the point that fully enabled was fastest, contrary to popular misconception about the Vishera architecture.
The GT 710 holds a special place in my heart, no matter how poorly it performs relative to other cards. It got me years of enjoyable gaming no matter the graphical settings. Now I own a 3080 and can play pretty much any game on max settings, but I still don't find the joy the 710 gave me.
@@Drakorvich Still got a laptop with it. The only good thing is that it lasts 13 hours on battery and plays WoT Blitz at very low settings at 60 fps. I undervolted it to 0.585 volts; during Cinebench it consumes a whopping 1.1 watts of power.
Sadly it still won't — no joke, I have overclocked computers for faster Excel performance, but you just can't fix horrible optimization (not that the sub-zero cooling and ludicrous voltage and clocks didn't help; they just didn't fix the code). I think some operations, such as color-coding arrays, are partly computed on the GPU (it seemed to run twice as fast on my Titan Xp as on slower Quadros), but the CPU is what I had at below-icy temps. PowerPoint often runs terribly, but overclocking can help a little, especially in animations where it might be struggling. (Also turn off power saving: newer cards are better about dynamic frequency changes, but Windows seems to slow their clock-speed ramping for those short animations, so the first half might hitch if you're not in performance mode.) I know that was probably a joke, but it's a real problem. I think I had a video up (on a different channel) where I was like "Microsoft Paint runs OK!" with a noisy refrigeration compressor bolted to the bottom, running the CPU at something like -35 to -60°C, lol!
Wow! I never even knew there was a 705 lol 😂. At least it's overclockable. That Days Gone resolution looked like a 3DS game with RTX 😂. Nice vid man! Edit: thank you so much Random, you're one of my favourite PC gamers — thanks for highlighting!
I think it says a lot about you as a content creator that I'm never, ever going to use or own a 705, or buy a broken PC on eBay — and here I am watching every video you make!
More proof that Nvidia was just rebranding the low-end 400/500-series cards for a long time without doing anything in terms of performance. The GT 705 is basically a GT 520 with an 8% clock increase.
I just want to say I do really like the face cam, I personally struggle to hear very well and lip reading can sometimes be helpful. In this instance it is.
Good on you, lad, for showing people that they can still game even without the best GPU. I met an awesome girl in VRChat last night who was using a GTX 970 at 72% SS, and she was averaging 30 fps. Cheers from Florida!
When a prebuilt system runs just as well or better with the graphics card removed, you know you're onto a good value product. It shows that most manufacturers don't actually realize how bad the lowest-end cards sometimes are, even when the rest of the system has reasonable specs for the time.
I have the same GPU with an i5-4460 paired with it. It's bad, but not impossible to game on. I mostly play TF2, and I've played through Borderlands, Dusk, Just Cause 2, and many Source-based mods. I would mostly fiddle with the game's files to make things run, but frankly, it was fun :). Great vid btw — funny seeing my shabby GPU in some AAA games.
On my GT 705 (no cleaning for 7 years, though), under around 60-100% load at default fan speeds and basically default clocks, it maxes out at about 82°C at around 30-35% fan speed. 50% fan speed keeps it at 75-77°C.
Hey, I'm a long-time viewer and I just wanted to say: my rig is nothing to brag about — GTX 1650 MSI Superior OC, Ryzen 3 3100, 16 GB DDR4 (3200) — but fucking hell man, I love these videos. They put a lot in perspective, both for people who love computers and for people who don't know a lot. I love watching budget vids way more than pro setups, because they really showcase the tech that can still do well in newer or taxing titles, and also the ones that eat shit 😂
If you think that card is poor, I have a GeForce 210 collecting dust bunnies for Easter eggs. It's so weak it won't hold paper down from a fart on taco Tuesday. Thanks for the shows, and for the time to make them for us to enjoy.
"Powerful computer with Nvidia GPU" — yep, it was probably advertised as such in Best Buy too. The sad part is consumers don't demand to see actual specs: they see the green label and think "oh, it must be good", and later go "oh, all Nvidia GPUs are terrible" (or any other brand — AMD/Radeon/Intel/ARM). It's really quite bad for the company's image to even allow the sticker on there (which is exactly why the OEM bought it for marketing), but it destroys company reputations and will likely bite them in the end.
I still have that card somewhere. It came in an Acer prebuilt because the motherboard had no display output. Replaced it with a 1030 when the drivers broke, and it's a world of difference. This is like the only channel that talks about this obscure card lol.
Good video. It's not really just a display adapter, though. For example, it has nouveau drivers in Linux (I believe), so for games like SuperTuxKart it could be a rather nice card, with reasonably low power consumption for its age — especially paired with a CPU like an i5-4440 or i5-4460, where you'd be replacing the Intel integrated graphics with a similar-spec card and no longer using up shared RAM for graphics.
This makes me appreciate the GTX 750Ti, I remember playing The Witcher 3 on my 750Ti and it was perfectly playable and enjoyable. Did have a pentium G3258 at the time though, so that was pegged at 100% and stuttered like a beast.
@@johnjohnson7743 No, 50% is 360p — but since the 50% applies to both the horizontal and the vertical, the total pixel count is 25% (230,400 pixels vs the original 921,600).
@@lmcgregoruk ok, by that logic, what is 50% of 1080p, now what is 50% of 2160p, or “4K” and what is 25% of 2160p? I do hope you’re starting to realise that your reply was right about one thing, but massively wrong about another
@@johnjohnson7743 It depends on how you look at it — both are right. You can measure per axis or per area, and sadly games don't state which they use. Nvidia uses area for DSR. Scaling by axis is what makes for a smoother picture, but scaling by area maps more closely to the difference in performance.

50% of 1080p (1920x1080) would be 960x540 if we go by axis, or about 1357.6x763.7 if we go by area. 25% of 720p would be 320x180 by axis, or 640x360 by area. For "10% of 720p" we get about 404.8x227.7 by area, and for "0.5% of 720p" we end up with about 90.5x50.9. The area calculation needs some square-root magic: each axis gets scaled by the square root of the area fraction.

Oh, and back when supersampling started to become better known, the advice was to use full integer factors (like 2x2 or 3x3) for best image quality, with 1.5x1.5 being acceptable as well.
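To make the per-axis vs per-area distinction concrete, here's a small Python sketch (the function names are my own, not from any game's settings menu):

```python
import math

def scale_axis(w, h, fraction):
    """Scale each axis by `fraction`; pixel count drops by fraction squared."""
    return round(w * fraction), round(h * fraction)

def scale_area(w, h, fraction):
    """Scale total pixel count by `fraction`; each axis shrinks by sqrt(fraction)."""
    f = math.sqrt(fraction)
    return round(w * f, 1), round(h * f, 1)

# "50% of 720p" under each interpretation:
print(scale_axis(1280, 720, 0.5))  # per-axis: (640, 360), i.e. 25% of the pixels
print(scale_area(1280, 720, 0.5))  # per-area: (905.1, 509.1), half the pixels
```

This is why the two readings of "50%" disagree by exactly a factor of two in pixel count: halving each axis quarters the area.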
On CareyHolzman's last stream there was a short talk about TimmyJoe and some other TechTubers, so I made a small chat donation and mentioned your channel together with Dawid Does Tech Stuff and TechSource. Hope you don't mind. Btw, I haven't seen you on GamingTribe — don't you have a profile? Ivan Marijanović
Those 705, 710, and 720 OEM cards are super handy to have around if you ever need to troubleshoot a PC. I have a cheap little Pegatron 720 I use for this purpose. Just a thought!
That 14 fps will forever be a nightmare to me: back in the day I tried playing GTA IV on a PC that absolutely could not handle it, and that was usually the highest I'd get, at night only. In daytime traffic it would stutter as low as 6 fps.
When an architecture is no longer supported, Nvidia (and AMD) usually tell consumers "we no longer support GTX 400-500" or "GTX 600-700" (R 200s/300s for AMD), but what they actually drop support for is the underlying architecture — Fermi or Kepler. Because different GPU dies get mixed within a series (sometimes even within the "same" product), the GT(X) 700 series ranges from outdated Fermi to fairly advanced Maxwell, which causes consumers great confusion. BTW, I did use GT 705 and G(T) 605 PCs in my middle school. These are the "please remove your dGPU to get better performance" kind of GPUs. I also have a GTX 745 — luckily it's a GM107 (Maxwell) GPU, with a weird 128-bit 4 GB DDR3 VRAM config.
Lol, my current Dell has this GPU. Currently using it daily for working from home — works fine! I did put in an SSD, though, or it would've gone out the window years ago. I tried Apex Legends and DCS on it, and even on the lowest settings it turned into a static slideshow, lol. I think the time has come for an upgrade! I think Dawid would agree.
Gaming at those resolutions would make me nauseated... literally, as I suffer from motion sickness and often take Dramamine before playing any FPS or driving-simulator games. And that's when the resolution is good, let alone as bad as that graphics card makes it! 😵😖 Liked the video, by the way! 👍👍