Audio ain't that bad. Seems like the lavalier mic failed and they had to fall back on the on-camera mic they use for syncing audio to video in post. Sure it's soft and tinny, but it ain't crackly and buzzy.
Well, that was an unnecessarily long "don't buy this" video that did a really bad job at the actual performance part. 1. Test using multiple games, since Valheim was clearly having issues running on that laptop no matter the GPU. 2. Test using multiple laptops to find out if it's an eGPU problem or a laptop problem. 3. Test the eGPU connected to the same desktop system you ran the normal 3080 in. 4. Since the eGPU was disassembled, I was at the very least expecting to see it tested inside that desktop machine.
@@pcguy98 I was yelling at the TV the entire time... WHY VALHEIM?! Both me and my GF get pretty much the same in-game performance (depending on location in the world), me with an RTX 2060 and her with a 1650. It isn't till you're out on the ocean that system utilization actually goes up and framerates differ between our systems! I'd be super pissed if I were Gigabyte right now.
@@jamesarmenti8876 This is Short Circuit, a channel for short "quick look" videos at cool stuff. Essentially: hey, we got this, it's cool, bye. If that's not your kind of thing, you could just unsub and watch the LTT main channel, where things are reviewed the normal way.
@@khangle6872 I know what Short Circuit is. It still doesn't change the fact that these videos are skewing opinions based on flawed testing. Short or long... it's the principle.
That would actually be a good idea: create a port that's almost USB-sized but is actually a PCI Express header, so you can plug an eGPU directly into a PCIe connector.
People: Waiting for GPUs to finally become available. Aorus: Let's create an eGPU that 10 people worldwide will have use for because we have so many spare GPUs at hand.
I think more people will buy this eGPU and use the card in a desktop than use it the way it was meant to be... 1400 bucks doesn't seem too far off current prices, and you get watercooling and don't need to upgrade your power supply.
A lot of you want to rain on eGPUs without seeing the worth. Yes, TB is the main bottleneck. But the value is greatly increased performance in any laptop you come across. Future-proof. College life.
What model XPS is that? I love the concept of TB3 eGPUs, but there are a lot of gotchas involved. For example, is the TB3 port 2-lane or 4-lane? Is it connected to the CPU's PCIe lanes or the PCH's? Is the CPU an Ice Lake chip with an integrated TB3 controller? All these factors can drastically affect the performance of an eGPU. I wish LTT would comment more on these factors, as there seems to be some mixed messaging around eGPUs on the channel. Take this video showing the weakness of the eGPU when paired with this particular laptop, but contrast that with the diamond play button PC build, where Linus comments that using an eGPU is an option. Yes, eGPUs can be complex, but LTT is often pretty good at tackling those.
I didn't see the "e" in eGPU at first, and I was confused that they had made a graphics card like that weird Asus concept card (the one with an AIO built into the shroud) that I don't think ever came to market, rather than one like the EVGA Kingpin 3090 and other cards like it.
Man, the audio is off in such a strange way. It sounds kinda low quality/bitrate, but I don't think it is. It picks up quiet noises on set reasonably well (the unboxing sounds, cloth sounds, etc., stuff you may not hear when using a lav; not necessarily unwanted background noise, but I assume the studio is mostly quiet when filming anyway) in a way I kinda like. But Alex's voice is just... off. It reminds me of webcam or built-in camera audio. There's far too much emphasis on the low frequencies in his voice; it makes it almost sound like it's peaking (but as far as I can tell it isn't). The main thing I hate is how reverb-y everything sounds, like they're recording in an empty room with no sound-deadening. It's kinda hard to explain why Alex's voice seems so off. All that written, it didn't really detract from the video after a few seconds, just the initial surprise. Not sure if this counts as constructive criticism or not lol, I really do like the video tho! Edit: 3080s also have power limits well in excess of 300W! 330 is reference iirc, with some designs going to 370 or above.
@@iwontliveinfear And Valheim runs like dogshit on literally everything depending on how terraformed/built-up an area is, with super low system utilization! My RTX 2060 and my GF's 1650 perform almost identically until you end up out on the ocean in the game. Major optimization issues!
@@jamesarmenti8876 Yeah, I made a similar argument. I think the title used was more of an issue than the unit itself. They should have tested across multiple games, since the eGPU is not a standard configuration and requires different tuning for optimization than a standard card in a mobo. So in a game with poor optimization, and drivers that haven't been updated for that game, an eGPU is going to take a much bigger hit, as it doesn't have the bandwidth to just muscle past bad coding.
Yes, I would love to see a test with:
- normal desktop 3080
- 3080 eGPU connected to the desktop via Thunderbolt
- 3080 eGPU connected to the desktop via PCIe
This would allow us to see whether the performance drop comes from Thunderbolt or a mobile CPU bottleneck.
Too rushed, and I'm not talking about the audio; the "review" was awful. Clearly it wasn't properly tested, and yet you jumped to conclusions. You are hurting your audience and the product's manufacturer.
But it wasn't a review... was it? I don't disagree that I would have liked some more details/testing to point to what may have caused the lower than expected performance. I feel like this channel offers something in between an unboxing and a review.
@@joshuacoggins1787 I agree, but this video has all the elements of a review. I know this channel is for "quick videos" from a production standpoint; however, that is not an excuse for a company to publish such low-quality content/misinformation. This is wrong on many levels.
@@egrinant2 Fair. It's an interesting problem. On one hand, there are people that may use it exactly as they illustrated here, plug it in and think it's not working or not doing much and then be upset... but on the other hand digging into the problem should be shown but could cost too much time to go into this video. I'd agree that the larger issue is the conclusion drawn from what feels like incomplete information.
I would love to see you put this card in a normal desktop and benchmark it, just to see how the card itself and the cooling solution actually perform without bottlenecks
Did that with a 1080 in the old Aorus eGPU (it was easier because it was a pretty standard air-cooled card with a normal connector; I just plugged it into my desktop and it worked). Performance difference was ~25-30%. I'm still running that 1080 in my SFX desktop. I don't really understand why their experience is this bad. I remember my eGPU was kinda bad before I installed a TB firmware update; after that it ran very smoothly. My primary use case for the eGPU was taking it with my laptop to a conference and running VR on the show floor. I had 0 problems with that (and VR is very sensitive to frame drops). Hell, at home I spent close to 100h in VR running from the eGPU without any issues.
Jesus Christ it's difficult to listen to Alex talk about eGPUs when he's got no knowledge of them. Try it for a headless mini PC with a thunderbolt connection.
really seems like Alex made up his mind before the video and didn't wanna waste time on this. They just wanted to take it apart. I would have rather seen games like GTA or Tomb Raider, ya know, ones they benchmark often and are more optimized.
It's short circuit, not an in-depth review, that's what LTT is. Sure, they could have done a better job, but they don't take hours to make these short videos, they have other things and projects to do for LTT.
This is the shit tier channel. Quick looks at tech that's cool and most likely a company reaching out to show off their stuff but not interesting enough for the main channel.
@@mbsfaridi Even though the duration can be over ten minutes, these "short" videos are still faster to make production-wise: far less fancy effect editing, plus mostly "wing it" recording sessions (yes, there is some script, but there's no rehearsing and no "bloopers"; for the most part the "script" just covers the main content of the video, except in the scenarios where the intention is "surprise me").
A 3090 and 5950X get 20-ish fps in an area with a moderate amount of terraforming and building. My old 1060-equipped laptop gets the same fps... Both the CPU and GPU max out at like 20% utilization. It's definitely not a good benchmark game.
@@Pr0fPyr0 That has nothing to do with Unity; it's the developers who can't use the quite good optimization tools Unity gives you. You can't throw stuff into an engine and hope it will run perfectly.
Yeah, my 1070 also has pretty bad frame drops in more densely built areas. Outside or away from large structures I can easily reach 80-100 FPS @3440x1440 at very high settings, but inside buildings it often goes down to the 50s. And sometimes it just straight up freezes for a split second. Valheim definitely needs some performance optimization. The freezes are the worst.
Honestly this was about the shittiest "benchmark" I have ever seen. "I have no idea if it's an issue with the CPU or the GPU".... That would have taken 10 seconds to check
@@RashidTak uh... Yes there was. That's my point exactly. People familiar with SC will get that it was just a quick un-scientific test. But Joe schmoe that searches for this product and sees this may not. What's the point of even doing the "test" if you don't want people to pay attention to it
Unfortunately LTT has gotten too big to be taking requests about small stuff :((( If you're really curious I'd recommend searching for small channels that might cover this in more detail.
@@davidyang9902 No, they definitely have the ability to do so, and I'm sure they do cover stuff they see in the comments, because as a public figure you have to actually listen to what the public wants. I've already seen like 3 comments asking about this in an hour.
Fun fact: it's a normal Gigabyte 3080 in there. The Gigabyte 3080 also has those weird connectors under the shroud, because the power cables are extended over the flow-through design. Look up a teardown of it.
Yep, something's definitely not right; in my experience it's possibly the CPU hitting thermal limits and bottlenecking the eGPU. I ended up having to use Intel XTU to limit wattage, and the frame drops exhibited in this video went away.
Also run all tests with the integrated display disabled to get an accurate look at what the card can do without the bottleneck of sending the display signal through the same cable giving it instructions.
@@nightcorefusion3884 I am using my eGPU with 3060Ti as docking station for XPS 15, never even tried to use it for internal screen. But to be honest I am shocked with level of incompetence in this video.
@@livemadseason I have a feeling that when you buy an eGPU you'd want it to just work out of the box when you plug it in, without having to disable both the iGPU and dGPU in your laptop. Also, if you can't use the internal screen, then the eGPU makes even less sense, because you basically have to use it at home, where just buying a desktop would be cheaper.
I don't think there's any kind of external power apart from usb-c pd. Laptop might not be happy with that. Some newer laptops won't run full tilt until you use a real power supply. (The results of the 1650 also looked a lot like a cpu bottleneck)
@@blubblub3786 More than likely it has to do with the number of lanes dedicated to the dedicated GPU on-board vs the number of lanes dedicated to thunderbolt... my 1065G7/GTX1080 eGPU is outperforming this by a lot.
@@Hyatice That's kinda difficult to say; PCIe Gen 3 x8 runs at the same speed as PCIe Gen 2 x16, and there are some older videos comparing those (from when people were going from Gen 2 to Gen 3). The real-world performance difference was like -5%. So I think it would still run a lot better than this, even if it were running at Gen 3 x8.
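For anyone who wants to sanity-check that bandwidth claim, here's a rough back-of-the-envelope sketch in Python. The per-lane figures are the usual approximations after line-coding overhead (Gen 2 uses 8b/10b, Gen 3 uses 128b/130b), not measured numbers, and the TB3 ceiling quoted in the comment is an approximation too:

```python
# Approximate usable per-lane throughput in MB/s, after encoding overhead:
# PCIe 2.0: 5 GT/s * 8b/10b  -> ~500 MB/s per lane
# PCIe 3.0: 8 GT/s * 128b/130b -> ~985 MB/s per lane
PCIE_MBPS_PER_LANE = {2: 500, 3: 985}

def link_mbps(gen: int, lanes: int) -> int:
    """Approximate one-direction bandwidth of a PCIe link in MB/s."""
    return PCIE_MBPS_PER_LANE[gen] * lanes

gen2_x16 = link_mbps(2, 16)  # ~8000 MB/s
gen3_x8 = link_mbps(3, 8)    # ~7880 MB/s -- within ~2% of Gen2 x16
gen3_x4 = link_mbps(3, 4)    # ~3940 MB/s -- a TB3 eGPU tunnels at most this,
                             # and in practice usable TB3 PCIe bandwidth is lower

print(f"Gen2 x16: {gen2_x16} MB/s")
print(f"Gen3 x8:  {gen3_x8} MB/s")
print(f"Gen3 x4:  {gen3_x4} MB/s")
```

Which is why the ~5% real-world difference between Gen2 x16 and Gen3 x8 is plausible: the links are nearly identical in throughput, while a TB3 eGPU gets roughly half that at best.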
I usually love Alex's videos, but come on, man. Sure, I'm with you, I have no possible need for an eGPU, BUT with about half an hour to an hour of research you can come up with half a dozen ways to improve that performance. I have helped a few friends who were set on their laptops go from what you saw to more like a 10-frame loss compared to desktop. It's mostly Windows' default way of handling an eGPU that is garbage. While you will lose some performance, it's nothing near what you showed here.
I found a massive difference in performance after installing the latest drivers matching the GPU in my Razer Core w/ a 3070. So I would imagine the same thing is happening here.
My Razer Core X apparently has pretty well-optimized controllers; I'd assume maybe that would be the difference, because I get ridiculous fps with my 2070 Super and an 1165G7.
Your review is of Valheim, not of the performance of this product. Valheim literally cannot handle high FPS unless you are in an untouched world or the middle of an empty ocean. Please be sure to include at least one game in future reviews that is actually capable of scaling on better hardware.
Really bad testing; there's no way an eGPU would actually do this poorly. You needed to invest more time into checking to see what the bottleneck or issue actually is.
Biased review. You went in looking for something wrong. If you're getting that framerate, you did something wrong or didn't do due diligence on proper setup.
1. An eGPU is supposed to be used with an external monitor. 2. An eGPU is supposed to be used with the TB3 40Gbps link. 3. An eGPU is supposed to be used with a powerful mobile CPU like a 6- or 8-core i7 or i9. Educate yourselves, lamers.
Watched for the teardown. I fckin laughed at this pathetic video. "Let's test one game." Compares a gaming PC (tuned motherboard, RAM, PCIe slot card) to a laptop (basically handicapped, barely functional, with good grades on paper). Didn't even plug the card into the PC to see what happens.
Honestly really surprised you guys didn't put that in a normal desktop to see how well (or not well) it performed. Personally thought that would have been really cool.
@@Innuya They may not perform full benchmarks; however, once they start putting up comparison FPS figures they should at least do some basic troubleshooting when they get obviously incorrect performance like this. eGPUs aren't this bad, and I can't imagine Gigabyte will be very happy with this representation.
Game is clearly trash and just CPU-bound, or, as someone mentioned, powering the laptop via USB-C would also have contributed to lower CPU performance. The desktop blew it away because of CPU speed. eGPUs have their limits and downsides for sure, but this is NOT an accurate reflection of that; it's way exaggerated.
Why did you use it in literally ONE scenario and not try doing renders, or playing games that aren't this new, or simply benching it to get a raw power comparison?
Let's get an eGPU and use it in the most stupid use case, one that doubles its bottleneck. If you have no clue how Thunderbolt works, yeah, don't buy an eGPU. Asus's custom non-Thunderbolt eGPU is what the industry actually needs, but it's still only 8 PCIe lanes. We still need a full x16 solution that connects directly to the CPU.
This is a terrible and lazy assessment of the eGPU. You only tested 1 game on 1 laptop, and (on camera at least) put in zero effort to determine the cause. There are so many things that could have been causing that poor performance other than the eGPU. I know this is only Short Circuit, but I expected even a little diligence. It's clear that you don't think eGPUs are worth it, but they make sense for some use cases. Aorus was probably pretty pissed at how quick they were to dismiss and insult the product without giving it a fair assessment.
They didn't bother to test the GPU with another laptop or other games... I feel like there's something wrong with that setup, driver/compatibility issues or something... Very unprofessional, even if the product doesn't make sense.
Idk, it probably has more to do with Valheim than with eGPUs. I got an enclosure around 2017-18, and for me it was a game changer; today I use the same enclosure with a 3060 Ti and get consistent fps in most games. Something to keep in mind is that it's obviously not gonna match a desktop: you've got two bottlenecks, the TB3 port and the CPU in your laptop. I get only about 70-75% of the fps compared to a desktop, but keep in mind that my notebook has a 4-core CPU; people online recommend a laptop with a 6-core CPU if you want an eGPU, but since most laptops running those CPUs are gaming laptops already, I don't see the point. It used to be more relevant pre-COVID, since I commuted to college with a sub-2-lbs notebook and was able to play games at my apartment. Idk why LTT always trashes on them; you're not gonna buy an eGPU when you've already got a laptop that can game at low settings.
While everyone says "put it in a REAL PC", I'm myself wondering: Why wasn't this tested with a Tiger Lake laptop? Ice Lake / Tiger Lake solved many of the TB3 performance issues by finally taking the chipset out of the equation and having it connected to the CPU, and the performance gains are considerable.
I get that Alex likes Valheim, and that's valid, but Valheim is not exactly a great benchmark since it's in Early Access and not optimized. So I'm a bit disappointed with the benchmark.
Testing on an early access game that has frame rate issues depending on how many instances are in the currently rendered chunk (animals, enemies, terrain modifications, etc.) was an odd choice.
I know right, that's what I thought during the whole video "Surely they're going to test a different game or workload" I get that this was a laptop but they could've installed Doom on it at least
@@butterdubs2267 true, it just doesn't help as a reference when his Valheim performance could be wildly different from others. A game that's more featured on the channel like DOOM would have provided something more "stable" to compare to, but I digress. This isn't a review and we already know external GPU's have a performance penalty. This video was about the odd configuration/build. I hope they expand on it.
@@butterdubs2267 valheim doesn't perform consistently... so it absolutely makes a difference. Trying to show a difference in performance is impossible, as the game can perform differently every time you go into an area.... it was a terrible choice of game for a comparison like this!
I would love that. Also, idk why LTT always trashes on eGPUs; I think they miss the point of the value proposition they have. I personally own a setup with a 3060 Ti and use it with a notebook: I can play games at 1080p high settings, compared to not being able to get more than 10 frames. Pre-COVID I was able to commute to college with a sub-2-lbs laptop and game all I wanted in my dorm. It's true that if gaming is the most important thing to you, then eGPUs don't make a lot of sense compared to a gaming laptop or a dedicated setup, especially for the price, but for me the most important thing was having all the convenience of a thin-and-light, with gaming as a side thing.
@@MixableRat90 The G14 is literally designed for people like you... the 3060 in that will perform just like your 3060 Ti after the TB bottleneck and the perf loss from an old 15W mobile Intel chip. Also, 1080p high is not much of an achievement these days... especially with a GPU as fast as a 3060 Ti.
Wish you would have tried some other games. The one you used isn't really optimized yet, so a more common title might have done far better. I would like to see a real attempt at benchmarking this.
@@TheSerbianEmperor The eGPU isn't a standard configuration, so the way the drivers handle it is different. A title that is less optimized is more likely to show how much harder the card has to work; it's kind of an apples-to-oranges comparison. There is a lot involved, like bandwidth and so forth, which is why performance dipped even more when using the built-in monitor. This requires optimization to get the best output over the limited bandwidth. A game that's been around for a few years and has been fully optimized, with matured drivers, would likely have given a fairer impression. It seems he just picked a game he liked and ran with it, instead of using multiple titles to give the unit a fair shake.
Wait, did they only plug the capture device directly into the eGPU and not the monitor? There's no way they actually released this as a product if it performs that poorly. Like, I wasn't expecting it to do great, but not 30 fps. Not recommending it even if it can do better, though.
That power connector is something Gigabyte is using on their 3080s (except the Aorus line), with that flat-to-standard-8-pin plastic adapter jammed in there (their 8-pins are not soldered to the PCB).
Haha, that's the way I read it at first too, but I think they meant they were unemployed BEFORE working at LMG and it is just the name they used for the credits since they haven't passed their probation period yet.
I'd have thought it was inevitable that it'd be hopeless if you ask it to send the resulting rendered video back down the thunderbolt connection to the laptop.
Can we take a moment to appreciate his Valheim home though? Though, as literally every other person has said, Valheim is a very poor performance measure until it gets some much-needed optimization.
Hmm, I would say get an ITX system instead of an eGPU. Also, was this the best way to test it? I don't know enough about Thunderbolt docks or the display out, but I would've thought plugging DP into the GPU and Thunderbolt into the laptop would yield better results(?)
Wouldn't the performance be better if the monitor is plugged directly to the gpu? I imagine there's some extra cpu strain to move framebuffers around, hence the performance drop. Maybe thunderbolt works differently than I imagine
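A rough estimate backs that intuition up: driving the internal display means every finished frame has to travel back over the same Thunderbolt link the GPU is being fed through. A quick sketch (assuming uncompressed 32-bit frames and roughly 2.8 GB/s of usable TB3 PCIe bandwidth; both figures are approximations, not specs of this product):

```python
# Estimate how much of a TB3 link is eaten by shipping rendered frames
# back to the laptop's internal display (approximate figures).
def framebuffer_gbps(width: int, height: int, fps: int, bytes_per_px: int = 4) -> float:
    """Bandwidth in GB/s needed to move uncompressed frames at a given rate."""
    return width * height * bytes_per_px * fps / 1e9

TB3_USABLE_GBS = 2.8  # rough usable PCIe-tunnel bandwidth over TB3, GB/s

for w, h, fps in [(1920, 1080, 60), (1920, 1080, 144), (2560, 1440, 60)]:
    need = framebuffer_gbps(w, h, fps)
    share = need / TB3_USABLE_GBS
    print(f"{w}x{h}@{fps}: {need:.2f} GB/s ({share:.0%} of the link, "
          f"before any game traffic)")
```

Even 1080p60 costs about half a GB/s of return traffic, and at high refresh rates the share grows fast, which is why plugging the monitor straight into the eGPU usually performs better.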
I saw a teardown of a 3080 somewhere on YouTube where the 8-pin connectors were mounted at the end of the heatsink and then connected to the board with connectors similar to these. It could be a standard card in the end. (Could have been on der8auer.) The "don't buy an eGPU" tip was too late for me. I love the concept of eGPUs, and after seeing some videos of them on LTT in 2018/19 I thought they were usable. So at the end of 2019 I bought a Dell Inspiron and a Razer Core and put my 1070 in it. It sucked! Audio problems; VR lost tracking all the time; the Nvidia driver took too long to handle interrupts; everything time-sensitive was unmanageable. In January this year I bit the bullet and sold the laptop and the enclosure at a loss of over €500. Bought a 3600X instead and I am really happy now, still using the 1070 though.
You definitely did something wrong because I've done the whole egpu thing and gotten excellent stable fps. My guess is a driver issue / power issue or cpu issue.
Yeah, all the techtubers are excusing themselves nowadays for making content about things you can't buy. It's been some shitty months for tech, to say the least. I was lucky to grab a 3090 before Xmas. But I'll still watch the videos 😉
@@fredrikstad01 I literally built my rig just before the shortage started and oh boy am I glad I did. My rig is up in value 40/50% now just for the CPU/GPU themselves! (ryzen 5 3600X and a mere Radeon RX 5500XT 8GB which now retails for the absurd price of ~600USD)
"That's like pretty extreme"? Try Borderlands 3 at Badass settings in 1080p with vsync off (you should be at 200+ fps with a 3080, and this will cook your GPU way worse than FurMark or whatever). Did not get much info out of this, so disliked, sorry; the topic is too complex for a "let's try this for 5 minutes and call it bad".
I think an eGPU is aimed at a very small group. I belong to this group, and here's why: I have a very big workstation (Threadripper, 128GB RAM, sadly still a 1080 Ti; waiting, praying for a 3080 Ti?). This thing alone is about 50kg in a CM Cosmos C700P. But I also need a small, light, still powerful laptop, so I chose the Blade Stealth with a GTX 1650, which is OK for light gaming. Now for my workshop I just take the Blade and plug it into the eGPU/dock, and everything works well with just 1 cable. And for LANs this setup is light and "portable" compared to my workstation. Performance-wise, depending on the game, I lose almost no performance; in most cases it's around a 5-10% fps loss compared to the desktop. I am very happy with this eGPU setup and it works for me. In your case, consider the CPU and everything else; just comparing e3080 vs 3080 is very lazy. I am pretty sure you were CPU-bottlenecked.
Hmm. You gave the impression that this eGPU is trash when it's in fact fantastic if you know what you're actually doing with it. Yes, it won't give you full RTX 3080 desktop performance but in every single game I play, it more than matches a 3070. I use it with a Razer Book 13 and even using the laptop display it's amazing. Just make sure you have the latest drivers installed and you're using the extremely short Thunderbolt 3 cable it comes with (the longer it is the worse it will perform) and you can't really go wrong. Also always connect it to an external display from the back of a gaming box if you want to get the most out of it.
Even though this has been true for like, two decades, I feel it needs repeating: 👏DON'T 👏 BUY 👏 GAMING 👏 LAPTOPS 👏 It has never made sense. If you can afford the price of a gaming laptop (and insist on wanting laptop mobility), then buy a gaming desktop AND a cheap laptop for travel. It will be cheaper & better.
Can someone help me? When I tried to run a game it showed an error message (0xc000007b). I googled it and tried every method, but it only got worse. The things I tried: installed multiple Visual C++ redistributables (both x64 & x86); replaced xinput1_1.dll, xinput1_2.dll, xinput1_3.dll (both 32- and 64-bit). After this, the games that used to work also started showing the same error. Pls help.
"If you push them really hard they might consume 300 watts" Yeah you might wanna double check that buddy. They sit around 330 watts in games and can spike over 400
@ShortCircuit ... I believe Alex is mistaken in assuming that the eGPU is made so that you can bring your GPU with you. The main target audience for eGPUs is people who use laptops as their main computers but still want to game properly, i.e. with lots of eye candy, high frame rates, and probably a larger screen. By having an eGPU at home, they can still opt for a comparatively light and nimble laptop suitable for use while traveling. Best regards
Yep, FPS consistency in Valheim is just not a thing at all, even if you're in the exact same area... My system utilization never goes beyond like 70% unless I'm out on the ocean, and at that point my GPU hits 90+% and the FPS goes beyond 100. For the most part it sits around 45-70fps (completely random) on an RTX 2060, depending on how much terraforming/building was done in an area. The game needs major optimization.