@@DawidDoesTechStuff Aside from the fact that you used a 30-series, it's pretty impressive overall. I'm guessing you would get similar performance out of a much cheaper, lower-spec GPU. Some people with super small PCs may be able to take advantage of this.
@@DawidDoesTechStuff I was interested in trying something the other day, a daisy chain I thought of: SATA to M.2, and then M.2 to PCIe x4... Of course you would need to run the OS from a fast form of USB. Not sure if it would work, but I was bored and thought of trying it since I already have a few different SATA to M.2 adapters. Sounds like a cool idea for a revisit for you though, and it would cost less than $40.
How about a Core i9-10980XE with two Voodoo 2s in SLI? I'm pretty sure I've seen motherboards with something like an H310 chipset and ISA or PCI slots, just because of some industrial machine that still needs them.
@@user-le8ul4nr5t An i9 strapped to an SLI or CrossFire configuration is ludicrous! Especially with any of the recent GPUs. Every setup I did with SLI or CrossFire was easily beaten by a single-GPU setup in nearly every situation. Games don't like to do SLI anymore; all the extra data throughput during processing SLOWS DOWN the output, because the frames have to be muxed together, adding extra work that wouldn't otherwise be present in a single-GPU setup.
@@beaumontlivingston8084 We're trying to achieve the biggest bottleneck... Also, a Voodoo 2 isn't exactly a new card; SLI still made sense back when it was invented.
@@Izmond Nah, he just tries to act cool... actually he's a good boy, trying to help old, disabled hardware in ways it was never meant to be used... it's just the result that looks cruel.
I'm not sure, but I would assume that games that tend to be more CPU-bottlenecked are going to have relatively higher PCIe bandwidth usage and thus perform relatively worse, and that something like a GPU TECH DEMO that tends to buffer more stuff in VRAM would do relatively better... Of course, you can be bottlenecked by a single core, so it's not quite that cut and dried, but this is mostly for fun anyway.
Very interesting! Would love to see a follow up investigating how much GPU power it actually makes sense to use in a setup like this. I think you're probably right that GTX 1650/RX 570 would be about right, but would be very interesting to investigate!
This type of setup is a lot faster if you are able to connect the EXP GDC Beast via the M.2 Wi-Fi interface. I was able to play GTA V maxed out at 1080p at well over 60 fps using an RX 5700.
@@gamingtalent2888 I do not have video or pictures atm, but I may upload one in the near future. If you're on Dawid's Discord, I'll DM you pictures of my setup with it.
Yup, both the M.2 A-key and M-key models can give you x4 instead of x1, as long as you're willing to give up built-in Wi-Fi and use a USB Wi-Fi adapter (hell YES!)
With that limited bandwidth, you might get very similar performance with something like a 1050 Ti; of course, it depends on the game's bandwidth utilization.
I did this setup for a five-year-old Dell laptop, going from a smart card PCIe slot to the same type of setup. Had a GTX 580 with an i5 2000-something U. This hooked me up until I could afford a new gaming rig. These adapters are so underrated.
The YT algorithm recently presented your videos, and your content is unique. This video is among the most creative and off-the-wall! Keep up the good work. Your recently minted subscriber.
The EXP GDC is actually pretty competent. With the correct M.2 adapter you can actually get four lanes, which really helps. You can power it with the brick power supplies from Dell OptiPlexes of old, use an adapter cable to then power your GPU's connectors from the GDC, and the whole thing powers up and down with your computer.
What you really should be checking, and what I'd like to see, is a video where you check out which graphics card fits this adapter best, i.e. where you lose as little as possible to the bandwidth cap.
Take the plastic cover off the x16 adapter. This allows the card to seat properly and lets the MOSFETs "breathe". I run two adapters like these (that didn't come with that cover) for my Folding@home rig.
You can also do the same thing with a laptop that has an ExpressCard slot, though the same PCIe x1 limitation exists. If you found a computer that used PCIe 4.0 you'd see 2x the performance from just one lane, but most if not all of these have switched to M.2 with at least two lanes of PCIe (WWAN slots might be x1, but IIRC they will only work with WLAN/WWAN cards). I'm assuming you've already done this, but in case you haven't, well, here's a fun project, and you can probably reuse much of this hardware for another video.
My son and I used the beast to build him a gaming laptop with a GTX 960 2gb. He’s 6 and he loved taking the old laptop apart and learning what everything does.
That 3070 looks so good, it's probably cheaper than MSRP, and that's fine because that card is so utilitarian and powerful. I wonder how I can get my hands on one. Imagine if Dell sold these cards, would be amazing value.
The bandwidth also gets cut in half because the rendered frames have to be sent back over the link to the internal display; you could definitely get more performance by hooking it up to an external monitor.
I use an M.2 NVMe to PCIe x4 adapter, and I see around an 18% performance drop with my 3060. I think that's actually pretty acceptable, considering that I mainly use that GPU for TensorFlow, which only seems to perform around 2% worse than on my friend's x16 PCIe slot.
I have the ADT-Link version connected to a DeskMini 110 and it rips... also, the connections and setup are much cleaner, as you use a Dell power brick to run it instead of a whole PSU like that.
@@dakoderii4221 No, I'm talking about using the i7 3770 with a proper cooler and some proper RAM; if you manage to do some overclocking, I bet there won't be much of a bottleneck, maybe 20-30% at most.
I use one of these with the ExpressCard adapter to make my old Dell Latitude more usable in 3D CAD with a 750 Ti. ExpressCard makes it extremely easy to use; it's not quite hot-swappable, but I never have problems when unplugging it or plugging it back in (I do have to turn the PC off, though), and it always recognizes the card just fine. With a lower-end GPU the bottleneck isn't as noticeable; just from experience I would say the loss is less than 10%. Of course the 750 Ti is quite a low-end card these days, but it's an incredible upgrade from integrated Intel graphics, and for something that takes 5 seconds to plug in and one reboot, it's great.
Don't blame the cards, blame the motherboard makers for not putting proper active PCIe x4 slots on everything instead of the bare-minimum x1, even today. For reference, the interconnect speeds are roughly: Mini PCI Express (PCIe 1.x x1) 2.5 GT/s raw (250 MB/s effective), PCIe 2.0 x2 8 Gb/s (1 GB/s), PCIe 2.0 x4 16 Gb/s (2 GB/s), PCIe 3.0 x4 31.5 Gb/s (3.9 GB/s). Even the really old PCIe 2.0 x4 gives a card reasonable breathing room, enough for an x4 10 Gb/s Ethernet card, etc. Demand that all motherboard vendors replace every x1 slot with an active x4 going forward.
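Those figures can be sanity-checked from first principles: take the per-lane transfer rate for each generation, subtract the line-code overhead (8b/10b for gens 1-2, 128b/130b for gens 3-4), and multiply by the lane count. A minimal sketch (the function name and table are my own, not from any library):

```python
# Effective one-way PCIe bandwidth, derived from transfer rate and encoding overhead.
# Gen 1/2 use 8b/10b encoding (80% efficient); gen 3/4 use 128b/130b (~98.5%).
GENS = {
    1: (2.5, 8 / 10),      # 2.5 GT/s per lane
    2: (5.0, 8 / 10),      # 5.0 GT/s per lane
    3: (8.0, 128 / 130),   # 8.0 GT/s per lane
    4: (16.0, 128 / 130),  # 16.0 GT/s per lane
}

def bandwidth_gbs(gen: int, lanes: int) -> float:
    """Effective one-way bandwidth in GB/s for a given PCIe generation and lane count."""
    rate_gt, efficiency = GENS[gen]
    return rate_gt * efficiency * lanes / 8  # GT/s -> GB/s (8 bits per byte)

if __name__ == "__main__":
    for gen, lanes in [(1, 1), (2, 2), (2, 4), (3, 1), (3, 4)]:
        print(f"PCIe {gen}.0 x{lanes}: {bandwidth_gbs(gen, lanes):.2f} GB/s")
```

This reproduces the numbers above: PCIe 2.0 x4 comes out to 2.00 GB/s and PCIe 3.0 x4 to about 3.94 GB/s, while the x1 mini-PCIe link these adapters use manages only 0.25-0.5 GB/s.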
If we assume that was gen 3 PCIe, then the max bandwidth was a whole 1 GB/s in each direction. Talk about crippled performance. I bet the card was doing all of that while taking a nap.
@Dawid Does Tech Stuff This was my setup when I made the switch from console gaming three years ago, except I was using this adapter on a laptop with a ROG Strix 1080.
The most I'd pair with this adapter is something like a 1050 Ti. For newer hardware with an M.2 slot, there's also an adapter that gives you four PCIe lanes, which makes more sense.
The EXP GDC is the only adapter with proven compatibility, and it works the best of all these types of devices. But its practicality is highly debatable, because it's not that cheap once you need the GPU, the adapter, and also the power supply. At that point it might be better to just replace the entire notebook.
Damn, I'm late to the party! But if you ever see this, I'd love to see a part 3 with either a second fan on the cooler or watercooling (it's open anyway). Also, I'm not sure if I'm right, but how much bandwidth does a USB 3 port have? There are two of them (unless they share bandwidth).
You can tell the GPU is practically sitting at idle, given the temps are around 44°C. My 2060 sits at 50°C just watching YouTube (I live in a pretty hot climate, around 35°C most days).
You can also try plugging the adapter into the M.2 SSD slot in a notebook. Also, big thank you for recording the screen with a camera; every lag is pronounced, so it can be seen better.
One really only has to play a game that lets you crank up the post-processing eye candy to the point where the interface bottleneck becomes less relevant than the limitations of the card itself, to achieve peace of mind. 😌