@@florianlucs7229 what if they hit the die, a memory chip, or some power management circuitry? If it's a multilayer PCB, they could have mushed some traces together internally. There's a lot more than soldering traces back together.
I found an easy way to split a GPU between two systems on Windows 10. My brother and I used one GPU to play Borderlands 1, 2, and the Pre-Sequel in multiplayer. We made two users on my desktop, ran Spacedesk, and installed Parsec on the second user. I logged into the second user, connected it to his laptop, then switched back to the first user; he used the laptop's mouse and keyboard to play over Parsec while I played directly on the desktop. His laptop is really low-end: a fanless Celeron with under 4GB of RAM and a hard drive. We played every game smoothly with this method. No VM, nothing complex, it's simple.
Or you could do what I did, get a 64-Core Epyc Rome, 256GB of DDR4 ECC, 3x Titan X Pascal and 10 1.2TB Enterprise SSD to run 12x Gamers off one rack mounted system. Awesome video Colin (and Linus). This tech is incredibly powerful for distributed computing, and I can't wait to see what the future holds.
Does this actually work? Meaning using multiple GPUs and doing this? Some friends and I often meet up for League of Legends and games like that, usually with 5 people. Instead of them all bringing their full towers, they could grab any ultrabook, someone else brings a second GPU, and we could chop it up. Like, I have a 3080 and my friend has a 3090. We could take 2 streams from his GPU and one from mine and it would still work?
I was about to buy one of those GPU servers to start up a VM provider; I almost bought the one with six 2070s, thinking I'd host at least 6 clients for now... That was when splitting GPUs wasn't really a great experience yet, so I ended up just waiting on it, stocking up on gold while the world started burning down. I kinda regret it considering GPU prices right now, so now I have to wait for them to go down again, and then I'll do this except split them in two!
this might be an interesting one for households with a kid or two kids that are looking to move from console to pc. Having a hub pc for the household is an interesting concept.
I'm surprised Nvidia hasn't jumped on this; it would encourage people to buy more expensive GPUs. I own an RTX 3080 but would happily waste the extra money buying a 3090 if this was possible without the workarounds.
Well yeah, but I just thought of a better idea for home. My wife sometimes wants to play Civ 6, but not that often, and when she plays, she likes to do it on her somewhat older and slower laptop. I haven't had the need to upgrade it because she doesn't play that much, but she could log in to the VM from her laptop and run the game there. This gave me a good idea!
@@AndresUffert2 Exactly. It still looks a bit too fiddly for me to bother but I can really see this being a great way to make a home server (even more) worthwhile.
I mean, looking at performance numbers, I’d expect that any midrange card would still significantly outperform this VM. Plus, if you’re playing competitive games like CS:GO, the latency would likely be unbearable.
Do you see them running 4 games in 1 instance being playable for 4 players? And the game they're running is CS:GO, an extremely easy game to run... Just go buy 4 Haswell systems and GTX 970s, cheaper than 1 3090. The only reason you're even seeing a video like this is because Linus was paid to make it by NZXT, not because it's a good idea or good value.
Oh most definitely. Exam browsers as well. Took a PSI remote exam and their exam browser detected that I had VirtualBox installed. Even though the service was disabled and all. Only thing that worked was uninstalling it.
I gamed in a VM with PCIe passthrough for a year until some of my most played games stopped working because of anti-cheat, and there were some quirks that were annoying too.
I feel that Linus' experiences as a kid of being broke and wanting to game, coupled with his experience as an actual company owner, have really fed his want/need to get around arbitrary BS like this. It's really, really solidly good for gamers.
I think leaving the host's VM preview window open is hindering performance a bit during the demo. It should be closed and the VM started from the menu option instead of through the preview, for a more realistic result.
@@wakannnai1 I don't *think* the 3090 is able to max out 16 PCIe lanes; we have yet to see GPUs hit those limits. Typically those limits are only reached with enterprise hardware (networking, storage, etc.). The biggest bottleneck is undoubtedly Windows overhead (as Windows is running 4 times on the same CPU/GPU), with the second likely being storage (though that should not affect in-game performance). Either way, like Linus said, it is never going to compare to running on bare metal as long as you have to connect your display and peripherals over a network, no matter how fast your network is. If Nvidia opens up the driver (unlikely) to allow VM assignment to the GPU outputs, that would drastically improve performance... BUT Hyper-V would no longer work for that, as it works differently than other hypervisors: you cannot assign specific hardware/ports to a VM, only allocate their resources. Though it's possible that is the reason this works at all.
In a shared GPU environment like this, the host always takes precedence: if the host needs GPU time, its work is served first, then the VMs'.
One of the biggest sources of latency is queue submission and having to re-instantiate parts of the pipeline. Now, I haven't investigated how paravirtualization works here, but each game submitting its own queues is entirely unavoidable, because those submissions are specific to the geometry being rendered, which is obviously different for each game. This is why dividing a video card across users will never scale well, and you will always see a bigger reduction than half the framerate. There are other major factors too, such as many more cache invalidations (the geometry between frames is different) and more memory contention, since the access patterns of the different workloads are scattered less optimally.
@@devintolliver92 GPU tech is advancing by keeping or even extending the rendering pipeline, absolute performance is going to increase over time but relative performance will stay stagnant
Does this mean that if you ran this setup for, say, 2 users, you'd have to settle for a bit less than half the gaming performance on both instances, instead of exactly half, to account for the overhead of this workload not being as predictable and easy to schedule as normal gaming? In that case I'd love to see how it fares with a benchmark running at the same time on both instances once set up properly, just to find out roughly how much overhead to expect for each configuration.
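That overhead can be backed out of benchmark numbers with a one-liner. Here is a small sketch; the FPS figures in the example are made-up placeholders, not measurements from the video:

```python
def split_overhead(host_fps: float, per_vm_fps: float, vm_count: int) -> float:
    """Fraction of total GPU throughput lost to virtualization/scheduling,
    given the bare-metal framerate and the framerate each of `vm_count`
    concurrent VMs actually achieves."""
    return 1.0 - (per_vm_fps * vm_count) / host_fps

# Hypothetical numbers: 400 fps bare metal, two VMs each getting 170 fps.
loss = split_overhead(400, 170, 2)
print(f"{loss:.0%} of total throughput lost to overhead")  # 15%
```

Running both instances' benchmarks simultaneously and plugging the results in would give the per-configuration overhead you're asking about.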
I think it's worth mentioning that some anticheat software (Vanguard, etc.) rejects running the game they are bundled with in a VM. Maybe there are workarounds for the particular game, maybe not, but don't expect to be able to play every AAA competitive game with this setup.
@@MikkoRantalainen not all AAA games are played competitively, but I get that your answer is directed at Alejandro's "but don't expect to be able to play every AAA _competitive_ game with this setup." For just an evening match or 2 in games that are really restrictive with anti-cheat, you most likely already have a dedicated machine and no need to share a GPU. The GPU sharing seems more useful for setting up some sort of couch/home media machine, or for someone else in the same household (to lower latency) to play something together.
Those anti-cheats also have a tendency to reject Parsec's input, or sometimes even its screen capture. Even without being in a VM, Vanguard isn't playable remotely.
@@krisavi With "AAA competitive game" I was thinking on games that value competitive integrity in a way that makes them resort to this kind of anticheat software. An "AAA competitive game" doesn't necessarily have to be a first-person shooter, where reaction times are paramount, or similar fast-paced games, but yeah, both of you are right in that such a setup won't offer the best experience on such games even if it worked. What I think it is more important is that games with this kind of anticheat won't run at all. You can't even choose to play private, custom matches with your friends to have a good time, where some extra latency doesn't really matter.
I think the idea behind the laptop is good, but the build quality needs some work, especially the hinge. Check out how much the screen shakes compared to the other laptops.
I was doing academic work in the virtualization high-availability field circa 2011... For years opinion flip-flopped between hardware and para in terms of what provides better performance and what the future would be. At one point around 2015 it seemed hardware would win for good. I personally always felt para was a more elegant and cooperative design, though back then it wasn't available for Windows.
I doubt you would see a difference in performance on Windows Server, but you would gain the ability to directly map hardware to a VM…which granted, isn’t really the goal here.
I'd think that the host OS being Windows server might help the VM performance, but then I'd expect compatibility issues playing games directly on the host.
@@brychaus9059 you're probably right about host gaming issues, but for a home-server use case, the three (or more maybe?) VMs would be enough for the home. Who has more than 2 friends anyways lol?
@@christopherprevost763 Yeah I see what you mean, but the host got so much better performance in LTT's testing that I would want to use the host to take advantage of the performance.
@@brychaus9059 Host will always have the least overhead for hyper-v since guests have the hypervisor as additional overhead. I'm going to give this a shot over the next week or so with one VM as a proof-of-concept for my home setup.
This is actually a great option for trying out windows 11 from a windows 10 gaming PC. Parsec makes it a much more native feeling experience than hyper-v alone.
It should be noted that having these instances virtualized means you will lose some performance due to the overhead involved in virtualization. How much depends on things such as how efficient the hypervisor is.
The sped-up, step-by-step screen capture of the VM being spun up in Windows is genius, and should be standard for content like this that's technical and takes many steps; maybe even queued up a bit earlier during talking points. It's almost like the first time I saw a teardown on GN where the unscrewing etc. was chopped up and played back at normal speed. Kudos to the editor of this vid.
“You’ll need a Windows license for each VM” — well, not exactly. The host computer needs 1 Enterprise license and each client must also run Enterprise. Professional doesn’t cover delivering a remote Windows desktop... and oddly, MS cares how many computers run the desktop, not how many instances of Windows are running.
@@jammiewins sure, that was my point. Don’t license the VMs, doing so doesn’t actually make it valid. Either way you are just trying to fly under the radar.
@@finnderp9977 correct, and in that scenario you basically have two licensing options. Licensing named users, or licensing devices. And contrary to what is logical, licensing the device means licensing the client device, not the server or the VMs it is running.
@@finnderp9977 All you need is a valid license for the operating system you're using as guest. Windows 10 has a limit of 32 virtual processors allowed.
"You should get 4 different Windows licenses. Definitely don't just spin up a new one when your license expires." *bites lip nervously* Thanks for telling me what NOT to do Linus, I will definitely NOT do this if I try this build.
Every time I hear people say activate your windows, I laugh at all the shop computers and test benches running the same non activated version of windows 10 pro that I did in a single night on like 5 towers 🙃
microsoft doesn't deserve your money 1 time, let alone 4. For an inferior OS filled with adware, they don't deserve $100. Linux is free and the only disadvantage is lack of third party support, which microsoft have done nothing to deserve.
@@itdepends604 to be fair, Windows licenses are practically free with any prebuilt computer... I wish they'd just move to a freemium model and let me disable ads, automatic updates, etc on a pro license. Linux gives you choice but the average consumer isn't great at dealing with tech issues by themselves.
@@DelphinusMAch1 Windows licenses are NOT free with prebuilts. While OEMs obviously get bulk discounts, it still adds around an extra $50 to the cost of the computer. Almost all of the problems with Linux are software support. Apart from this, Linux is actually more stable than Windows. People are obviously going to find it harder to sort Linux issues if they're used to Windows, but Linux issues generally aren't any trickier to solve than Windows issues. So I completely stand by what I said. I am not paying a useless company to use my already-paid-for software when they are already getting paid by their adware anyway. My VMs will stick with unregistered Windows.
I've been doing this for years on linux, using the DRI subsystem to shard a single GPU between multiple users on multiple seats. Works best if you either use integrated graphics or toss an ultra-low-end graphics card in to get more display outputs. You *can* drive multiple window systems off a single GPU, but it gets a bit unreliable compared to one-gpu-per-seat. You can usually get suitable GPUs for about $15 each. And without the overhead of a full VM stack for each copy, the performance hit is much less.
Is there anywhere I could read about this? If I understand right, your main GPU supplies the processing power, but the extra, crappy GPU supplies an output port, yes?
@@prydzen this is actually how the project Linus featured works: the Basic Display Driver/Hyper-V Display handles the display, and for graphics it's your GPU.
I would love to see how this would run on an AMD RX 6000-series 16GB card. I get that the Ti is powerful, but I want to see the power that AMD can output on a similar workload.
16:23 "Windows License For Each VM" This is where gaming on Linux can really take off. Spooling up 4-8 instances of a game with just one box could bring LAN parties back to mainstream.
Amazing timing for this video. My GF and I have just started to play some games side by side in the evening. I‘m playing on an R7 5800X + 3080 Ti tower, she‘s using a Surface Pro 4. I was almost thinking about getting some used hardware for a few hundred to get her a better experience. Now I can just share my system. I’ll definitely try with my gaming buddy later on, he’s rocking a GTX 560 Ti… Thanks so much!
@@marcogenovesi8570 From a quick google, the passthrough functionality works on AMD too (and they never blocked it off). But it might need to be done in another way than in this video.
CS:GO was designed using an older API, DX9, so it's harder for the 3080 Ti to run than it would be if it was designed to use DX12. This is a common issue with modern hardware running older games. Nvidia and AMD optimize modern GPU hardware to run modern APIs.
I've seen forum posts asking why ignore Aster; a response, IIRC, was "Windows could potentially patch that out", yet here we are on the cusp of W10 going EOL. I wonder why Microsoft doesn't want to support native multiseat; their license keys are constantly on sale for under 10 dollars. The Linux Challenge overlooked multiseat too. Pls Anthony, show them how Linux does multiseat + Proton.
But Aster seems to have issues running multiple instances of the same game at once where that's not allowed, as is often the case with games. Or am I wrong?
@@lugaidster ah thanks. Tried Sandboxie before and it didn't work for the games I wanted to play, so I used the Hyper-V approach with GPU paravirtualization.
So they're steaming the signal to their laptops, but the processing is being done by the desktop in the background!? Holy crap that is awesome for lan gaming sessions!
I see what you did their….. ”steaming” 1 game to 2 laptops from a desktop. 😂 (I’m sure that was probably a typo, but it actually works out since they are indeed using steam….also totally not trying to hate or anything but I’d write “lan” = LAN . Since “Ian” looks exactly like my buddys name “ian” as capital i’s and lowercase L’s look identical.
That little bit of overhead is surprisingly small, and less impactful than I'd imagined. But yeah ima just save myself the money and throw another RX 580 in my rig when I decide I need solutions to problems that don't exist
@@waderyun.war00034 I may be wrong but i think hes talking about the imaginary problem being nvidia's lock on splitting gpu, not "1 pc multiple gamers"
I've been researching this on and off for forever. This was the main thing keeping me from switching to a fully virtualized setup. VFIO time, maybe! Time to actually watch the rest of the video to see if it's practical.
You can do that with Looking Glass, or VFIO is also an option. SR-IOV (the splitting of the GPU) is somewhat of a hack right now; unfortunately Nvidia are still buttholes about it. But you can make it work, and even share a basically-hardware connection to it with Looking Glass. Or you can do VFIO passthrough if you have a few GPUs laying around (lol)
I'm guessing the 3080 Ti is splitting the memory into 4x3GB. 3GB is enough to run Doom Eternal and Halo Infinite without vram limitations but 3 is a minimum, there just might be a purpose for the 3090 having 24GB after all.
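The even-split guess works out as below. This is a trivial sketch; whether Windows actually partitions VRAM this evenly (and whether it reserves anything for the host) is an assumption:

```python
def vram_per_partition(total_gb: float, partitions: int,
                       host_reserve_gb: float = 0.0) -> float:
    """Evenly divide a card's VRAM across guest partitions, optionally
    holding some back for the host."""
    return (total_gb - host_reserve_gb) / partitions

# 3080 Ti: 12 GB split four ways -> 3 GB per guest
print(vram_per_partition(12, 4))  # 3.0
# A 3090's 24 GB would leave a much roomier 6 GB per guest
print(vram_per_partition(24, 4))  # 6.0
```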
It would be interesting if instead of splitting the VRAM up evenly if it could detect if multiple clients had the same game open and have a shared memory pool so they could share the vram.
This is an intriguing idea for my wife and me (both architects WFH): one workstation, two architects. She's remoting into her office and I work locally, so this really could work even with a 1060 & 8700k.
I'm curious - could there be a 'comparison' in cost between having four individual systems (with similar performance) to having this sort of setup with four virtual machines? I mean, is it actually cost-effective to run one powerful machine over four not-so-powerful ones? What about the cost to run the systems (power draw, etc)?
From some quick math and assumptions: assuming you get the GPUs at MSRP and buy all new parts, the expensive system (3080 Ti, 5950X, 32GB RAM) and 4x cheap systems (GTX 1650, i3-10100, 16GB RAM) would be within a couple hundred dollars of each other in either direction, depending on exact parts chosen. If we swap the 3080 Ti for a 3080 and go off MSRP, the expensive system would be much cheaper, and I don't expect performance would change too much.

As for power consumption, they probably wouldn't be terribly far off from each other, but this is really hard to estimate without actually building these systems and getting exact power numbers (it's not enough to just add the parts' max power limits together, since they probably won't push to the complete max, and the PSU chosen makes a difference).

Of course there are multiple ways to go about this; for the cheap system I chose parts that made sense to me, but someone else could choose cheaper or more expensive parts. If you go used, the story is completely different, and 4x cheap systems would most likely come out way cheaper than a single expensive system, but power consumption would probably favor the expensive system in that scenario. If you have any questions about any specifics on the systems or rationale, feel free to ask.
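The comparison above can be sketched as simple sums. All prices here are rough illustrative guesses, not real quotes or the commenter's actual figures:

```python
# Illustrative build costs in USD (hypothetical MSRP-era guesses).
shared_rig = {"RTX 3080 Ti": 1200, "Ryzen 9 5950X": 800, "32GB RAM": 150,
              "board/PSU/case/SSD": 450}
cheap_rig = {"GTX 1650": 150, "i3-10100": 120, "16GB RAM": 70,
             "board/PSU/case/SSD": 300}

shared_total = sum(shared_rig.values())      # one big 4-seat machine
four_cheap_total = 4 * sum(cheap_rig.values())  # four separate PCs

print(f"1x shared rig: ${shared_total}")
print(f"4x cheap rigs: ${four_cheap_total}")
print(f"difference:    ${four_cheap_total - shared_total}")
```

With these placeholder numbers the two approaches land within a few tens of dollars of each other, which is consistent with the "couple hundred dollars either way" conclusion; swap in real street prices to get a meaningful answer.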
Way cheaper if you don't all game at the same time. Have two people game during the day and sleep during the night; have two people game during the night and sleep during the day. 2x performance for a cheaper price.
@@taylervest6594 depending on how many, that might require a much beefier host, in terms of available PCIe lanes more than anything. I've run into this problem with my 3900x based server, add two GPUs, 10gb network and a disk controller there's not enough lanes. I'm upgrading to first gen Epyc for this reason (I can also consolidate some of my other gear), 128 lanes all the way.
The idea of running multiple game instances on one machine reminds me of the days of split screening locally on older game consoles as a kid. Would be a killer config for LAN parties.
This really makes for some nice use cases. I have two kids, and using parsec on a pair of raspberry pi’s to play some local games with them sounds amazing.
Be aware if you're using other VMs already. When enabling Hyper-V, other VMs like VirtualBox will not work anymore. So if you use the same PC for work and fun don't panic on monday morning
Windows 11 comes with Windows Hypervisor Platform which enables VMWare Workstation to coexist with Hyper-V w/o disabling one of them beforehand. IDK if VirtualBox supports it.
@@Skyline_NTR I ran into this issue when I activated Hyper-V on Win 10 in late 2021. Maybe if it's supported, at least some additional magic is required to make it work. I did not dive too deep into this issue because it's my productive system and I did not want to mess any further.
Technically wouldn't there be three instances of the PC running, like the original and then the two virtualizations? Maybe it is holding back a little in case the original instance needs juice? I honestly do not know anything.
It helps to know that Parsec has quite a heavy GPU overhead in addition to whatever the game requires. That said, I LOVE Parsec and use it to game on my main PC from my old laptop. It’s a great gaming experience. I’ve also used a dummy HDMI and sent my physical output to the big screen while remotely connecting to the secondary monitor. That was my karaoke setup, so the big screen has video output and my laptop is great for the control software.
Hey Linus & crew! How does this compare to Unraid? Any performance difference? Also, is it possible to locally access the stream (to get rid of latency) and, for example, have one PC with an isolated gaming installation and an isolated work installation? Very interesting topic 👍🏻
So, since Nvidia prevents partitioning on consumer cards, would this be theoretically easier on a 6900? And would that work for splitting a hardware port per virtualization?
The GPU isn't being split in a traditional manner; Windows is doing all the work. So long as the drivers are set up right, either one works. I recommend Nvidia though, since its encoder is higher quality and lower latency, making it better for Parsec.
The AMD card? Yeah, I looked into this because I want a home virt server, and sadly instead of the hardware being locked like on Nvidia, on AMD it's just not there. They have had vGPU cards, but those cost way too much for a card with crappy performance.
Talking about how Nvidia locks down this functionality while ignoring how this might work on an AMD GPU is a great way to give credit to the "LMG is owned by Nvidia" people
GeForce Now means very little to Nvidia’s balance sheets. Quadro card and workstation sales on the other hand are huge. Anybody doing anything related to machine learning and AI probably runs CUDA on an Nvidia GPU, and they’d rather we didn’t buy the much cheaper GeForce cards to run all our compute instances.
@@benjaminoechsli1941 GeForce NOW is and always was free... it's just that you have to wait in the queue for 10-20 minutes to get 1 hour of playtime. Paid plan skips queue, gets you 6h sessions, 120fps and RTX. However, if you want to play singleplayer games you don't have to wait with free subscription. When I had a slow computer I used to play HITMAN there, with a free subscription. Never had to wait. Multiplayer games had queues, though.
@@mediumplayer1 Correct, but the point of this video is multiplayer 100-200+ fps RTX gaming, which would require a paid subscription if you don't want queue times.
Nvidia has quietly removed some of the concurrent video encoding limitations from its consumer graphics processing units, so they can now encode up to five simultaneous streams. So now you can have 6 players (one local + 5 VMs). But I think you definitely have to have 24GB of VRAM for this :)
I would love to see technical videos like this, but for running more basic things on Linux. This takes me back to when I was learning how to use Windows as a kid.
I'm already doing this with Windows Remote Desktop and RDP Wrapper. It needs a bit of tweaking in the Windows group policies and registry, but once it's set up it works great. I've only tested with one extra user gaming at the same time as the host, though.
So I just got done trying to set this up, but was having issues with connecting any extra controllers, such as my Xbox controller and USB steering wheel. Have you found any good way to connect these through RDP, or anything?
@@paularie2202 for some reason I wasn’t able to get that to work. But what I was able to do (for anyone else that reads this) was start a RDP session and then use Parsec to connect my peripherals. This allowed for a seamless connection, but also made sure that I could still use my host computer for other activities at the same time.
Could CS:GO have been memory bound? You have a great GPU and CPU; could it be the RAM speed being overwhelmed? Could you try a CPU with quad-channel mode?
That's a nice way to do this stuff, props to Parsec! I used to do it with Aster. It's very light on resources and disk (because everyone is using only 1 Windows install and the same installed programs, just with different users), and it doesn't have this lag/frame drops/compression, since it all runs on the host, no VMs. The only downside of Aster, AFAIK, is that only 1 person can run Steam properly: to run 2 or more Steams with this method you need Sandboxie, and some games (VAC/EAC) won't work inside Sandboxie, but everything else works (Ubi, Origin, Riot Games...). Guess I'll try this Parsec mode to run the Steam games that have this problem.
@@sntg_p It's really that good. My cousin lives in a house near mine and we have a lan connection between them, I use a dummy plug in my GPU and he uses SteamLink app in his Smart TV and we play Payday 2 really smooth even with average hardware (1070, 2700x, 16gb ram)
And it's all automatic: turn on the PC and it auto-logs both accounts and opens Steam on his side to connect with Steam Link. If you're going to try this method, remember to set up the CPU cores in Aster, because if you let Windows manage it, some games will try to use cores 0, 1, 2 at the same time and leave the rest idle.
Frankly, I'm more interested in the paravirtualization than I am in anything else. My understanding is that paravirtualization amounts to virtual machines sharing a single copy of the Windows kernel rather than each VM having its own, which makes the paravirtual machine overhead smaller than the overhead of having each virtual machine run its own copy of Windows. (As I said, "my understanding", which is hazy, and I could be wrong.) Does the paravirtualization only work on Windows 11? Can I run the paravirtualization software on Windows 10 if I don't care about sharing a GPU?
This is an awesome way to manage such sad times with GPU "shortages". My concern would be if Nvidia sees this as a thing and limits or nukes it's possibility all together then push it to newer GPUs as a "feature" just so it can be -further- up sold.
@@ikkuranus oh, I didn't see that. Okay, maybe it does work in Windows 10; I swore it didn't used to, as this was a big controversial thing when I first heard about Nvidia disabling GPU splitting on the lower-end models.
Here is almost certainly why the scaling isn't perfect: context switch. When you only have a single process on the CPU, the process runs at rate 100. When there are two processes, there is now a non-zero context switch overhead, so the rate becomes (100 / 2 - context switch overhead) < 50. Same story for GPU.
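That model can be written down directly. A toy sketch; the switch-cost constant is a made-up illustration, not a measured value:

```python
def per_process_rate(solo_rate: float, n: int, switch_cost: float) -> float:
    """Throughput each of n time-sliced processes gets, given a fixed
    per-slice context-switch cost (in the same units as solo_rate)."""
    return solo_rate / n - switch_cost

# One process: the full rate. Two processes: strictly less than half,
# because the context-switch overhead comes off the top of each share.
print(per_process_rate(100, 1, 0))    # 100.0
print(per_process_rate(100, 2, 3.5))  # 46.5
```

The same subtraction applies on the GPU side, which is why two VMs land below 50% of the bare-metal framerate rather than exactly at it.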
This, but I think it's also because the video output is still being rendered and output by the GPU. So the GPU is rendering/outputting every other frame (one for each VM), and then you've got the context switching between the two VMs on top, reducing total framerate.
Imagine when Microsoft allows GUI apps in Windows containers... it would be a dream! Edit: Windows containers CAN run GUI apps; they CAN'T, however, use RDP.
Now that I think about it, could a dummy HDMI plug solve the RDP problem? Another thing, for those who want to understand the benefits of containers over VMs: - Lightweight - No need to partition the GPU - Run many instances of Windows and reuse the kernel (it is kinda like running a light Windows) - Small (really small) performance loss
@@shadowtheimpure But I am not really talking about Parsec, I am talking about using Windows containers instead of VMs for this setup. In fact, you can use Parsec with containers or share your host display (on Linux containers at least)
You guys should make a video about the new 3D printed OLED. A research group of scientists at the University of Minnesota Twin Cities has succeeded in 3D-printing the entirety of a flexible organic light-emitting diode (OLED) display for the first time. I know it's only 1.5" and only 64 pixels. But still looks promising.
@@ezicarus8216 So what? Knowing that they can already do it should be interesting enough. Plus, things can scale way faster today. Specially if we already have the tech, just making it cheaper. Just like they are already using sodium-ion on batteries. It's way faster now.
@@ezicarus8216 Sorry, did I ever say anything about mass production (tech being ready =/= viable on the market)? I did say that things can go faster nowadays. Even the batteries are about to get on the market, only because now they look like a good option. We had solar energy while we used coal, but the latter was cheaper and easier. Back to 3D OLED printing... still impressive, and it could be a cheaper option. I never said that it WILL be. You just went way too emotional about this, buddy. Chill = )
@@ezicarus8216 Sure, pal. I just said that this looks interesting and would be a nice option for cheaper screens. Learn to control your emotions, big guy. Have a good one.
@@ezicarus8216 should =/= demand. Not a promotion, just an idea. Linus have some videos about "old tech" being crafted in a curious/cool way. You are way over your emotions. I feel sorry for how triggered you are over a simple idea for a video. Please, do yourself a favor and play outside more often. Cheers!
@@ezicarus8216 Oh boy. I think it's better if you get some help. I mean it. Been here on LTT since 2009. You are the first salty, childish, spoiled, triggered boy I have had the displeasure to find. You did it... you are that one guy nobody likes. Congrats! I really hope you find the help you need, hit the books (you desperately need some reading comprehension), go outside, and grow up to be a better, non-trashy human being. The year just started; you can still make this your resolution. I will be reporting you and blocking your profile. Again, have a good one. = )
Did you really never hear about ASTER? It's a very easy to use multiseat software. You can have 1 PC, 2 keyboards, 2 mice, 2 screens, 2 Windows users, and everything can be used independently from each other. Plus you can share the display ports without any issues or a VM install.
@@Lmpy Since you can (or even have to) use Aster with two separate Windows accounts, that shouldn't be an issue - but of course both Windows users need to own the same game in order to play together. I used Aster with my girlfriend and I was so impressed by it - we had ONE PC but TWO keyboards, TWO mice, TWO screens, TWO headsets. I could even have set it up so that each of us had 2 screens, so 4 in total.
Hi, thank you for your great video. Does OpenGL work for you? I can't get it working with DaVinci Resolve on a 3070 Ti and 3080 Ti. Hyper-V on Windows 11 machines.
@@TitanBoreal Moonlight should be able to do something similar. It's an open source implementation of Nvidia's game streaming tech. I was using it wirelessly on the opposite end of my house to play games on my jank laptop before.
@@TitanBoreal My team is actually working on a Parsec competitor focused on general consumer use, where each window is streamed separately rather than the entire desktop. It will be a few more months until we get it out, but it's planned to be free for personal use.
LAN GAMING! We can go back to the days of LAN gaming: bring your friends over with their laptops and get everyone hardwired in. This is a space saver for the best LAN parties!
GPU-P was also introduced in Windows 10 20H1, I think, so it's technically been possible since then. I hope in future updates of Windows 10 and 11 they add a GUI for configuring GPU-P, so anybody can set it up themselves without needing to find and save tutorials every time they want to create one.
I’m interested to know how this works for VR. My wife and I both have a Quest 2, and I stream my games wirelessly with Virtual Desktop, but she just plays games natively on the Quest. It would be good to both be able to play PCVR together without having to buy a new PC with GPUs you can't get anywhere.
@@I_am_Strap Makes me want to try it, even just with my 3060 Ti. Maybe 90 fps per Quest is asking too much, so 72 fps would be more realistic. It would be pretty neat with much better hardware if you had the specs for 90 fps each - and it depends on the games too.
Any idea if this would work on AMD video cards, or is it Nvidia exclusive? It could be an interesting option for when friends come over wanting to play games but don't have their computers with them, or something of that nature.
It's possible to do this with any kind of graphics card, AND without configuring VMs at all, using software called Nucleus Co-op - which sadly has no Linux release, as the devs don't see enough interest atm.
It looks so funny when all the guys in the background (except for the one on the TV) just freeze and look at the laptop while Linus does his outro at the end of the video.
Parsec provides some statistics on this. Typical latency values are: decoding 3-5 ms, encoding 2-5 ms, network 1 ms. So around 10 ms total just for running Parsec. Surely there's some more latency on top, and I wouldn't do it as a competitive player, but most games run fine.
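Adding up the figures quoted in the comment above gives a quick sanity check (the decode/encode/network numbers are that comment's Parsec estimates, not independent measurements):

```python
# Rough added-latency estimate for game streaming, summing the
# per-stage ranges quoted in the comment (all values in ms).
decode_ms = (3, 5)   # client-side decoding
encode_ms = (2, 5)   # host-side encoding
network_ms = (1, 1)  # LAN network hop

best = decode_ms[0] + encode_ms[0] + network_ms[0]   # best case: 6 ms
worst = decode_ms[1] + encode_ms[1] + network_ms[1]  # worst case: 11 ms
print(f"Streaming overhead: {best}-{worst} ms on top of normal render latency")
```

That 6-11 ms band is well under a 60 Hz frame time (16.7 ms), which fits the comment's conclusion: fine for most games, noticeable for competitive play.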
Now try to do the same on Linux with Wayland (or X11) seats (basically a bundle of input devices that get their own focus, mouse pointer, etc.). Not sure what to do about starting Steam multiple times, but just running multiple instances of the Steam Flatpak would probably work. That way you should actually be able to use the native display outputs. Also, please use an AMD graphics card if you test this. :D
This is actually incredible. I remember trying to replicate the 2 Gamers 1 CPU build you did ages ago when I was in high school. And now I can try to do the same thing again, but with my current build and only one GPU. Also, that wasn't hacks - it's a very common spot to just spray, and they got lucky. Pain.
"Anno 1800, not the most demanding game" - yeah Linus, sounds like you have a healthy work-life balance, because that game gets HUNGRY in big saves when you're running skyscrapers and thousands of investors.