Apple is notorious for shafting people in the GPU department while giving you a great panel. The RX 570 is like having a 1060 3GB; if this were a normal ultrawide I'd say it's fine for 60Hz, but the extra pixels will make it struggle.
jonathon rosalia And it's clocked lower than a normal 570 due to the thin body. Even in Boot Camp, games obviously can't run anywhere near the native 5K resolution; it's lucky to do 1080p60. But the panel has great colour accuracy etc.
@@Supcharged Exactly: great panel, terrible GPU. Thanks, Apple. I don't know what the people at Apple are thinking anymore. I personally just want them to make a standard ITX and ATX motherboard with their SSD soldered onto it (or on an M.2) and let us build the rest of the computer, since they have no clue what they're doing. I love macOS but hate everything it's installed on at this point, for one reason or another.
This is ultrawide 4K. Ultrawide 1080p is 2560x1080, ultrawide 1440p is 3440x1440, and 5120x2160 is ultrawide 4K. We say 1080p, not 1920p; the vertical pixel count is what counts.
But 4K is called 4K because of its almost 4,000 horizontal pixels, so 5,000 horizontal pixels would be 5K. If you called 4K "2160p", then you could call 5K "ultrawide 2160p".
I actually do this. I bought it for my own computer, but my work laptop is 3 years old and just has whatever intel graphics were middle of the road at the time. It works more or less fine.
Not to mention the GTX 1080 or RTX 2060 you'll want to run it. It's the same for me. I recommend you do what I did and move to an ultrawide 1080p; it'll require a lot less power to enjoy while giving you a modern 75Hz IPS with a FOV that'll have you saying... never going back.
The other option is to just wait a year and buy a 3080 Ti (it will probably be $1500, as Nvidia has no competition in the high-end segment). I sold my older 2x 970 cards, as SLI support is poor nowadays. The main reason is that TAA, the most popular anti-aliasing algorithm used in most games today, isn't really compatible with multi-GPU, since it needs the result of the previous frame to work on (it's a temporal algorithm).
@@gencoserpen1260 you think it will take them just one year to release another generation? It took them 2 to release Turing. Maybe higher prices aren't the only thing they will keep doing.
@@leobitencourt4719 remember Turing was released last year though so 2 years will have passed by mid 2020. And yes they will have to release the new generation as AMD will also be releasing their higher end Navi GPUs in 2020 (this year they'll release an RTX 2070 equivalent mid range card). Intel will also release their own graphics card in 2020, we don't know much about it for now. Nvidia has a comfortable lead over AMD so they don't need sweeping architectural changes, just shrinking their node to 7nm will be enough for them to maintain the lead and status quo in high end GPUs.
I just got this monitor and I was disappointed with the color calibration out of the box as well, until I set it to custom 1 setting. The factory calibration is apparently on custom 1 as it looks much better. The only thing that is disappointing and not covered here is the slight green hue you get from viewing the screen at even slight off angles. I mean even your eyes being an inch lower or higher. It is only really visible on the whitest of white backgrounds, but still is enough to kill this as an option for pure creative work.
The technically correct term is "Ultrawide Ultra HD". I try to steer away from terms the average consumer prefers, like 4K or 5K, as they don't specify aspect ratio. Although it isn't wrong to say this is a 5K monitor, it's misleading, as people kind of expect 5120x2880 when you say that. It's something a salesperson would say.
I know it is technically 5k wide, but full 5k itself is very different. I say it should be called 4k ultrawide because it would be much clearer to the average consumer, who doesn't really understand what it means but just knows that 4k is great and probably thinks that 5k must be a big upgrade, when in reality this isn't an upgrade outside of aspect ratio. Yeah it's higher resolution, but the density is no different than 4k.
"k" as indicator for resolution is crap anyway. There is a perfectly fine measure for this already used in Digital Cameras: Megapixels. 1080p 16:9 -> ~2MP 1080p 21:9 -> ~ 2.7MP "2K" 1440p 16:9 -> 3.6MP "4K" ~2160p 16:9 -> ~8.4MP "5K ultrawide" 2160p 21:9 -> 11MP "8K" 4320p 16:9-> 33MP
My old LG 34UM88C-P has been a fantastic monitor, and I only got it for $500 in early 2016 from an open-box special. Overclocked to 80Hz with 10-bit depth and FreeSync. Once you go ultrawide, it's hard to go back.
I'm watching this video in 4K on my LG 32" 4K monitor, and you can see the pixel density of this beast. I know this monitor is meant for creators, and they appreciate straight lines and all of that, but they should have also made a curved model; at 34" ultrawide you find yourself with some distortion in the corners, and a decent curvature would have fixed that.
I find myself wondering what you think distortion means. Usually it refers to geometric distortion - where curved monitors get you cylindrical distortion all over. Then again, you only have correct perspective in one point, and people are rarely at that point (particularly as we have two eyes), so undistorted views are often limited to VR setups.
One thing to note is that most people have ambient lighting between 3000-5000K, and the human eye adapts to white point. The sRGB & Rec.709 modes coming in at around 5600K might be on purpose, a kind of stop-gap for people who don't calibrate their monitors: if it were true 6500K, it'd look (way) too cool to the average person relative to their ambience and throw their senses off. Meanwhile, people who know about this stuff can recalibrate the white point to whatever they want.
I really struggle to understand why LG would refuse to add gaming features like FreeSync 2, LFC and something above 60Hz (since it seems that expecting every monitor to support 120/144Hz is still too much to ask, even in 2019!). My conclusion is that LG wants to artificially prevent the commoditisation of this monitor from gaming enthusiasts buying in volume, which would then lead to price discounts by retailers. The monitor cartel refuses to give consumers the monitors people want, keeping the good ones at ridiculously high prices, and no one seems the least bit bothered by this cartelisation.
Ports should not be on monitors like that. Accidentally yank down on a cable behind the desk and you'll rip the port off, rather than simply having the cable unplug. Ports should face down only, IMO.
How the hell would you accidentally pull any cable out on the back of your monitor? Behind your desk? It’s not exactly an accessible area for most people
What an absolutely FANTASTIC review that was. Thank you.
I guess I could run a custom res on my Dell 5K monitor for ultrawide; it will look pretty damn small though. Until it's this res with 120Hz, proper HDR and sync tech, I'll stick with 16:9 4K for gaming.
I would pick up that new 2K 100Hz gaming super-ultrawide from LG over a 4K 16:9 any day. Resolution is overrated compared to wideness. But that's my opinion as a person who sinks way too much money into wide shit.
Your reasoning for not getting one is identical to mine. I also game on my computer and I don't have the money to blow on a 2080 Ti or Titan RTX. This thing is *perfect* for me and the only ultrawide I'd ever buy (partly because of the lack of a curve).
I've been looking to get a new monitor, and this 5K2K monitor struck my interest. I was looking into ultrawides because I play a lot of single-player RPGs as well as strategy and action-adventure games, stuff that isn't as dependent on framerates but benefits more heavily from higher resolution and a larger aspect ratio. For anything more dependent on framerates and response times, I can always get a second 16:9 1440p144/240 monitor or a 4K144.
Nice review! Can I run two of these monitors with a 2080ti + i9 9900k ? mostly for productivity, just lots of graphs and charts updating several times a second.
I'm not ready to jump to that pixel count yet. You could always game at 2560x1080 on it and maintain native pixel alignment. I know it's lower resolution, but it would make it more accessible for some creators. I run my 4K Momentum in 16:9 for movies and some stuff, but I run 3840x1600 & 3840x1620 (some games work with only one) because I sit closer when I game, and it makes a nice ultrawide. It's still about 40" in UW. Just throwing out ideas.
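The "native pixel alignment" point above holds because 2560x1080 divides the 5120x2160 panel exactly 2x on both axes. A small sketch of that check (the function name and example resolutions are mine, just for illustration):

```python
# A lower render resolution keeps "native pixel alignment" on a panel
# only if the panel divides it evenly, by the same factor, on both
# axes -- so each rendered pixel maps to a whole NxN block of
# physical pixels instead of being smeared across them.
def integer_scale(panel, target):
    pw, ph = panel
    tw, th = target
    if pw % tw == 0 and ph % th == 0 and pw // tw == ph // th:
        return pw // tw  # clean integer scale factor
    return None          # non-integer scaling: interpolation needed

print(integer_scale((5120, 2160), (2560, 1080)))  # 2 -> clean 2x2 blocks
print(integer_scale((5120, 2160), (3840, 1600)))  # None -> blurry scaling
```

So 2560x1080 scales cleanly, while in-between resolutions like 3840x1600 rely on the scaler's interpolation.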
I think it's a fair review... it needs a higher resolution so that it wouldn't be regarded as a single-use monitor by those of us whose desktop is an all-in-one computer!
Would you see the difference between a 32" 4K and a 32" 1440p resolution? I don't really want to kill my FPS, so maybe I could stick with a 2K monitor. I can actually see pixels on a 32" 1080p.
Picked this monitor up recently... experienced nothing but terrible image retention/ghosting. Ended up returning it. Anyone else have this problem? Might pick up another one just to see if it's any better, but for a $1500+ monitor I'd expect performance to be flawless. Admittedly, the picture quality is pretty amazing, but working in any program where areas of the screen remain static for more than 5-10 minutes was definitely an issue for me (brightness set to about 60% to help combat this). The only other time I've experienced a problem this bad was when my iMac screen was diagnosed as defective after nearly 3 years of very heavy usage.
It's not a 4K ultrawide either, because it has 5K horizontal pixels. Calling ultrawides by their horizontal resolution is dumb because it's misleading.
@@robertposteschild2353 technically it's supposed to be a 2160p ultrawide, but good luck getting any company to go with a number that the average customer can't easily recognise lol
Damn, I nearly got a heart attack reading the title... I read it wrong. I have a 34GK950F and a 2080 Ti. All good on my end :) Enjoying this display; it's really amazing to play at 144Hz on a 21:9 display.
Please review the Insignia NS-PMG278. It's not new by any means, or probably the flashiest monitor ever, but I'm curious about your thoughts considering the price (at least in the US; hopefully you guys can get it for similar). It would be cool to see a "budget roundup" of high-refresh 1440p monitors for competitive yet budget-minded gamers, as well as those that do some content creation.
I wanna know what it would be like to have a "partially curved" monitor... so the 16:9 bit in the middle is flat, and only the rest of the 21:9 on both edges is curved. I'd like to see how it would perform in day-to-day usage.
Man, that's a lot of pixels to push, not to mention I usually stick to $150-$300 range GPUs. For the foreseeable future I will definitely stick with my 3440 ultrawide. The 1070 does a decent job of pushing it up to its 100Hz refresh rate, but only because I play less demanding games, and I usually lower settings if needed. My plan is to upgrade to the next generation of GPUs (skipping the first RTX stuff) and keep the monitor until that GPU struggles. For now, I'm very happy with my 100Hz FreeSync 3440 ultrawide.
Considering that I can barely push 120 fps in any recent games at 3440x1440 with my 1080, I'd be surprised if you pushed anything from the past 2 years to 100 fps.
I'm putting parts together for a content-creation rig, specifically for 4K video editing (with the Adobe suite). I have a Ryzen 7 2700X, RTX 2060 and 32GB RAM. I don't do any sort of gaming. Can this combination power the LG 34WK95U? ...with love from Nigeria.
I think it would work fine, but prepare for the live view on the timeline being a bit laggy. You may have to set it to half or quarter resolution for playback in there.
@@Fortespyproductions Your CPU is fine. I'd recommend a faster GPU, preferably at least a GTX 1080 Ti or an RTX 2070. Make sure you have NVMe storage too; that matters a lot with Adobe software. I like the Samsung 970 Evo drives.
@@zacharyc6549 Thank you for your response. I will definitely use the Samsung 970 Evo NVMe drives. From reviews online, I just discovered the Radeon VII from AMD is excellent for content creation... especially Premiere Pro. It supports 10-bit colour too. Unfortunately, AMD doesn't indicate whether it supports 5K displays.
Ultrawides are great when you have to do multi-window work. I use two LG 2560x1080 75Hz FreeSync monitors. Looking to replace them with a 1440p ultrawide when the $$$ is better.
So is it worth waiting for Nano IPS or those micro-LED monitors? I want to spend €600 on a gaming monitor, and I'd like one with good colours. What should I do?
Excuse the disturbance, but I really hope you answer these questions of mine, because I'm very willing to buy one of these two, the LG 49WL95C or the Samsung C49RG90, as a monitor for my PS4 Pro and Xbox One X. I also own an iMac 27 and hope to connect that too. In the future I will also buy a powerful PC, but for now I'm willing to use it with my consoles, though I've read that it is only for PC and for games in 32:9.

Okay, I agree, but in every video review of this monitor the PC games shown were The Division 2, Ghost Recon Breakpoint and some first-person games. What does that mean? I'm not a PC expert, but does it mean PCs have a graphics card that renders The Division 2 in 32:9 for the Samsung LC49RG94SSUXZG or LG 49WL95C? Or did Ubisoft create The Division 2 in 32:9 only for PC owners? It sounds too weird; can you help me? I found a person on YouTube who tested it for months on the consoles, and in the end, as you can see from these links, by turning some options off and others on for the PS4 and Xbox One X, he managed to make it, I won't say perfect, but good. Here are the links:
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-j7Ha5Q8Klm8.html
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-PKblwRMIe0g.html
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-HVQaX99TzYQ.html
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-eRALoICAaNc.html
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-flmbfc1OJxU.html
Can anyone suggest another monitor, from LG, Dell or others, if I'm wrong? Always in 32:9, because it's a format I love. I've also seen that the view could be narrowed a little from the console, and not even that is available as an option. My question remains the same: is The Division 2 in 32:9? Is it transformed thanks to the PC graphics card? Or did Ubisoft create it in 32:9 as an option for PC owners?
Thank you so much for your time, and I apologise to the PC owners, because I have the consoles and maybe they won't answer me, but one day I will also have a PC. I'm using Google Translate to put this in your language...
@@joyboy6535 I don't have a Mac; I'm a big boy PC gamer, and I intend to game on it like a motherfucker. Just waiting for the 6800 XT, 6900 XT or 3080 Ti. I'm in the middle of a move currently, so I can't test it yet, but I found the same post about the burn-in. I've googled more, and a lot of people say that IPS cannot permanently burn in, but you can get some sort of image-retention effect that lasts a maximum of 5 minutes. I don't think it will be an issue for me. I bought the screen second-hand for €575.
you do know that FHD is FHD not because of the 1920, but because of the 1080. And WQHD is 1440. And 4k is 2160. 5k would be 2880. I thought you'd know that x.x
I read that these don't work on Windows; is that true? I have a DisplayPort cable already, and in the video I saw from Arkatect, the comments said 5K doesn't work, but that was with an old model. Do these new models have the same problem?
@@Hardwareunboxed Bruh, What's going on, It's been an Entire DAY since U said it would be up & It still isn't up. What exactly is going on here? Did U just Forget or is there Something that's come up?
You said it correctly: it's a 4K ultrawide. Thank you. It's 5K horizontal with 4K's vertical. 5K 16:9 is 5120x2880, and that's a whole 1440p monitor of extra pixels over the so-called 5K ultrawide, ha. You'd think something called a "5K ultrawide" wouldn't fall a whole 1440p monitor's worth of pixels short of a real 5K monitor's res. Good vid, as always.
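The "whole 1440p monitor of extra pixels" claim above actually checks out exactly; a quick bit of arithmetic:

```python
# Full 5K (5120x2880) vs this ultrawide (5120x2160): the
# difference is exactly one 1440p (2560x1440) monitor's worth
# of pixels, since 5120 x 720 == 2560 x 1440 == 3,686,400.
full_5k   = 5120 * 2880
uw_5k2k   = 5120 * 2160
qhd_1440p = 2560 * 1440

extra = full_5k - uw_5k2k
print(extra, extra == qhd_1440p)  # 3686400 True
```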
We should have way more 5K monitors. They're just milking it... Although, we also need nvidia to stop milking it and just give us cards that can run setups with 5K displays... But it looks like the future is 21:9 baby!
I found out last week that they make 5K screens, which nobody talks about or shows benchmarks of, which tells me people must not know about 5K, or else they would be using it. I use 5K in PCSX2 and it looks really good, but I can't use 5K DLSS for PC games because I don't like the artifacts it has, which is why I'll be getting a 5K screen. I've had 1440p for 4 years now and have never been impressed with 4K, because the jump is so small in games, which is why I never upgraded. But as I said, using 5K in the emulator shows that 1440p to 5K is like going from 720p to 1080p, or 1080p to 1440p. I suggest people get a 5K screen if they can, to see the big jump, if they're coming from 1440p. I don't know why a person would get 1080p; it's way too outdated for PC when 1440p has been the go-to for years now. Or they can get a 4K; those have been cheap for years too, if a person just wants better quality. If they don't game and do productivity stuff, get a 1440p, 4K or 5K. And if a person already has 4K, as he said, the difference is small, so the logical choice would be to go 8K to see a big jump.