This advice may be unintentionally misleading. HDMI is fine and even desirable, but you do have to ensure that your monitor is set to expect a full range input, and your computer's graphics card is also set to output full range. Because HDMI is also used for movies and videos on TVs and your computer knows it's connected via HDMI, its graphics card may be set to Limited range by default if you haven't changed it. The HDMI cable itself has nothing to do with it… it just carries what the computer sends.
That's absolutely correct. HDMI has nothing to do with it. You can still have DisplayPort connected and be outputting limited range from the graphics card, or have the monitor set to limited range. Many people intentionally use limited range and calibrate to it to conform to broadcast standards.
And all of this is set by default for 99% of systems, which is why we don't hear about this color difference purported by this video. It's also automatically detected on 99% of systems and set accordingly.
The origin of 16-235 comes from the era when we were using cathode ray tubes (CRTs). Clipping at 16 IRE was a trick to mask (cover) the retrace beam; clipping at 235 IRE was a trick to avoid buzzing in the analog audio. That limitation was useful a long time ago, when image science was not digital.
Another thing to consider: a few years ago Linus Tech Tips bought a cable-testing machine. You'd be surprised how many shady cables are out there! So not only is there an issue with the input range, but the cable itself might not be good. My suggestion is to go for a well-known brand and/or run away from cheap cables. Oh, by the way, that cable-testing machine was a few thousand dollars, so chances are you can't really test your cables. In any case, thanks for the heads-up!
This is the correct answer. When our lab got rid of BenQ monitors the majority of our display issues went away. I'll listen to this guy when they can figure out how to make their monitors with components that live beyond 60 hours. There is literally nothing within a cable that will limit a signal; it would have to come down purely to the quality of the material degrading the signal, and their guest is trying to claim that this creates a degradation of the image quality. So my question is: how did the cable, if it's just a series of wires, alter the digital signal? His description of how he tested the two cables wasn't even remotely scientific. He just swapped between two different types and didn't swap between different manufacturers. So this whole video was a giant exercise in misinformation. But I bet he's earning some money off of it, so I doubt he'll take it down after the top comment already flushed his entire video down the toilet.
@@dmn5384 "Just series of wires." My brother had problem with new monitor. It went randomly to black for few seconds and then image came back. I Googled problem, brother bought quality cable and it fixed problem. DisplayPort-cable has high frequency signal and if monitor and graphics card can't communicate clearly, then GPU will stop sending signal.
I think this video has many misconceptions, and I am really surprised to see a VP from the ICC supporting this. HDMI as an interface (and the relevant cables) is perfectly capable of carrying a full-range signal, and well above this, such as 10-bit HDR. I can't post links, but feel free to refer to the HDMI specification. There is nothing wrong with connecting through HDMI. There are two things that might be wrong, and both are user errors.

The first is that your monitor is clipping the range to limited. Many monitors have an option for limited range in their settings, to conform to broadcast standards. Believe it or not, there are people in the (video) industry who have monitors calibrated to that range to conform with the standards in their broadcasting facility. That's perfectly fine, and they know what they are doing. A user who is not familiar with that concept, however, might have the monitor on limited range, which will produce unpredictable results (especially if you try to calibrate to full range).

The second is the Nvidia drivers. There was a time in the past, around 2019, when Nvidia drivers were defaulting to limited range. Not sure why; it was really stupid. It's not a problem anymore (at least I haven't encountered any issues these days). You can check what your Nvidia driver is outputting if you go to the Nvidia Control Panel -> Display -> Change Resolution, where the Output Dynamic Range should default to Full. If not, change it to Full if that's what you desire.

Now, as you can see, two different things might be wrong, the monitor and/or your graphics card, so it has nothing to do with the connection (HDMI). You can still have DisplayPort connected and your driver outputting limited range. Please don't put information out there that is inaccurate. The ICC should also try to focus on fixing the issues with ICC profiles on operating systems. What a mess. No joke that people are calibrating by uploading native LUTs to monitors, which doesn't work for photographers who want to embrace a screen-to-print workflow.
If you have an Nvidia graphics card you can change this setting in the Nvidia Control Panel under Display - Change Resolution - Output Dynamic Range, and set it to "Full"
Thank you mate ... yeah I've learned this from some other comments so it's great to see this. However, I'm on a Mac and don't have that option with the M2, and I think for someone who doesn't know how to change it, or isn't able to, simply using a different cable is the best option. Cheers
Nvidia have had it set to limited range in their drivers since 2007 or so. I still wonder why when in most of the cases a PC is connected to a monitor and not a TV. Full range should be default.
My 27-inch FHD monitor doesn't have a DP port, only an HDMI port. But my graphics card (GTX 1650) has a DP port. So will using a DP to HDMI cable still produce the same colour results as a DP to DP port/setup? I found many of these DP (source) to HDMI (output) cables on Amazon. Will buying one benefit me?
Somehow 5+ years ago and with prior Samsung monitors, I got my current Samsung monitor to perfectly match my prints. When I recently needed a new PC I opted for my first Mac and it allowed for those monitor settings so everything is still spot on. It's a Studio Max. The current and previous are HDMI connections. When I upgrade to a 32" 4k monitor I might stay with Samsung to be sure everything stays as is. I shoot and print everything Adobe98. I calibrated by eye and no devices.
Some people don't have USB-C. HDMI is great for connecting other devices like an Xbox etc., but also, if you are able to, the HDMI range settings can be changed on the computer / graphics card.
Thanks buddy for taking time to let people know about this issue... now sit back and enjoy all the specific questions and troubleshooting that folks are going to ask you in response. I personally am going to need you to come to my house and set up my monitor for me... :D
Very misleading video. HDMI may or may not default to Limited range, but it's very easy to go to the video card's display settings and change it to Full range. There is no physical limitation of an HDMI cable that will prevent it from carrying Full range. This video should have been helpful advice on how to check whether your screen is set to full dynamic range rather than this alarmist throw-away-your-HDMI-cables nonsense.
@@eartho Glyn's guest said that the issue was limited vs full range and many commenters pointed to the video card adjustment. Do you see a title to a video and look at nothing else? Much ado about nothing, Glyn can call the video anything he wishes. Most of us are glad he posted it.
Great video and great advice. I have actually been using my new very-high-resolution monitor with HDMI, thinking the trade-off was not that big a deal, until I saw this video. Great explanation and it makes sense. Just ordered a DisplayPort cable and can't wait to see the difference. Thank you for the valuable information and the details on how and why it is important to use the proper cable for colors.
OK, before people go and buy another cable: after reading the comments and doing a bit of research, it turns out there is nothing wrong with HDMI. Just be sure to go into your graphics card settings and make sure it is set to FULL RGB and not limited.
@@glyndewis Haven't got a clue. I'm sure you didn't mean the video to be misleading, and I love watching your videos, but I don't think you stated in the video that it was just for Macs? When I watched it I thought I might have to buy a different cable, until I spent a bit of time researching and checked my GPU, which was already set to full RGB.
TL;DR: if you use an HDMI cable you have to *verify that both the source and the display are configured to use full range.* For historical reasons, TV sets used limited range (the full-range values were used for signal synchronization because TV sets didn't have dedicated wiring for hsync and vsync like computer monitors). And in the distant past of HDMI, somebody had the stupid idea to carry this limited range over into the digital domain. Note that *DisplayPort also supports limited range*, so you should verify that setting even if you use DisplayPort. And the same applies to USB-C, too! Luckily the *default* setting is typically full range for DisplayPort and USB-C, but you shouldn't assume that if you want correct colors. If you have both source and display set to limited range, your 8-bits-per-color panel turns into roughly a 7.8-bits-per-color panel. Not terrible but not great either. If you mix the settings, you either get washed-out colors or crushed blacks and clipped whites.
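To put rough numbers on the TL;DR above, here is a small Python sketch (purely illustrative, nothing from the video): it counts the usable code values in each range and shows what each kind of mismatch does to black and white.

```python
import math

# Rough illustration: what limited range costs on an 8-bit panel,
# and what a source/display range mismatch does to black and white.

FULL_LEVELS = 256                      # full range: code values 0-255
LIMITED_LO, LIMITED_HI = 16, 235       # limited / video range
limited_levels = LIMITED_HI - LIMITED_LO + 1   # 220 usable code values

print(f"full range   : {FULL_LEVELS} levels (~{math.log2(FULL_LEVELS):.2f} bits)")
print(f"limited range: {limited_levels} levels (~{math.log2(limited_levels):.2f} bits)")

# Mismatch 1: source sends limited, display treats it as full.
# Black arrives as code 16 and white as 235 -> low-contrast, washed-out image.
print("limited into full-range display: black shown at", LIMITED_LO, "white at", LIMITED_HI)

# Mismatch 2: source sends full, display treats it as limited.
# Codes 0-15 all crush to black, codes 236-255 all clip to white.
print("full into limited-range display:",
      LIMITED_LO, "shadow levels crushed,", 255 - LIMITED_HI, "highlight levels clipped")
```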
What your guest said is a little bit incorrect. 16-235 is not cut-off or limited range; it's known in the business as video range, head range, or SMPTE range (named for the standards body that created it), which was a way of sending a compressed analog signal (64-940 in 10-bit) over the transmitter that would be uncompressed in the television. In the digital age we still hold onto SMPTE range and interlaced signals, as well as the 29.97 and 23.976 frame rates (an audio/subcarrier legacy), even though today's TVs are progressive scan and there is no color subcarrier to create a timing problem for a 60 Hz system. The cable has nothing to do with whether your signal is SMPTE range or computer full range (0-1023 in 10-bit); you just have to know which mode you're working in for the monitor calibration to be correct. Computer screens are usually sRGB, which shares the Rec.709 primaries used for ATSC HD. That said, if you want to manipulate your images it is probably best to capture raw, move the images from your camera onto your computer, and import the raw files into your photo application. Raw has greater color depth than your monitor is capable of portraying. If you're making prints you may want to work in CMYK so that you're sure what you're printing in terms of color.
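For anyone curious, the scaling between full range and video/SMPTE range is simple arithmetic. This little Python sketch (an illustration only, not anyone's production code) maps code values both ways at 8 and 10 bits, using black at 16/64 and white at 235/940 as described above.

```python
# Sketch of the scaling between video/SMPTE range and full range.
# 8-bit video range: black = 16, white = 235; 10-bit: black = 64, white = 940.

def full_to_video(v, bits=8):
    """Map a full-range code value to video (limited) range."""
    full_max = (1 << bits) - 1                 # 255 or 1023
    black = 16 << (bits - 8)                   # 16 or 64
    white = 235 << (bits - 8)                  # 235 or 940
    return round(black + v * (white - black) / full_max)

def video_to_full(v, bits=8):
    """Map a video (limited) range code value back to full range, clipping out-of-range input."""
    full_max = (1 << bits) - 1
    black = 16 << (bits - 8)
    white = 235 << (bits - 8)
    v = min(max(v, black), white)
    return round((v - black) * full_max / (white - black))

print(full_to_video(0), full_to_video(255))          # -> 16 235
print(full_to_video(0, 10), full_to_video(1023, 10)) # -> 64 940
print(video_to_full(16), video_to_full(235))         # -> 0 255
```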
Nope. Most printers accept Adobe RGB because they feature more than just the standard CMYK inks, e.g. green, red, light and/or light-light blacks, and many other combinations. The printer will make the conversion to its own colour space by itself, especially the higher-end 5-12 ink systems.
I have a question. In the Nvidia control panel options, when connected through an HDMI cable, there is an option called "dynamic range" which, according to the description, has a range of 0-255. My question is, isn't that option the equivalent of connecting a DisplayPort or USB-C cable?
***PLEASE NOTE: It is the HDMI settings this is referring to and NOT the cable. Some systems permit you to go into the graphics driver and change the setting to Full Range. Mac does not have this ability, so for that, and for people unable to change it or who don't know how, using a different input/connection from computer to display such as USB-C is best. A Delta E of 2 or less when validating a calibration would suggest Full Range is being used.
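For reference on the Delta E figure mentioned above, here is a minimal Python sketch of the classic CIE76 delta-E formula. The Lab values below are made-up examples, and modern validation tools often report the newer ΔE2000 instead, but the idea is the same: a result around 2 or less is generally taken as a good calibration.

```python
import math

def delta_e_76(lab1, lab2):
    """Classic CIE76 delta-E: Euclidean distance between two CIE L*a*b* colours."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

measured  = (52.1, 18.3, -6.0)   # hypothetical measured patch
reference = (51.5, 18.0, -5.2)   # hypothetical target patch
print(f"dE76 = {delta_e_76(measured, reference):.2f}")   # ~1.04, comfortably under 2
```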
One important point: when you own an Nvidia GPU, it is always set to limited colour range over HDMI. You have to change it manually. This doesn't happen on AMD and Intel.
I'm going to have to speak up here because I think there's a real misinterpretation of information here. The cable isn't the issue (it makes absolutely no difference); it's making sure your hardware is set up correctly! 99% of monitors and GPUs are set correctly for the 0-255 range without needing user intervention, which is why it's not spoken about. I use EIZO ColorEdge monitors in my setup and it doesn't make the blindest bit of difference whether I use DisplayPort or HDMI. Last night I even ran calibrations and looked at the reports just to make sure I wasn't going a little goofy lol, but neither makes any impact.
No-one said it was down to the cable ... using HDMI is what was said. If folks can and know how, AND their systems allow it, then if they can change the range there's no problem. My system, and it seems many others, are unable to, so the simplest and most reliable solution would be not to use HDMI, which obviously means not using the HDMI cable.
@@glyndewis You might have to make a video showcasing this apparent difference, because it's not something I've ever heard of! 99% of GPUs and computer monitors on the market output 0-255 by default, which is why this all sounds so strange. Have you physically changed your cables and seen these differences? If so, it's going to be a video to make for sure.
I think something is not right with this video... If I have a full-range RGB monitor and a graphics card that uses limited range by default, I just need to change the graphics card settings to appreciate the full range. I find it hard to believe that HDMI cables can't support full-range RGB, so this video simply leaves me with the impression that it's just an ad for BenQ.
Read the pinned comment … this we know. Also you CANNOT change the setting on a Mac, so a different cable is needed for guaranteed full range. Not an ad for BenQ. Geez 🤷♂️
@@glyndewis Fair enough, then you shouldn't be warning photographers, you should be warning MAC users for having an inferior product otherwise your video is misleading mate. The fact that macs don't have this capability in 2023 is shocking.
Great information! Since getting my newest monitor I've been using DisplayPort or a Lightning cable with my Mac, but I didn't know there could be an issue with using HDMI. It would be great to hear how one would know whether this is an issue on a particular machine/monitor setup. Though I doubt I'll be going back to HDMI in the future.
I have several quibbles with what your guest claimed. If an HDMI cable has sufficient bandwidth to pass the intended signal, what is seen by your monitor is a function of what was sent to it from the source (graphics card, stream, etc.), plus any EDID information at the sink (monitor/TV). One could use USB-C or DisplayPort cables instead, but neither influences the quality of the signal in any way (again, assuming that the cable can pass sufficient signal bandwidth). Also, what your video monitor displays has nothing to do with the quality of your image at the printer end. Send an RGB-high, RGB-low, or YCbCr signal to your monitor; that does not matter to your printer. Your source's application software, your printer driver and your printer firmware decide what is printed and how.
Exactly what has been answered many times in replies and comments already: it's the settings, not the cable. If your monitor isn't displaying colour correctly then it DOES influence what is printed, because what YOU see affects how you edit, so your point there is completely wrong.
@@glyndewis You seem unconvinced about your guest's misconceptions. Think about your system. Does your printer use the same technology to form its images as does your monitor? If not, there is a lot of room for mismatch. What you see is not necessarily what you get. Making edits? More errors will be included. We tried to get around this issue in the 1980s with Postscript and Display Postscript, but the CPUs were too slow to complete the work-flow in real time.
As long as your display is using the same signal settings your creative software expects to deliver, it's all good: 8-bit/10-bit, data/video levels, full/limited. It is a mismatch in settings that is the issue. The cable is just one component that must align. HDMI is equally capable of delivering correct colour as any other cable, if the settings in the display agree with those on your computer. All you need to do is check your settings. The software you use and the context you work in determine which settings are correct.
If the graphics card outputs 16-235 and the monitor expects 0-255 you will know: the colours will look washed out. Pretty simple to change this on Windows; I don't know about Mac. Anyway, Windows detects whether it's a monitor or a TV and this setting is set correctly by default, at least with AMD/Nvidia cards (I have both).
On Mac it's not possible, and with regards to Nvidia, there are many comments from people saying they have now gone in and changed the settings ... so it didn't happen by default.
@@glyndewis Well, in my experience it always happened. Anyway, I think people who calibrate their monitors should know about the full/limited story; it's one of the very "basic" and important bits of knowledge in my opinion, or maybe it's just me with my video editing/playback history. Regards.
Not correct at all. HDMI doesn't limit your RGB, but your monitor might, or your graphics card may think it's connected to a TV. If your monitor does support full RGB, you can easily correct the error in the graphics card software to output full RGB.
@@glyndewis Would that be because they don't have an HDMI connection at all? What a ridiculous thing to say! Of course then you wouldn't use HDMI. But that wasn't the point of your video. You stated not to use HDMI because it cannot do full RGB, which is not true.
Integrated Graphics Card in M1 Mac and yes of course it has HDMI ... curious as to how you change the settings on the Mac ... Nvidia seems no problem but???
I worked in a monitor service for years. I don't want to rate the video, I'd rather ask. Is it possible to carry more than 8 bits per channel on an HDMI cable? Is it conceivable that there is an HDMI standard that does not have reduced bandwidth as a default? Is it conceivable that the device communicates the displayable bandwidth with the display from the first moment? I note that there is also 8K HDR 120 Hz (16 bit/channel, 48 bit) via an HDMI connector... HDMI 2.1 is 48 Gbit/s, or effectively 127.75 Gbit/s with DSC (Display Stream Compression).
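As a back-of-envelope check on those formats, this Python sketch (illustrative only; it ignores blanking intervals and link overhead) shows why an 8K 120 Hz 48-bit signal cannot fit in HDMI 2.1's 48 Gbit/s link without DSC.

```python
# Rough arithmetic: raw pixel data rate for a few formats, versus
# HDMI 2.1's 48 Gbit/s FRL link (blanking and link overhead ignored).

def raw_pixel_rate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

formats = {
    "1080p60, 8-bit RGB (24 bpp)": (1920, 1080, 60, 24),
    "4K60, 10-bit RGB (30 bpp)":   (3840, 2160, 60, 30),
    "8K120, 16-bit RGB (48 bpp)":  (7680, 4320, 120, 48),
}
for label, fmt in formats.items():
    print(f"{label}: ~{raw_pixel_rate_gbps(*fmt):.1f} Gbit/s of raw pixel data")
# 8K120 at 48 bpp comes out near 191 Gbit/s, far above 48 Gbit/s, hence DSC.
```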
I wanted to chime in to offer a related concern about BenQ connectivity: if you're using a Mac, the BenQ monitor has a serious issue with flickering. In fact "flickering" is not the right word (although it's the one BenQ uses), because that implies the image strength is fluctuating. The actual problem is that the screen image goes entirely black for 1-2 seconds before re-appearing. And it's sporadic: some days it won't do it at all, other days it's done it a dozen times in 5 minutes. (It's done it twice while typing this message.) The official BenQ response is that you need to use USB-C (which I was already doing) and to change from the BenQ profile to generic Color LCD (which, in my mind, somewhat defeats the purpose of having bought a BenQ monitor in the first place). So far, BenQ has been VERY responsive and I think they're eager to resolve the issue. But I've been going back and forth with them for about a month and, so far, nothing has worked to solve the problem. Their official webpage documenting the issue is here: www.benq.com/en-us/knowledge-center/knowledge/how-to-fix-mac-m1-m2-external-monitor-flicker.html It's not my desire to criticize BenQ. But since we're talking about BenQ connectivity, I thought it was worth sharing with the community that the USB-C connection also appears to be problematic for Mac users. :( Hopefully they'll get it sorted soon.
Wow. I figured this for a belated April Fool and watched with a fading grin as the logic soaked in to me. Then I went into my graphics card driver settings. And guess what I found? Yep. Limited range. I switched it to full. Long story short... That annoying color cast I've been scratching my head over has gone. I could be psychosomatically imagining the improvement, but it's now crisp and clear in a way it never was. I now have the full depth of shadow and highlights that I should. And for that, I will be forever grateful. I'm a convert, I saw this with my own eyes. Thank you!
Great vid! The title kinda freaked me out a few days ago since my graphics card was connected via HDMI. Recently (OK, last night lol) I calibrated my two monitors using the ColorRite i1 Profiler Pro, and they achieved ratings of 2 and 5 ΔE after validation. Check! These monitors are not as high-spec as the BenQ, and they only offer HDMI and VGA connectors. To connect them to my graphics card I'm using a high-quality Thunderbolt-to-HDMI cable. Surprisingly, the monitors appear to be color accurate on screen. So I was thinking, uh... okay... But after watching this a second time I caught a little detail on HDMI settings I had missed, and realized my HDMI setting was set to FULL already!! WHEW!!!! Being a headshot photographer, I'm constantly concerned about the quality of color output when my work is printed for my clients.
Glyn, I know this is a silly and off topic point but I have to say, Thank You, for pronouncing Consortium as con-sor-ti-um. Here in the US, all I ever hear is "con-sor-shum". Back in elementary school ('57 to '65), I learned to read and speak English, not whatever "con-sor-shum" is. These days, you hear "con-sor-shum", even in high level "professional" circles. Personally, I think their language skills are Tiit. 😁✌🖖
When you are relying on presenting your work online it is a hopeless endeavor. Your monitor may be perfectly calibrated, but what the prospective customer sees is in all probability limited by their monitor and the presentation of the online site. The solution to this problem is to control the calibration and printing of your work yourself. Give your work a chance to be adequately admired by presenting it as your printed image, or at the very least your printed portfolio.
Actually it depends on what color mode your video card sends to your display, and it works the same way on both HDMI and DisplayPort. Just make sure both the monitor and the video card are set to the same Full Range RGB with 4:4:4.
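To illustrate why 4:4:4 matters, here is a tiny Python sketch (illustrative arithmetic only) of the average bits per pixel at 8 bits per component for the common chroma subsampling modes.

```python
# Why 4:4:4 matters for text and fine colour detail: average bits per pixel
# at 8 bits per component for the common subsampling modes.

def avg_bits_per_pixel(subsampling, bits=8):
    # Per 2x2 block of 4 pixels: 4 luma samples plus (Cb, Cr) pairs per mode.
    chroma_pairs = {"4:4:4": 4, "4:2:2": 2, "4:2:0": 1}[subsampling]
    samples = 4 + 2 * chroma_pairs          # Y samples + chroma samples
    return samples * bits / 4

for mode in ("4:4:4", "4:2:2", "4:2:0"):
    print(mode, avg_bits_per_pixel(mode), "bits/pixel")
# -> 24.0, 16.0, 12.0 : 4:2:0 throws away three quarters of the chroma information.
```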
I would say if you are not a techy person and want to have a simple input -output no error result, then DP or usb-c cable is the simpler solution for you…
This was so interesting. I'm having a lot of trouble with color match between my photos on Lightroom/Photoshop 2023 and how they look so different on my Huawei in Instagram. Same color profiles. Even the brightness is quite dark. It drives me insane. I have to edit in LR/PS, send to phone and then adjust them again in Snapseed before publishing them. Oh, my nervous system. Thank you for this video. Regards from Portugal.
I found the information informative. I have my Dell P2412HB monitor connected with a DisplayPort-to-DVI-D cable: DisplayPort on the computer end and DVI-D on the monitor end. Not sure how much difference this makes; I've never calibrated the monitor. My camera is a Canon EOS Rebel T5 / 1200D. I use the software that came with the camera on disks provided by Canon, upgraded to Digital Photo Professional Ver. 4, and Picture Style Editor.
Reading some other comments, I went to my Nvidia Control Panel and discovered I needed to change to full range, so I recalibrated and got even better delta readings of 0.75. I think my success in passing before is that my HDMI connection is 2.1, not 2.0, which might have helped make it work. Oh well, it was time to recalibrate anyway. Those who have GeForce Nvidia cards should check their Nvidia Control Panel to see what their current settings are.
He's just another spammer a-hole fishing for clicks by spreading lies, as if the cable has anything to do with how you can tweak your monitor settings. You can make any interface look like any other by changing the settings.
I have no choice: I have an HDMI/SVGA monitor with a built-in sound amplifier and sound DAC, and built-in anti-aliasing for when I don't need the graphics card setting to provide more speed. HDMI is the best choice, and I'm not going to buy a new monitor when it still works.
I'm using a Display Port cable for my Benq monitor... but it was just a matter of chance: it was the first cable I picked! 😜
I use display port for my windows machine to my BENQ 271 but recently got a Mac and tried to connect it with usbC. However, it didn’t work despite trying more than one cable so had to resort to HDMI.
Well, I just switched both of my displays to DisplayPort. My video card has three DP outputs so it was easy, and I had the cables on hand. My 32-inch designer monitor looks great.
Why don't we know about these things? We were never meant to! We don't know anything about technology these days. We just buy stuff, plug it in and hope it all works out. Knowing all this stuff about color makes this guy a pro. Pros used to sell their knowledge, not give it all away for free! How much did this guy study to know this? How many books did he have to read? How many late nights did he spend cramming for his student thesis, and how much did his education cost his family? He is now condensing the most important bits of his knowledge and spoon-feeding it to us for free by telling us what he knows. He could be fixing people's computers for a fee without whispering a word about how he does it. That's how they used to do it in my country back in the 90s.
So presumably there would be a way to change the default output from limited to Full? If so, how is this accomplished?
Hehm, I respect the author but I don't respect the clickbait title... because in my case I made no changes whatsoever and the monitor still connects to the Mac via HDMI in full range by default. Actually I use a USB-C cable to connect the charger (an Acefast 3-in-1 Wall Charger 65W GaN, a brilliant product by the way), which also acts as a USB hub with a USB-A port and an HDMI (4K/60) pass-through for the monitor (BenQ PD3200U). So the monitor is connected with the HDMI cable, and the monitor shows that it is connected with the RGB 0-255 range.
Bitchin' answer, 'cause I'm only able to connect to my Mini M1 by HDMI (DP port is occupied by the Windows machine, used alternately not simultaneously, duh).
Who knew? Found the setting on my monitor to reproduce colors values at 0-255, recalibrated, and the result is like night and day. The colors and tones display significantly better. Thanks Glyn!!!
@@davidfrisken1617 Intrigued as to why using the full range 0-255 wouldn't be good for photography. Forget video; this video was about 'photography', hence the title.
0-255 (256 values) is only 8 bits. 10-bit has 4x more values (2^10 = 1024) and thus way more color accuracy. Now you can stretch your dynamic range without losing color precision.
@@boudewijnj.m.kegels5198 Not accuracy, just precision. 8-bit is enough for SDR content and photo printing (it's not hard to beat the dynamic range of paper).
But my laptop does not have a DisplayPort out, just HDMI. Update: I actually have a Mini DisplayPort, so a Mini DisplayPort to DisplayPort cable will do the trick? I'm ordering one right away!😮
Based on this presentation I switched my BenQ PD2700 monitor connection from the HDMI port to DisplayPort and then to the USB-C port on the computer (M1 Mac Mini). A huge improvement. Thanks!
Windows OS content is always limited; it could be sitting in a full-range container, but then a raised-blacks issue appears, and that can be chained, so content like games can end up "double" limited.
Few people even remember the DVI connection on the back of some monitors. It is a true digital interface and sends true colour to the monitor in a much better form than simple VGA or HDMI. It used to be that some video cards wouldn't let you run certain video modes unless you had a DVI cable.
HDMI, DisplayPort, DVI and even VGA can all look about identical (VGA being slightly to very fuzzy and ghosty, depending on cable quality). It's just about the settings, like the luma cutoffs and the bit depth, which have nothing to do with the connection but with what the system supports. If something looks wrong you can ADJUST it, for f's sake. You can even tweak the gamma curve in software for free.
Actually HDMI is a copy of DVI that was paywalled by patent trolls. And there is no such thing as "limited range" in either the DVI or HDMI specs. DVI always carries RGB888 data in TMDS (8b/10b) encoding.
I've been using Dell PremierColor (UP-Q series) monitors for over a decade, and have always used DisplayPort due to the limitations of legacy HDMI revisions (bandwidth, resolution, colour space/bpc etc.). So I guess I dodged this bullet too, because I've always had good-quality DP cables and purchased motherboards with DP as standard when building new PCs. Great informative video, thank you! And as another commenter said, I also switch to 10-bit colour for my DP connection, found under Display > Change Resolution > Use Nvidia Colour Settings in the Nvidia settings app.
@Twisted, you're exactly right. ATI and Nvidia both make reasonably priced "workstation" cards that have DisplayPort output. If you are serious about the colour of your monitor and being able to calibrate correctly at 10 bits per colour channel, DisplayPort is the only way. Thankfully Nvidia with their RTX series cards now give you the option of either gaming or studio drivers (for 10-bit output) at good pricing. I've had an NEC PA271W wide-gamut monitor for years and now a 4K Dell studio monitor with a built-in calibrator. The gold standard for photography editing monitors has always been Eizo, but kudos to BenQ for now making monitors for photographers. I checked Apple: their current displays don't have/support the Adobe RGB colour space? As for why not Apple: especially if you want a colour-calibrated workflow, PC is the way to go.
I don't get it. HDMI is based on DVI, and I have read the DVI specification and implemented a transmitter on an FPGA. There is no "limited" range there. DVI (and HDMI, because of backwards compatibility) just sends 8-bit colors encoded in 10-bit symbols. That's it. There is no "limited range"; there is only RGB888.
BS. There is a setting in the NVIDIA control panel for Full/Limited output dynamic range choice. After switching to Full and calibrating there is no problem with HDMI connections even if you use large IPS TV as a monitor. I am using TV together with a normal display and calibrated colors look almost identical on both of them.
Adding to all the comments: if there is an issue with limited range instead of full range, and BenQ knows about the issues appearing... why doesn't BenQ only allow full range? The signal format between a graphics card and a monitor is negotiated as soon as a connection is made, via EDID. Why does BenQ allow limited? That only makes sense for cheap TV-style monitors. A monitor for photo editing should ask the graphics card for: at least 10-bit color depth, 4:4:4 color sampling, full range only (no limited), and RGB only (no YUV / YCbCr). Blaming this on HDMI is a big mistake. And by the way, recommending USB-C as a replacement is a big mistake, as USB-C sources tend to use colour subsampling to save bandwidth. It is OK to point out the issues reduced bandwidth can cause, but it is a bandwidth and video-format issue, not a what-cable-to-use issue. IMHO this video could be the starter for a whole set of videos explaining color depth, color space, bandwidth, 8-bit vs. 10-bit vs. 12-bit and much more. Many of the buzzwords are just thrown at customers as marketing speech (not blaming BenQ for this, blaming all manufacturers) to get them to buy the product. Without knowing how to adjust your tool correctly, you may spend a lot of money on hardware without any benefit.
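To make the "what a photo-editing monitor should ask for" point concrete, here is a purely hypothetical Python sketch; the Format structure and pick_for_photo_editing function are invented for illustration and are not any real EDID or driver API. Real negotiation happens via EDID/DisplayID inside the driver.

```python
# Toy "format negotiation" along the lines described above.
# Everything here is invented for illustration.

from dataclasses import dataclass

@dataclass(frozen=True)
class Format:
    pixel_encoding: str   # "RGB" or "YCbCr"
    subsampling: str      # "4:4:4", "4:2:2", "4:2:0"
    bit_depth: int        # 8, 10, 12
    full_range: bool

def pick_for_photo_editing(source_caps, display_caps):
    """Prefer RGB 4:4:4 full range, then the deepest bit depth both ends support."""
    common = set(source_caps) & set(display_caps)
    preferred = [f for f in common
                 if f.pixel_encoding == "RGB" and f.subsampling == "4:4:4" and f.full_range]
    if not preferred:
        raise ValueError("No RGB 4:4:4 full-range format supported by both ends")
    return max(preferred, key=lambda f: f.bit_depth)

gpu = {Format("RGB", "4:4:4", 10, True), Format("RGB", "4:4:4", 8, True),
       Format("YCbCr", "4:2:0", 10, False)}
monitor = {Format("RGB", "4:4:4", 10, True), Format("RGB", "4:4:4", 8, False)}
print(pick_for_photo_editing(gpu, monitor))   # -> RGB 4:4:4, 10-bit, full range
```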
Really weird that he did not mention HDMI 2.1, which supports 8K 60Hz full-range RGB. All GPUs from 2020 have it (RTX 30xx etc.), and of course the monitor/TV must have a 2.1 port too. Most TVs also have a PC mode where picture enhancers are turned off. And refresh rate is also a huge factor combined with the resolution, so if the GPU you have is HDMI 2.0, lower the refresh rate. Not sure if refresh rate is something that is important for photo editing.
Refresh rate of a monitor is only important for gaming and for scientific purposes in which moving images must be analysed with detail and accuracy while live. For normal video (24, 25, 30 or 60 fps) a 60 Hz display is sufficient. Some will argue the fall-off of a single frame is more noticeable at 60 than at 120 Hz, and 120 is of course 5x 24, so it matches the old film frame rate of 24 fps better.
@@boudewijnj.m.kegels5198 I agree with everything here. I'll also correct myself, because HDMI 2.1 supports 8K 60Hz, so you can probably do 4K 240Hz and get full-range RGB. The introduction of variable refresh rate (VRR) has made fps drops almost unnoticeable, but G-Sync is the best if you can afford a monitor that supports it (not to be confused with "G-Sync compatible", which most gaming TVs and monitors have, but it's a lot better than V-Sync). The thing I wanted to point out was that to get the full RGB color range, and not YCbCr 4:2:0, in 4K over HDMI 2.0b, you can drop the refresh rate to 30Hz. I wasn't sure if my tip would be a good one for photo editing if refresh rate impacts anything (and photos are still images, so my guess is it doesn't).
@@boudewijnj.m.kegels5198 You have YCbCr 4:4:4, 4:2:2 and 4:2:0 color formats, but the downside is that you have limited range and not full dynamic range. For most people this is not something they care about: they buy something, turn it on and will not enter the settings menu to change the picture setting from standard or anything else. The funny thing is I'm colour blind (red/green), but picture quality and optimizing my viewing and gaming experience is a top priority. Edit: it may be that I just assume people understand that when I say full-range RGB, it's a combination of color format and dynamic range. So yes, you can't have both RGB and YCbCr.
So this would also apply to connecting your camera to the computer via HDMI, wouldn't it? Do modern cameras have the full range 0-255 at their HDMI outputs? Sometimes they even come with a micro-HDMI output. Are there full-range micro-HDMI to HDMI cables then? What good is it to load the photos or videos from the camera onto my computer via HDMI if they might be color-inaccurate from the start?
It's not the cable, it's the HDMI settings. Check out my pinned comment. If you're transferring from camera to computer, what cable you use has no effect on colour.
The title is misleading. It should have been something like: "Beware of your color format settings when using HDMI cables to connect a PC to monitors/TVs." All you need to do is set the graphics card output format to the 4:4:4 PC color format, and of course use the native resolution.
No idea why what are essentially "broadcast safe" signal levels are set by default, regardless of cable.
I’m the current maintainer of DisplayCAL (an OpenSource display profiling system) and DisplayCAL detects Full vs limited range and does its thing accordingly and if you are into colour calibration you generally know about these things. I don’t think it is a good suggestion to not to use HDMI just over that.
DisplayCAL is a well-known cross-platform alternative (based on ArgyllCMS) to proprietary commercial display calibrators. It supports a wide range of devices, monitors, and operating systems. DisplayCAL has been relied on for many years by enthusiasts, installers, and developers of custom color calibration software.
I agree with some comments: the video is misleading, and it's perfectly fine to use HDMI. I have done video editing and photo projects that required high accuracy and never had ANY issues at all. From my understanding, limited range is used for video, as video is recorded that way, and full range for computer graphics and images. On the monitor I use I can select between full and limited, and can even do so directly from the NVIDIA drivers. You might have to switch between modes, as watching videos in full range is not desirable either. So it's perfectly fine to use HDMI; you just want to be in the right mode when you calibrate your colours. This is equivalent to IRE 0 vs. IRE 7.5 in the analog days.
I have a MacBook Pro and two BenQ monitors. I don't see any way there to change these RGB range settings for these monitors, other than the colour space. As far as I know, for 4K I only need a newer HDMI cable. My calibration was validated, so why panic ;-)
What to do if my laptop doesn't have a DP port but my monitor has it. Can an HDMI to DP cable be useful in this scenario? My monitor doesn't have a USB C port. Please help.
DisplayPort and HDMI ports are functionally equivalent, while some USB-C ports can also be configured to work as video ports. Just use the appropriate cable to connect each component.
Probably being a bit thick here Glyn, but if the problem relates to the wrong cable between your PC and the monitor, what about the cable used between your calibration equipment and the PC?
Of what importance would that be? It's reading a certain signal and sending it to the monitor, but it's not exactly a constant streaming live feed. USB-A/B would suffice; C is overkill, but probably used anyway on newer models.
If you just use the correct wire to connect with, it will calibrate just fine! The metal in the wire that you use matters! Using cables with (gold) connectors is not good! Some copper wires are not good, but using steel wires with copper is good! So if you check the manual for your TV or computer graphics card it will tell you what type of wire to use. All done! ✨
Wrong. Graphics card settings can default to limited, so they need changing on Windows if you want full range. Mac doesn't allow the change, so you need to use a different connection method.
I fail to see how this is the fault of the HDMI cable. Your display card should be able or enabled to send a full range signal and your monitor should be able to receive the same. This just seems and sounds like marketing for Benq displays