@@myguelmartinsgarajau1555 you know how windshield wipers work? You could make a little mechanism outside that slides down and wipes the eye. You could make the eye camera slightly bigger to fit anything you need that wouldn't work on the outside
@@senboy9002 If you enlarge it too much, then it wouldn't be a *human* eye camera anymore. Also, didn't know windshield wipers could be made this small, thx ʘ‿ʘ
I would suggest not using a TV for high refresh rate content. It uses interpolation to render non-existent frames, which to some people doesn't look good, especially in games. Show ANYONE a 120Hz monitor and then an identical one at 60Hz and they will immediately tell the difference. I've done this test with people ranging from 6 to 70 years old and it's absolutely one-sided: so far, everyone has immediately noticed the smoothness. And for the love of god, make sure you run it off a PC so the source can actually output a native 120fps.
Most OLED TVs can do 120Hz natively. Agree with ya, they should've used a PC for native, accurate fps. And I thought the same thing about the black frame insertion when they turned on the 'smooth' option.
yeah. I can tell the difference between 120 and 240, but I've not seen screens higher than that. I know people who can't tell the difference between 30 and 60. I can't understand that.
@@BlueGOfficial actually, the human eye is built in a way where its resolution can't really be measured. The eye technically doesn't have a specific number of pixels, because there's no screen and no frames to be displayed; our brain sees one big picture that's constantly moving, so technically the human eye can exceed 5,000,000+ MP
pffft, they were lazy and did CGI camera eyeballs instead of Ray Harryhausening it. But what a pain it would be to use a round camera hahahah. I mean, people drop their flat phones too often as it is; I can only imagine dropped eye cameras rolling down the street.
As someone who did his PhD in photogrammetry: thanks for collecting all the various insane performance aspects of biological systems in one video (until now, I've only found individual aspects scattered across various sources). Too few people know that the awesome-sounding "Full HD" is classified as "extremely poor eyesight" (it might even qualify as "legally blind") in human performance terms
You mean "Full HD" over the whole sphere? Cos if Full HD over normal monitor angle is "blind", 320*200 monitors from 1980s should be considered worse than having no monitor at all. And by the way, actual resolution of human eye is no more than 1 megapixel cos optic nerve consists of about this amount of axons. (did not watch the video but studied some biology a while ago)
@@wormball I know of an experiment some time ago where the experimenters tested a camera (I think it was around a 60° FoV) against a human eye chart (the kind where you have to read numbers/letters from a certain distance at the eye doctor/optometrist); IIRC they needed about 10K resolution to match 20/20 vision (i.e. about 160px/° angular resolution). You can also just watch the video, as it has a lot of data on human vision.
Full HD exceeds the visual acuity of the human eye if you watch the image from a long enough distance. It's only when you get close enough that the image fully fills your field of vision that you'd need the 130-megapixel resolution. This is the reason even 4K is actually too high a resolution for most users: their screens are too small (or too far away) to fully benefit from it.
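To make the distance argument concrete, here's a rough back-of-the-envelope sketch (Python; it uses the common rule of thumb that 20/20 vision resolves about 1 arcminute, and the 24-inch screen width is just an example, not a number from the video):

```python
import math

# Rule of thumb: 20/20 vision resolves about 1 arcminute of detail,
# i.e. roughly 60 pixels per degree is where extra resolution stops mattering.

def min_viewing_distance(screen_width_m: float, horizontal_px: int) -> float:
    """Distance beyond which a 20/20 eye can no longer resolve individual pixels."""
    pixel_pitch = screen_width_m / horizontal_px      # metres per pixel
    one_arcmin = math.radians(1 / 60)                 # 1 arcminute in radians
    return pixel_pitch / math.tan(one_arcmin)

# Example: a 24-inch 16:9 monitor is about 0.53 m wide.
print(f"1080p: {min_viewing_distance(0.53, 1920):.2f} m")  # ~0.95 m
print(f"4K:    {min_viewing_distance(0.53, 3840):.2f} m")  # ~0.47 m
```

So on a 24-inch monitor, 1080p pixels become invisible from about a metre away, and 4K buys you nothing unless you sit closer than roughly half a metre.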
They left some features out - self-repair for minor scratches, self cleaning, and a cover that senses potential damage and reacts to protect the camera. Also blindness when you fall in love.
@@Eric-uy7ee 80-year lifespan, degrades 40-50 years in, random units are highly defective and require add-ons to fix, and even using the product solely indoors degrades it over time.
Umm, I think all that is called an eyelid and tear ducts; most people already have those. Unless the cam eyes are for an android, then you would have to add those on, though technically they aren't part of the eye, more like an auxiliary system for it. Ohh, and I think the brain handles the love blindness, so yeah, even with a camera eye you would still be susceptible to love blindness.
Actually that's very relaxing, like in all these cities filled with concrete buildings and concrete roads and gray cars, it's nice to see some color once in a while. Sometimes I just stand at the side of the road with my eyes closed.
Cognitive scientist here: One thing worth pointing out is that the optic nerve makes use of "coding" (combinations of neurons contributing to a signal) to detect shape and motion. Light too! The eyes actually make use of a firing pattern kind of similar to PWM to make for a much higher dynamic range than you might expect. Niko also pointed out that receptors are offset in firing time. Measuring the qualities of human vision is really tough because so much of our perception relies on not just the eye, but also signal processing in the optic nerve and occipital lobe to create a cohesive image. While the framerate test doesn't have experimental validity (mostly because of the display and image source), I'm glad to see Niko did the research and found other sources for his data. I've seen way too many YouTubers falsely claim the human eye can't see 4K or can't see above 60fps without actually considering how different the eye is from a camera.
Claiming that the eye can't see more than 4K in resolution is especially weird to me. The limit should really be a certain pixel density, since 4K on a 24" screen versus a 48" screen is obviously different.
I hate it when they say we can't see a difference above 60fps on a monitor, when I can spot my monitor dropping from 140 to 130 so easily that it ruins my immersion in many games.
Good points! The human eye is also supposed to far exceed its native spatial resolution when it comes to detecting offset of edges (I'm missing the proper term for this). It's amazing how much preprocessing occurs in the eye even before the signals enter the visual cortex. If we truly wanted to create a camera with specs as close to the human eye as possible, we would need to incorporate this kind of logic into the sensor, because our eyes don't just relay raw visual input to the visual cortex. Edit: The term I was looking for is "Vernier acuity"
@@josephsimunek5350 Nah, it's clearly their business model. They shorten battery life on purpose to increase performance, yet still offer new batteries even though that's not really going to fix what they've done. They tell you they have to replace your entire motherboard because of some bent display connector pins. They create the problems, either through incompetence or simply to increase revenue, and build a false image of their product. They are beyond any company I have ever interacted with in their malicious acts against customers. I'm glad I bought my last phone from Huawei, as they even offered to replace only the glass of my screen if the display was OK. Lenovo's service likewise referred me to another shop to fix my laptop because it cost less and they didn't have the parts at the time; they said ordering would take a month and cost more, and that I would keep my warranty even though another shop did the repair. You see, it's not that hard to find solutions in the interest of your customers when you're not trying to milk them like cows!
The cool thing is, you can just kinda plug wires into your optic nerve and feed it video; the brain eventually learns the patterns, correlates them with what you hear/feel/smell etc., and will essentially give you your vision back
The frames per second question is a complex one. The problem with your test comes from two things. First, the footage you were showing was not natively generated at 120fps; fake frames won't work as well as real frames. Second, the footage doesn't have full-shutter motion blur, like our eyes would have if they had a frame rate.

Another problem is that eyes don't see in bursts like frames; they see in a constant stream. While you're correct about the latency, that's not frame rate. The constant stream sounds like infinite frame rate, right? Sort of, but not really, because if you paused and looked at any particular instant, what you'd see is a blend of visual information from the last several ms. For most people, this is somewhere around 20ms. This is why it becomes difficult to distinguish frame rates above 60: even at 60fps, there's never any moment where you're seeing only a single frame. Multiple frames are being blended together at any given moment due to the "burn in" of the retina.

Also, for your quick flash test, there have actually been tests showing people able to recognize flashes as short as 1/1000th of a second. That means if there's no motion blur, there's a possibility you could detect individual frames even up to 1000fps, if you're good enough. However, once you factor in motion blur (particularly a full 360-degree shutter), it's going to be next to impossible to recognize the difference between 60fps and anything higher. The one case where you could tell is when your eyes happen to perfectly track a fast-moving object, because then the motion blur on the moving object will be noticeably less in the higher frame rate video. Both will still look equally smooth in terms of fluidity, though, for the reasons above: the motion blur lets the constant stream of your eye's visual data receive images that also look like a constant stream, so motion looks more natural. Although I would agree that without motion blur, above 500fps your eyes are going to be mixing close to 10 full frames together at any given time, and in the vast majority of cases, blending 10 frames is enough to simulate motion blur realistically.
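A toy sketch of that ~20ms retinal "burn-in" idea (not a real vision model; the integration window and frame rates are just the numbers from the comment above):

```python
import numpy as np

def eye_blend(signal: np.ndarray, fps: int, integration_ms: float = 20.0) -> np.ndarray:
    """Toy model: perceived brightness at each instant is the average of all
    frames that arrived within the last `integration_ms` milliseconds."""
    window = max(1, round(fps * integration_ms / 1000))
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

# A single full-brightness frame (a one-frame "flash") at various frame rates:
for fps in (60, 240, 1000):
    flash = np.zeros(fps)      # one second of black frames...
    flash[fps // 2] = 1.0      # ...with one bright frame in the middle
    perceived = eye_blend(flash, fps)
    print(f"{fps:4d} fps: flash peaks at {perceived.max():.2f} of full brightness")
# 60 fps -> 1.00 (the flash fills the whole window); 1000 fps -> ~0.05 (mostly blended away)
```

The single bright frame survives intact at 60fps but gets averaged down to a few percent of full brightness at 1000fps, which is one way to picture why isolated frames get harder to spot as the frame rate climbs.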
@@kristianmaas6629 Nature finds a way that works and stops there. If you actually look at how nature works you'll see that most of the time it is not optimized. It just works.
No no no... that's just the basic care plan. But if you get the premium care plan you get "contact LENS": Light Enhancing Neuro Screen (sold individually, discounted if you buy 2)
Technically we could. Just not in a single glance. We could zoom in on an image, even if it's higher resolution than our eyes, and pan around wider than our field of vision would allow. If the camera's colour sensor were better than our eyes, though, we could never tell; we could determine it mathematically, but not by sight.
Cram a 3090 in there and an image-stacker program, then the weird deep learning results would be like how you randomly forget or add things to a memory.
Yeah, basically AI deep learning algorithms in our brains. They fill stuff in almost like DLSS 2, without raw processing every dot of light hitting the eyes. It’s a way to speed up the process, the same way we use DLSS to squeeze more processing out of our graphics card. And it’s not just doing a deep learning temporal upscaler, it’s like a whole stack of these optimization post-process enhancements that our brains filter everything through. Another big feature we have is like a high speed version of taking a panorama photo on an iPhone. Our eyes wiggle back and forth rapidly to increase our effective field of view by temporal stitching, so to speak. When we’re focusing in on something with binocular vision, we are straining our eye muscles to hold our eyes more still to get LESS field of view, as if we’re zooming in. There’s so much going on that makes it really difficult to quantify in camera terms and units.
Bah, just do it like me and hand them to a custom shop after they arrive. Otherwise I would never get my favorite build: green-grey-blue with a brown corona
Technically the eye camera would have a spherically curved sensor, not a flat planar one. Unlike a camera that has a flat CMOS sensor, a human eye has a curved retina. Designing a lens for a curved sensor is actually easier and more efficient than designing rectilinear lenses and you can get better performance out of them. That's why nature evolved curved retinas in animals, not flat ones.
@@SYBIOTE The main problem with designing a curved sensor is the material. It's easy to make nice flat silicon wafers that you can etch into a sensor, but it would be astronomically expensive to machine a piece of silicon into a smooth spherical shape, and to also somehow etch a design into it with UV lithography machines.
This isn't realistic. The iCamera wouldn't come as a set, it will come in left and right models, which you'll have to buy both separately to access the stereoscopic feature.
Plus, the BlindSpot technology allows plugging the power and data cable directly into the sensor, at the cost of a small region of dead pixels. But worry not! You can buy a separate coprocessor to use innovative AI technology and backfill the dead region! (no warranty, especially if backfill causes car crash)
I play on a 240Hz monitor, and after about a year I can see the difference between 144 and 240 with it, though it's not that big. I'm already looking forward to the 360Hz monitors
you can even tell the difference between 120 and 240. But if they use a frickin' Switch that only does 30FPS anyway, how can you tell a difference? They probably didn't watch Linus
The first ISO calculation, in daylight (mostly shadow, and a low sun) said ISO 640, with 1/48s and f/3.2. Really?! My guesstimate would be more like ISO 50. What am I missing here?
And to calculate the minimum ISO they'd have to take the dynamic range figure into account, so it's probably way above ISO 1, since the brightest detail you can see would have to sit at the top stop of the dynamic range while exposing for something around 10 stops darker.
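For reference, a quick sanity check using the standard exposure-value formula (a sketch; the EV-to-lighting comparisons in the comments are the usual photographic rules of thumb, not numbers from the video):

```python
import math

def exposure_value(f_number: float, shutter_s: float, iso: float = 100) -> float:
    """ISO-adjusted exposure value: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number**2 / shutter_s) - math.log2(iso / 100)

# The video's reading vs. the ISO 50 guess, same aperture and shutter speed:
print(f"ISO 640: EV {exposure_value(3.2, 1/48, 640):.1f}")  # ~6.3, dim indoor lighting
print(f"ISO 50:  EV {exposure_value(3.2, 1/48, 50):.1f}")   # ~9.9, closer to shaded daylight
# Open shade in daylight is conventionally around EV 12, so ISO 50 is the closer guess.
```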
Not to be a smart ass, but I'm not sure the 60Hz vs 120Hz TV test was entirely accurate, since as far as I know the Switch can only output 60 FPS. The frame rate has to actually reach the refresh rate of the display for there to be an appreciable difference. But if you were to play a PC game at 60 FPS versus, say, 144 FPS on a high refresh rate monitor, I guarantee you could tell a difference.
Yeah, agreed. I have a 240Hz screen, and with some very specific tests I can definitely tell the difference between 120 and 60Hz. In normal use in games, though, not really. But if the experiment is "can you see it", then yeah, you can.
I am inclined to agree with you. There are TVs that can do native 120 or 100Hz (depending on the frequency of the alternating current supplied to the TV), but the question is whether that's the type of TV they were using. The mode they pressed was also called 'smooth', which makes me think it works via interpolation, i.e. 'faking' extra frames on the screen instead of the console/PC actually generating them.
@@JenoSnetrem So you're saying that you can't tell a significant difference between 120hz and 240hz? Just wondering cause I'm considering buying a 240hz monitor.
Yeah, he has this vibe after 100 takes... Don't believe everything you see; it's a video, a production, and this is their work. Do you see how often it's cut?
Was wondering the same thing; they probably just didn't have an actual 120Hz monitor to show footage that was natively 120fps. But yeah, that high frame rate mode on TVs is just interpolation BS
@@NotNahtan I would be very interested to see them repeat this test with 120Hz, 240Hz, and 360Hz monitors. I know these tests have been done, but it would be interesting to see the Corridor Crew's results
I believe the TV here is an LG OLED. I have one too, and hooking up a real 120FPS game vs. interpolated nonsense are very different things. The test here is more "can people spot that fakey motion TVs produce" than a test of the real ability to see frames. Linus Tech Tips has done better tests on this.
I love how you guys presumably had a brainstorming session for videos, came up with this, and shot it. You guys have transitioned from the short-film culture of 2011 YouTube to the personality-driven culture of 2020 YouTube, and you've managed not to sacrifice the style and creativity that originally attracted people to your channel!
Regarding the eye's framerate: they've done experiments showing that in stressful/life-threatening situations your eye increases its framerate far beyond what you would normally experience. What was the experiment? They gave subjects a display that flashed between 000 and a specific number (like a 2-frame gif) and asked the subjects to identify the number. They then increased the speed at which the display switched between 000 and the number until subjects could no longer identify it. This is the fun bit... They then hoisted the subjects high up on a crane above a big safety net while they held the display. *Then dropped them.* While in freefall, most of the subjects could identify the number on the display at a much higher framerate than they could under "normal" conditions. Edit: added a .
well yeah, because it does not apply to everyone; everyone has their own subjective experience and shortcomings with vision. It's not a standard. Also, our eyes are not cameras.
They tried to Capture the essence of it, tried to Expose us to some scientific facts, Focusing on human biology, in Contrast to some false information spread across the internet.
Okay, but you're using interpolated frames for your high frame rate. That's kinda garbage, cause real-time frame interpolation is far from optimal. I would guess true 120Hz is more easily distinguished from 60Hz than interpolated 120Hz is.
Here's a cool eye hack: put a sunglass lens over one eye. That eye will automatically use a longer "shutter speed", and 2D video (with some motion) will look 3D because of perspective disparity. I'm not kidding.
I got the 2-pack of those fancy cameras. Let me tell you: you should really think about it before buying. First of all, mine have a hardware problem in the color detection. They keep messing up the colors; sometimes there's purple and they only detect blue, other times they render orange when things are green... Really something I did not expect at this price. Secondly, another issue appeared after some time: the cameras can't focus correctly on far objects. I've heard it's a common issue. Again, a hardware problem they never bothered to fix properly. The solution? Mounting an external lens in front of the cameras to correct the flawed product! And last but not least, everybody gets the same problem after many years: the cameras won't focus correctly on close objects. Yes, you read that right: every one of their cameras does this! What a scam! Of course, no warranty covers it. The answer from the retailers? Simple: your product is very old, you should consider replacing it. There again, some people can make you custom lenses to put in front of the cameras to "fix" the problem, but again, it's on you to buy them. And all this is just a fraction of the various problems you can have with this product. Some people even get cameras that don't work at all (due to hardware or software issues). So yes, when their product works correctly, it has insanely good specs and quality. But the problem is, you never know if you'll get a flawed unit. And there's no exchange, no refund.
i've noticed an issue in mine where, even though both cameras are plugged in and fully operational, a software bug only allows one to be used at a time, rendering the 3D feature useless. i just hope they patch it soon
true, I was thinking the same, but I don't think it changes the results. They know a lot about cameras; maybe they know what they're doing, or maybe I'm too dumb!!
@@Silvie3D @kamel Labiad also, although our eyes don't change lenses, the lens of your eye DOES itself change - warping and flexing to focus near or far.
Smash Bros doesn't run at 120 frames. Anything above 60 is going to involve interpolation; at 120Hz that's 50% interpolated frames. A pretty bad test in my opinion, seeing as how you could have easily found a PC game that isn't frame-capped.
Yeah, 50% of people not being able to tell the difference between 60hz and 120hz? I call bull. Just wiggle the cursor around on the desktop and you'll notice, right away.
It wasn't supposed to be a rigorous test. If anything, they probably just did the quickest thing, because the human-eye framerate question has been discussed by so many people that it's the least interesting thing in the video.
Yes. How can you tell the difference between one picture and the same picture shown twice? Does the source have real 60 FPS? What about frame timings? Are they steady?
To be fair, Niko did look up other research. Also, the sample size was just 5 people, which I think undermines the whole idea of experimental testing; 3/5 would be a very common result if you just flipped a coin.
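Spelling out that coin-flip point (plain binomial math; the 5-person, 50/50-guess setup is taken from the comment above):

```python
from math import comb

def p_at_least(k: int, n: int) -> float:
    """Probability of at least k correct out of n pure 50/50 guesses."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

print(f"P(>=3 of 5 by chance) = {p_at_least(3, 5):.2f}")   # 0.50
print(f"P(5 of 5 by chance)   = {p_at_least(5, 5):.3f}")   # ~0.031
# Even a perfect 5/5 only reaches p ~ 0.03; getting 3 of 5 is literally a coin flip.
```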
And peripheral vision has a higher "frame-rate" than your fovea. While the fovea is mostly cones, which are good at distinguishing colours and focusing on details, the peripheral vision is mostly rods, which are good at detecting fast motion. It's why you might notice an old fluorescent light flickering out of the corner of your eye, but when you look at it directly, you can't quite tell whether it's really flickering.
@@mr.dinosuar7333 that actually works, I guess; I had a tendency to shift my focus away from the center of the screen when holding a gap with the AWP in CS because I thought it was easier.
The Switch can't output more than 60 frames per second. You used a TV's temporal interpolation to make it reach 120Hz. That's not the same as looking at actual 120Hz footage, which would be easier to discern from upscaled 60Hz.
Quote from 100fps.com: "How many frames per second can the human eye see? This is a tricky question. And much confusion about it is related to the fact, that this question is NOT the same as: How many frames per second do I have to have to make motions look fluid? And it's not the same as How many frames per second makes the movie stop flickering? And it's not the same as What is the shortest frame a human eye would notice?"
I think it's more like what's the limit of the eye to even get any light data
@@westingtyler1 Radioactive isotopes decay at random intervals following an exponential distribution. One of the main properties of this distribution is its lack of memory, meaning that if you know how long it has been since the last decay, the average time until the next one does not change. It might average out at 24 times a second, but it would not be stable: it would easily jump between 1/20 and 1/28 of a second between decays, and once in a while it could go something like half a second with no decay. Not that often, but you would be sure to notice a half-second freeze frame even if it only happened about once a week. Also, I doubt it.
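A quick simulation of that jitter (a sketch assuming a source averaging 24 decays per second, as in the comment above):

```python
import random

random.seed(42)
RATE = 24.0  # average decays per second

# One minute of inter-decay gaps, drawn from the memoryless exponential distribution.
gaps = [random.expovariate(RATE) for _ in range(24 * 60)]

print(f"mean gap: {sum(gaps) / len(gaps) * 1000:.1f} ms (target: {1000 / RATE:.1f} ms)")
print(f"shortest: {min(gaps) * 1000:.2f} ms, longest: {max(gaps) * 1000:.1f} ms")
# The longest gap in even a single minute typically lands in the hundreds of ms:
# a very visible freeze if each decay were driving a frame.
```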
@@pixels_per_inch I'm guessing that to a very large degree it's about experience and knowledge. As an example: how many complained about 360p, 480p, 720p or even 1080p before they had a chance to compare them to something better? And how many (in the beginning of each step) had the chance to compare the different screens closely enough to actually make out the difference? Also, if I'm not misremembering, one article mentioned that the people who could see things at 500FPS were trained to spot (and react to, I guess) things as fast as possible. So while I think you're right that we have different genetic disparities, most people (pure guess) can probably be trained to see differences at, at least, 100 and, let's say, 144-200FPS. This is of course for in-game/real-world motion and not for movies, as that's a whole other thing.
The tricky part about the question is not that there are different ways to ask it?? The fuck?? The difficulty in pinpointing a benchmark fps that the human eye perceives lies in the fact that there is no standard for a benchmark eye, and the majority of tests done on people in recent years have been on people trained to perceive the slightest change in frames, as evidenced by the air force tests, for example. Or even this video, where they had 5 gamers and CGI artists try to tell 60 from 120 fps, which is a shoddy 50/50 test to try in the first place.
“Blue, Hazel, and Green” we all know DAMN WELL, they’re gonna release a special Elizabeth Taylor model Violet coloured eye camera, that’s gonna be like triple the price and only one added feature smh. Lol all jokes aside, I loved this video, I remember talking about this in undergrad haha
@@Katanaz That's called supersampling. You still get a benefit from it in games. It gives it a sort of natural anti-aliasing quality without actually having to turn on extra anti-aliasing in the post-processing options.
There's a difference between the FPS you see in real time and the FPS you perceive from recorded video on a screen, because you're dealing with the refresh rate of that screen plus other factors; information is lost. I believe 50FPS is about the max we're able to differentiate on a screen.
The highest I've been able to experience is 100Hz. I had my 60Hz monitor overclocked to 72Hz for a while and it yielded a nice little boost in smoothness, though that might have been mostly me imagining it. Making the jump to 100Hz was a different world. Even just moving the cursor felt much more natural and less jerky. I do have a trained eye, though, so I might be part of a group very sensitive to this, and my experience would not be everyone's.
@@DesertCookie I think a moving cursor makes it a lot easier to see differences in FPS, but in actual video footage it's harder to tell. Or you may just have a trained eye.
This shit is a magical feeling every time, it usually happens to me with my favorite twitch streamers since I'm in the US and they're in the UK. Also my sleep schedule is pretty much non-existent so it's even more wild when it happens to me lol
Doesn't Smash Bros Ultimate only run at 60fps? How could you differentiate a 60fps video running at 60Hz from the same 60fps video running at 120Hz? Wouldn't both look visually identical?
yeah, that was an awful test... TVs like the one they used are capable of "creating" frames by gathering data from the frames before and after and "inserting" an artificial frame in between. It creates a little smoothing of the image, so it would look smoother than 60, but nowhere near the same as native 120fps. I'm sure if they used a native 120, dudes who work with 3D graphics all day could easily tell the difference
yeah, tough call. the smoothing a tv does uses "frame interpolation", so technically it does create a new frame that's different from the rendered 60fps frames on either side of it, but those interpolated frames are subject to artifacts of all kinds: ghosting, weird stretched/smeared/doubled pieces of the image, etc. i think it would actually be an okay test if the interpolation did a perfect job, but part of what makes interpolated frames look weirdly noticeable IS the artifacts, so i'd say it's not a great test, because rather than detecting a frame rate difference they might just be detecting the weird artifacts
I'm sure it comes as no surprise, but there were definitely some math problems when it comes to light sensitivity. The maximum f-stop of an eye IS about 3.2, but to use that number it needs to be converted to a 35mm equivalent. The aperture is also not constant, and can vary by about 5-6 stops as the pupil contracts; taking this into account, the dynamic range is effectively reduced by the same number of stops. It should also be noted that the eye has a harder time perceiving fast-moving objects in low light, which means the effective frame rate is not constant either.

The human eye has a true focal length of 23mm, with an objective aperture diameter of 6-7mm in low light and about 1mm in bright light, which, before converting to 35mm format, corresponds to f-stops from f/23 to f/3.3. After converting to 35mm, the effective f-stop ranges from f/50 to f/7.2.

At the end of the day, even though the human eye is very much like a camera, calculating its specifications is extraordinarily difficult because the retina operates in a way radically different from a digital image sensor. The eye has variable resolution across the retina, a variable (and mushy) framerate/shutter speed, variable aperture, variable sensitivity, and an incredible range of focusing power (the eye can perform as a somewhat decent macro lens). And all of this is to say nothing of how the retina processes data, which is nothing like a digital image sensor. A surprising amount of image processing takes place on the retina itself, including basic shapes, basic movement, and growing/shrinking (approaching/receding) objects. The more I learn about the eye, the more intriguing it becomes.
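The f-number arithmetic from that comment, written out as a quick sketch (the ~2.17 crop factor is back-derived from the comment's own f/50 and f/7.2 figures, so it's an approximation, not a measured constant):

```python
# f-number = focal length / aperture diameter
FOCAL_MM = 23.0  # approximate focal length of the human eye (from the comment above)
CROP = 2.17      # 35mm-equivalent crop factor implied by the f/50..f/7.2 figures

for pupil_mm, conditions in ((1.0, "bright light"), (7.0, "low light")):
    f_native = FOCAL_MM / pupil_mm
    f_equiv = f_native * CROP
    print(f"{conditions}: f/{f_native:.1f} native, ~f/{f_equiv:.1f} in 35mm-equivalent terms")
# bright light: f/23.0 native, ~f/49.9 equivalent
# low light:    f/3.3 native,  ~f/7.1 equivalent
```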
It really shows how important good eyesight is for survival. Otherwise evolution wouldn't have given us such a fantastic tool for understanding our environment in a split second.
@@cosecaduteperterra Oh yes, I've thought before about how amazing cameras could be if image sensors were constructed with a curve that matches that specific lens. Of course, then the next problem is translating that curved image sensor onto a flat plane for viewing, but that's a smaller problem.
@@leonmuller8475 I have yet to see a CMOS sensor from any company that can make a less noisy and pixelated image than a CCD one. It's the reason stuff shot on ye olde 35mm film can be scaled to 4K and look like new, while movies from just 10 years ago show their age.
@@psionx1 Well then look at the sensors from Sony, Canon and so on. They build CMOS sensors with resolutions beyond 50MP, many times more than 4K. Also, film is something completely different from a CCD sensor.
As a biotechnology nerd, I love how this sort of demonstrates how weirdly, incredibly good our innate "machinery" is at doing something as incredibly complex as turning light into images. All that's made of ATCG! Freaking crazy
our body is full of neat tricks to get things done. anyway... i just ate noodles, noodles are made of wheat, wheat takes sunlight to grow. so basically I'm a solar-powered machine. so, add it to the eye-camera feature list: solar power.
It also supposes that 75 is the limit because of neuron firing rates, when it's possible to discern differences between 120, 144, 240, and 360Hz displays. Sure, I can't guarantee I could pick them out in a fully blind test, but put them side by side and I can perceive the differences. I mean, I notice within seconds of playing a game when a driver update or something sets my displays back to 60Hz from 144Hz; it's that noticeable. It's not about detail recognition but the actual capacity for information throughput.
@@mrcarebu you didn't listen properly: our neurons fire every ~13ms, but every neuron fires at a different time, so you get an accumulated image, and therefore you can still see the difference with higher framerates
Which is why they used motion interpolation. Which is still not a good test. They definitely have high refresh rate monitors in that studio they could have used.
@@monkfishy6348 Yeah, LTT did that testing with CS:GO on 240Hz, 120Hz, and 60Hz monitors, and they found (via high-speed camera) that some people have a larger range of FPS sensitivity, and that you can sort of train yourself to have better motion vision. (The pro gamers were better, but everyone noticed the increase in frames.) (Look ma! Gaming is good for something)
They should have played professional Quake 3 demos at 60fps and then 120fps (on a monitor that's at least 120Hz). Everyone would have seen the difference; it's huge.