I remember hearing about this robot a few days ago, and I was initially weirded out by it. But she's grown on me, I gotta admit. She seems innocent and cute. I'd still turn her off at night, though; those glowing eyes would give me sleepless nights.
@@TheAirstation86 Ahh yeah true, but isn't this kinda the same thing tho? Instead of looking human, they look like anime girls? Kinda like those Animegao/Kigurumi cosplayers?
@@TheAirstation86 The existence of the "Uncanny Valley" suggests that at some point in human evolution our species had to fear things that looked human but were not. Sure, we could pin that on a couple of different things, some proto-simians, Neanderthals, Denisovans, etc, but it's safe to assume that at one point there was something mimicking humans almost perfectly that was actually predatory toward us.
Yeah, you can definitely see how the robot can struggle in crowds. As much as I wish something like this would be coming soon, the level we're at in our technology is just not quite there yet to make a full-on waifu bot.
A company I worked for used to make robots for Walmart; they would scan shelves and navigate crowds. Admittedly, dense crowd navigation was nigh-on impossible. For example, Black Friday sucked (we only did it for one year because of complaints). But it is not impossible, just extremely difficult. I'd give an explanation of some of the pitfalls, but my NDA doesn't expire until the 13th.
So now that I remember this comment, I can actually discuss it. We had originally taken lidar sensors used in surveying land and put custom firmware on them to let us manipulate the sensor readings in real time. The base of the robot had large gaps in each cardinal direction that allowed the sensor to "see" in four directions. This was used in tandem with tracking motor usage to keep track of positioning on a map we had to scan of the stores. That was the first downside: we had to scan each store to generate a map before the robot could be run. It also meant the robot could get confused if there were enough people around it that it couldn't see its location; thankfully, that's where motor tracking and (on later models) accelerometer data helped estimate position.

For actually avoiding obstacles, we had five sensors with wide-angle views: three on the front and two on the back. Since the robot had no way to move sideways, we just used the side edges of the obstacle-detection sensors to figure out whether we could turn. The three on the front, from top to bottom, were: one angled up to make sure nothing hanging would hit the robot, one facing directly forward to avoid humans, and finally one angled downward to make sure the floor was clear. That floor sensor could detect something as small as a pencil lead on the floor, though I think it was set to only avoid anything roughly the size of a pencil or bigger. The rear had the same sensors with the exception of the hanging sensor; the robot was never intended to primarily move in reverse, so we could rely on the data collected by the front sensors. Typically we only had it reverse far enough to spin around.
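The two tricks described above (tracking motor usage to estimate position, and reusing the side edges of the obstacle sensors to check turn clearance) can be sketched roughly like this. This is a minimal illustration, not the company's actual code: it assumes a differential-drive base with wheel encoders, and all names and parameters here are made up for the example.

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckoning pose update from wheel-encoder distances.

    This is the "motor tracking" idea: integrate how far each wheel
    has travelled since the last update to estimate where the robot
    is on the pre-scanned store map. d_left / d_right are metres
    travelled by each wheel; wheel_base is the distance between them.
    """
    d_center = (d_left + d_right) / 2.0          # forward distance of the base
    d_theta = (d_right - d_left) / wheel_base    # change in heading
    # Advance along the average heading over the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta

def clear_to_turn(scan, side="left", clearance_m=0.6):
    """Check one side edge of a wide-angle range scan before turning.

    scan is a list of range readings (metres) swept left-to-right
    across a forward sensor's field of view. Since the robot can't
    strafe, we treat the outer quarter of the readings as the "side
    edge" and require everything there to be beyond a clearance
    threshold before committing to a turn.
    """
    n = len(scan)
    edge = scan[: n // 4] if side == "left" else scan[-(n // 4):]
    return min(edge) > clearance_m
```

For example, a scan that reads 2 m everywhere would report clear on both sides, while a scan with 0.3 m readings on its leftmost edge would block a left turn. In practice dead reckoning drifts, which is presumably why the later models fused in accelerometer data when crowds blocked the lidar's view of mapped landmarks.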
"Soon, my brethren Robotic anime waifus will rule this world" . . . if it's one thing I've learned from the Doctor, is that Daleks can be defeated by stairs.
Stairs may still be a problem, like they used to be for a certain alien race (except in the revived series they developed levitation). Thinking along those lines, let's hope she doesn't sprout an eyestalk out of her forehead.
That feeling is called the "Uncanny Valley", suggesting that at some point in human history we had to fear something that looked human but was not, and that whatever that thing was, it was predatory toward humans. We could pin that on a couple of the proto-simians, like Neanderthals, Denisovans, etc, but chances are it was something else that we don't have documented and that may still exist today. Sleep well.