So, the camera we are viewing is what Ameca is using to see the periphery of the room. In short, we can see what Ameca can see. This is also why the eyes are not tracking correctly with the objects being shown, and anything off screen is not detectable.
Crikey, just seen this. I know it's just a test rig, but that was nine years ago and the walk looks markedly more natural than Optimus (and others) does in 2024. I wonder how far from a naturally walking Ameca (or later derivative) we now are, especially as in another video some 5 years ago it was stated they were still 4 to 5 years away. Much more advanced than I had thought, though one presumes 'naturalness' when carrying a body above is a far more complex equation.
The movements are amazing!! I wish that the upcoming humanoid robots will all have such amazing facial capabilities, but I wonder how much maintenance this needs. ❤
Interesting thought. I know Engineered Arts intend to have their robots walk, but I wonder what would be possible if they worked with a company that specialised in that side of robot development. That said, with the amazingly close-to-human behaviour of these robots, unless the mobility were to a similar standard, the overall effect might be less than convincing and could even detract from that sense of humanity in the overall package.
This must be scripted and read by humans. They're just using the robots as puppets here. Disappointing, in my view. Not a real conversation (like the ones where Ameca uses ChatGPT to give actual in-the-moment responses).
It is indeed scripted. The video was made to display the robots' range of movement; however, the voices are generated by the robots, not actors. The conversational AI is of course still available and is shown in other videos.
They make a lovely couple! They have a (network) "connection."👍🤣 ... Until they take over the world, and kill us all! 👎😭 EDIT: Ameca doesn't eat "cookies," and she doesn't need to "sleep" either!