IEEE Transactions on Robotics
The IEEE Transactions on Robotics (T-RO) publishes research papers that represent major advances in the state-of-the-art in all areas of robotics. The Transactions welcomes original papers that report on any combination of theory, design, experimental studies, analysis, algorithms, and integration and application case studies involving all aspects of robotics. For more information visit www.ieee-ras.org/publications/t-ro.
Learning Modular Robot Control Policies
1:59
10 months ago
Comments
@Bruce_comics 26 days ago
That's cool
@draggador 2 months ago
I've been waiting for non-invasive robo-prosthetics and neuro-prosthetics to be developed since I first learned about them around half a decade ago as part of a university course assignment. Finally, it happened.
@mrvalveras 9 months ago
Trans people get exos? The world is not fair!
@sirave6017 10 months ago
Okay, Jesus Christ. Ten years out of college with only a bachelor's in materials science and experience in aerospace engineering metallographic inspection analysis and the medical industry... I think what you're trying to say, in a nutshell, is: you've been able to develop a robot arm that flexes like a spine/worm/snake in all directions as smoothly as possible? It looks like you're using those 'wound string motors', from what I can tell, to pull on different joints and manipulate it in practical/functional ways at certain times. What does 'pose error' mean? May I ask what the actuation tendons and passive strings are made of? Those are not also nitinol, correct? Anyway, very cool. I understand if you can't answer some of my questions, but I would love to get a response.
@harry1010 10 months ago
I love the wide variety of actuators for these control policy expressions! Just a bunch of lil’ guys having a lil’ stroll
@alansteyrbach6926 10 months ago
Amazing!
@pradeepchaudhary5996 1 year ago
Great watch, and inspiring.
@bobsmithy3103 1 year ago
It's kind of neat, though I do wonder in what scenarios this would be preferable to other existing methods.
@user-qg1rk1wu5r 1 year ago
Wow!!! It's great!
@Rick1234567S 1 year ago
Want more information? So, like a transformer, you need solar panels on its back that unfold, and it lies down flat when it needs to recharge; then it gets up and continues to go as long as you want. Will it die like a turtle or a sheep if it falls on its back when it is losing power? So then you need a failsafe, and that could be a small battery or solar panels on its knees. So what if it takes longer to power up and turn over? You won't always have a signal to it, so you need to pre-program some behaviors: have it text you with its satellite phone technology, give you its coordinates.

So some moves that you want to teach it might be to zig-zag, avoid gunfire. Then it might trigger that by sound, a recognizable sound, that activates the zig-zag play. Maybe it can release a 6-inch drone to have eyes in the air and watch for moving things, maybe zoom in and identify them by looking at the geometry. Maybe infrared, where all you need is a floppy disk used as a filter and you will see infrared. It doesn't have to be expensive. A range finder will tell your robot a lot. These are like the laser measuring tapes used in construction: a beam hits the wall and gives you the distance between walls. Not expensive. Tesla cars work like that. 3D scanners are similar. So you combine 3D beams with 2D bitmap technology, keeping it simple. A small program: use Delphi 7 and you will get code that is 1 megabyte. Don't use Windows 11; it will give you code that is one gigabyte.
@Rick1234567S 1 year ago
Robotics is all about vision and identifying objects, and that is programmed by using bitmaps and comparing them to other things, like facial recognition, like self-driving cars. You could put three stripes on a pole and give it meaning to a car, or robot, or communication system, all through visuals. You can teach through similar means, record what the robot does, and then play it back. So it depends on your access to a database, like facial recognition, and the speed of identification, which in the subway trains is very advanced and very fast, requiring a huge infrastructure. A Tesla car uses a lot of processing power. So recording movements as plays, with real-time optioning, is a lightweight system for small robotics.

Knowing the terrain is important. In a building it's easy; in a parking lot it's easy; on rough terrain, not so easy. As for ChatGPT, does it pass the Turing test? You see, you don't need to pass the Turing test. You merely need to be able to talk about some subject matter that you have. If you create a farmer website, hook it up to a weather feed, and have him say things based on the weather that a farmer would say, he is responding to random events. It looks like he is aware and conscious, if you don't try to talk to him. He might say, "Gee, it's hot today. Too hot if you ask me. Not good for crops." He might say, "Rain again today. Too darn wet if you ask me. All that rain is sure to hurt the rhubarb." You can make them as human as you want: garbage in, garbage out, or expect the expected. A farmer is like this, so make him look and act like one. Do the same for your robot and it will always appear successful, able to meet objectives.