
An armband to control prosthetic hands 

UC Berkeley

UC Berkeley researchers have created a new device that combines wearable biosensors with artificial intelligence software to help recognize what hand gesture a person intends to make based on electrical signal patterns in the forearm. The device paves the way for better prosthetic control and seamless interaction with electronic devices.
Story excerpts:
“Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers,” said Ali Moin, who helped design the device as a doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences. “Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”
...
Moin is co-first author of a new paper describing the device, which appeared online Dec. 21, 2020 in the journal Nature Electronics.
...
Andy Zhou is co-first author of this paper. Other authors include Abbas Rahimi, Alisha Menon, George Alexandrov, Senam Tamakloe, Jonathan Ting, Natasha Yamamoto, Yasser Khan and Fred Burghardt of UC Berkeley; Simone Benatti of the University of Bologna; and Luca Benini of ETH Zürich and the University of Bologna.
...
“When Amazon or Apple creates their algorithms, they run a bunch of software in the cloud that creates the model, and then the model gets downloaded onto your device,” said Jan Rabaey, the Donald O. Pedersen Distinguished Professor of Electrical Engineering at UC Berkeley and senior author of the paper. “The problem is that then you’re stuck with that particular model. In our approach, we implemented a process where the learning is done on the device itself. And it is extremely quick: You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better. So, it is continuously learning, which is how humans do it.”
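Rabaey's description of one-shot, on-device learning can be sketched in code. The sketch below is purely illustrative and does not reproduce the paper's actual in-sensor algorithm: it uses a simple nearest-centroid classifier (all names and feature values are hypothetical) that stores one prototype feature vector per gesture, starts classifying after a single labeled example, and keeps refining each prototype as more labeled examples arrive.

```python
import numpy as np

class IncrementalGestureClassifier:
    """Toy on-device learner: one running-mean prototype per gesture label."""

    def __init__(self):
        self.prototypes = {}  # gesture label -> running-mean feature vector
        self.counts = {}      # gesture label -> number of examples folded in

    def update(self, features, label):
        """Fold one labeled EMG feature vector into the label's prototype."""
        f = np.asarray(features, dtype=float)
        if label not in self.prototypes:
            # One example is enough to start recognizing this gesture.
            self.prototypes[label] = f.copy()
            self.counts[label] = 1
        else:
            # Running mean: each new example nudges the prototype toward it,
            # so the model keeps improving as it is used.
            self.counts[label] += 1
            self.prototypes[label] += (f - self.prototypes[label]) / self.counts[label]

    def predict(self, features):
        """Return the label whose prototype is closest to the input."""
        f = np.asarray(features, dtype=float)
        return min(self.prototypes,
                   key=lambda lbl: np.linalg.norm(f - self.prototypes[lbl]))

# One-shot training with two hypothetical gestures, then a prediction.
clf = IncrementalGestureClassifier()
clf.update([0.9, 0.1, 0.0], "fist")
clf.update([0.1, 0.8, 0.1], "open")
print(clf.predict([0.85, 0.15, 0.05]))  # -> fist
```

Because all learning happens in `update`, nothing has to round-trip through the cloud: the model lives, trains, and improves on the wearable itself, which is the property Rabaey contrasts with cloud-trained, downloaded models.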
...
While the device is not ready to be a commercial product yet, Rabaey said that it could likely get there with a few tweaks.
...
This work was supported, in part, by the CONIX Research Center, one of six centers in JUMP, a Semiconductor Research Corporation (SRC) program sponsored by the U.S. Department of Defense’s Defense Advanced Research Projects Agency (DARPA). The work is also based, in part, on research sponsored by the Air Force Research Laboratory under agreement number FA8650-15-2-5401, as conducted through the Flexible Hybrid Electronics Manufacturing Innovation Institute, NextFlex. Additional support was received from sponsors of the Berkeley Wireless Research Center; the National Science Foundation Graduate Research Fellowship, under grant number 1106400; the ETH Zurich Postdoctoral Fellowship program and the Marie Sklodowska-Curie Actions for People COFUND program.
...
For full story, visit: news.berkeley....
Video courtesy Nature journals and the Rabaey Lab
news.berkeley.edu/

Published: Oct 2, 2024

Comments: 6
@brainstormingsharing1309 · 3 years ago
Absolutely well done and definitely keep it up!!! 👍👍👍👍👍
@TheChristmasCreeper · 3 years ago
I wonder if you could get this to work by thinking about making a gesture without actually making it. For instance, in real life you could be lying in bed wearing a VR headset while you think about moving your arms and legs without actually moving them. The device picks up the electrical signals in your arms and legs and lets you move in a virtual environment without having to do it in real life.
@UltimateRobotics · 3 years ago
EMG doesn't register anything until some signal actually reaches the muscles. But with high enough signal quality, it's possible to recognize activity when a muscle barely moves - so instead of a full gesture, a slight movement can be picked up. That is really sensitive to noise, though - especially noise caused by changes in contact properties - so while it's definitely possible in ideal conditions, it might not work well in a real-life scenario (you can check the videos on our channel; there are some muscle activity visualizations that might be interesting in this respect).
@MikeTrieu · 3 months ago
@UltimateRobotics It might be a bit more uncomfortable, but I saw a demonstration of "microblading" where someone embedded artificial eyebrows into their hand. I wonder if a similar technique could be applied to electrodes to obtain a cleaner signal, bypassing the epidermis? I'd be concerned about infection, though.
@UltimateRobotics · 3 months ago
@MikeTrieu Definitely - the most advanced systems implant electrodes deep in the arm, right on the muscle surface, with the whole device placed under the skin, using wireless charging and an IR LED for data transfer. But such an approach adds a lot of medical complexity - if not done right, the consequences can be quite unpleasant.
@bigidigi · 3 years ago
Amazing! Do you have an open-source GitHub repo to contribute to?