Open Source Low-Cost, Portable and High-Performance Bionic Leg
A new open-source, artificially intelligent prosthetic leg designed by researchers at the University of Michigan and Shirley Ryan AbilityLab is now available to the scientific community. As an open-source project, anyone can contribute to improving the leg’s design and function.
Although the designs and code are free, the leg is still a high-end, state-of-the-art prosthetic, according to the researchers. It’s built around a plug-and-play architecture that will let scientists and biomedical engineers avoid research-and-development costs in the millions of dollars and immediately begin testing on prosthetics for the knee and ankle. It effectively lowers the barriers to entry for researchers.
Researchers who work directly with people with disabilities often have to build their own robotic legs. Instead of starting from scratch, researchers can take this common platform and, after some assembly, begin working on better ways to help people with mobility impairments. The common platform also lets engineers conduct direct comparisons of control algorithms, which researchers can then iterate and build upon.
The key to making it work is AI. The Raspberry Pi-powered control system combines muscle-contraction signals with sensor data from within the bionic leg to predict what the user will do next, and responds accordingly.
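To make the idea concrete, here is a minimal sketch of that kind of sensor fusion: a muscle-activity feature (the RMS envelope of an EMG window) is combined with a joint-sensor feature (knee angular velocity) to guess the next gait phase. The function names, phase labels, and thresholds are illustrative assumptions, not the project's actual control algorithm, which researchers would tune and likely replace with a learned model.

```python
import math

def emg_rms(window):
    """Root-mean-square envelope of a window of raw EMG samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def predict_phase(emg_window, knee_velocity_dps,
                  rms_threshold=0.3, velocity_threshold=50.0):
    """Fuse a muscle-activity feature with a joint-sensor feature to
    guess the user's next gait phase. Thresholds are illustrative only."""
    active = emg_rms(emg_window) > rms_threshold
    swinging = abs(knee_velocity_dps) > velocity_threshold
    if active and not swinging:
        return "push-off"   # strong muscle signal, joint still: prepare to propel
    if swinging:
        return "swing"      # joint moving fast: leg is in flight
    return "stance"         # quiet muscle, slow joint: support phase

# Example: a burst of EMG activity while the knee is nearly still
print(predict_phase([0.5, -0.6, 0.4, -0.5], knee_velocity_dps=10.0))  # push-off
```

A real controller would run this decision at a fixed rate on the Raspberry Pi and feed the predicted phase into the knee and ankle motor commands; the value of the common platform is that different teams can swap in their own `predict_phase` and compare results directly.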
