This smart prosthetic ankle adjusts to rough terrain

Prosthetic limbs are getting better and more personalized, but useful as they are, they’re still a far cry from the real thing. This new prosthetic ankle is a little closer than others, though: it moves on its own, adapting to its user’s gait and the surface on which it lands.
Your ankle does a lot of work when you walk: lifting your toe out of the way so you don’t scuff it on the ground, controlling the tilt of your foot to minimize the shock when it lands or as you adjust your weight, all while conforming to bumps and other irregularities it encounters. Few prostheses attempt to replicate these motions, meaning all that work is done in a more basic way, like the bending of a spring or compression of padding.
But this prototype ankle from Michael Goldfarb, a mechanical engineering professor at Vanderbilt, goes much further than passive shock absorption. Inside the joint are a motor and actuator, controlled by a chip that senses and classifies motion and determines how each step should look.

“This device first and foremost adapts to what’s around it,” Goldfarb said in a video documenting the prosthesis.
“You can walk up slopes, down slopes, up stairs and down stairs, and the device figures out what you’re doing and functions the way it should,” he added in a news release from the university.
When it senses that the foot has lifted for a step, it can raise the toe to keep it clear, also exposing the heel so that when the limb comes down, it can roll into the next step. And by reading the pressure both from above (indicating how the person is using that foot) and from below (indicating the slope and irregularities of the surface), it can make that step feel much more like a natural one.
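The article doesn’t publish the device’s control logic, but the behavior described here (raise the toe during swing, then pick an ankle angle from the load above and the pressure below) maps onto a simple control loop. The Python sketch below is purely illustrative; every sensor name, threshold and angle is an assumption, not Goldfarb’s actual implementation.

```python
# Hypothetical sketch of the gait-adaptation loop described above.
# Sensor names, thresholds and angles are invented for illustration only;
# this is not the code running on the Vanderbilt prosthesis.

def classify_terrain(sole_pressure):
    """Guess the ground condition from the pressure under the foot."""
    toe, heel = sole_pressure["toe"], sole_pressure["heel"]
    if toe > heel * 1.5:
        return "uphill"      # weight concentrated at the toe
    if heel > toe * 1.5:
        return "downhill"    # weight concentrated at the heel
    return "level"

def step_controller(load_above, sole_pressure, foot_in_swing):
    """Return a target ankle angle in degrees for the current phase of the step."""
    if foot_in_swing:
        return 15.0          # dorsiflex: lift the toe so it clears the ground
    terrain = classify_terrain(sole_pressure)
    if terrain == "uphill":
        return 8.0           # tilt the foot up to match the slope
    if terrain == "downhill":
        return -8.0          # tilt the foot down, softening the heel strike
    # On level ground, scale push-off with how hard the user is loading the limb.
    return -5.0 * min(load_above / 800.0, 1.0)

# Example: mid-stance on a slight uphill with 600 N of load from above.
angle = step_controller(600.0, {"toe": 30.0, "heel": 15.0}, foot_in_swing=False)
print(f"target ankle angle: {angle:.1f} degrees")
```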

One veteran of many prostheses, Mike Sasser, tested the device and had good things to say: “I’ve tried hydraulic ankles that had no sort of microprocessors, and they’ve been clunky, heavy and unforgiving for an active person. This isn’t that.”
Right now the device is still very lab-bound, and it runs on wired power — not exactly convenient if someone wants to go for a walk. But if the joint works as designed, as it certainly seems to, then powering it is a secondary issue. The plan is to commercialize the prosthesis in the next couple of years once all that is figured out. You can learn a bit more about Goldfarb’s research at the Center for Intelligent Mechatronics.

Source: Gadgets – TechCrunch

HoloLens acts as eyes for blind users and guides them with audio prompts

Microsoft’s HoloLens has an impressive ability to quickly sense its surroundings, but limiting it to displaying emails or game characters on them would show a lack of creativity. New research shows that it works quite well as a visual prosthesis for the vision impaired, not relaying actual visual data but guiding them in real time with audio cues and instructions.
The researchers, from Caltech and the University of Southern California, first argue that restoring vision is at present simply not a realistic goal, but that replacing the perception portion of vision isn’t necessary to replicate the practical portion. After all, if you can tell where a chair is, you don’t need to see it to avoid it, right?
Crunching visual data and producing a map of high-level features like walls, obstacles and doors is one of the core capabilities of the HoloLens, so the team decided to let it do its thing and recreate the environment for the user from these extracted features.
They designed the system around sound, naturally. Every major object and feature can tell the user where it is, either by speaking its name or with a characteristic sound. Walls, for instance, hiss (presumably white noise, not a snake hiss) as the user approaches them. And the user can scan the scene, with objects announcing themselves one by one from left to right, each from the direction in which it is located. A single object can be selected and will repeat its callout to help the user find it.
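The paper runs this on the HoloLens itself, but the scanning behavior described above is easy to picture in code: sort the detected objects by bearing and announce each one from its direction. The sketch below is a hypothetical illustration; the scene contents, the callouts and the play_spatial_audio helper are invented, not the researchers’ API.

```python
# Hypothetical sketch of the left-to-right "scan" described above.
# The scene contents and the audio helper are invented for illustration;
# the real system uses HoloLens spatial mapping and spatialized audio.
from dataclasses import dataclass
from typing import List

@dataclass
class SceneObject:
    label: str       # e.g. "chair", "door", "wall"
    bearing: float   # degrees from the user's heading; negative = left

def play_spatial_audio(text: str, bearing: float) -> None:
    """Stand-in for spatialized text-to-speech coming from `bearing`."""
    side = "left" if bearing < 0 else "right"
    print(f'[{bearing:+.0f} deg, {side}] "{text}"')

def scan_scene(objects: List[SceneObject]) -> None:
    """Announce every detected object once, sweeping from left to right."""
    for obj in sorted(objects, key=lambda o: o.bearing):
        play_spatial_audio(obj.label, obj.bearing)

def guide_to(target: SceneObject, repeats: int = 3) -> None:
    """Repeat a single object's callout so the user can home in on it."""
    for _ in range(repeats):
        play_spatial_audio(target.label, target.bearing)

scene = [SceneObject("door", 40), SceneObject("chair", -25), SceneObject("table", 5)]
scan_scene(scene)    # announces: chair (left), table (center), door (right)
guide_to(scene[1])   # keep calling out the chair until the user reaches it
```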
That’s all well for stationary tasks like finding your cane or the couch in a friend’s house. But the system also works in motion.
The team recruited seven blind people to test it out. They were given a brief intro but no training, and then asked to accomplish a variety of tasks. The users could reliably locate and point to objects from audio cues; they found a chair in a room in a fraction of the time it would normally take, and avoided obstacles easily as well.
This render shows the actual paths taken by the users in the navigation tests
Then they were tasked with navigating from the entrance of a building to a room on the second floor by following the headset’s instructions. A “virtual guide” repeatedly said “follow me” from an apparent distance of a few feet ahead, while also warning when stairs were coming, where handrails were, and when the user had gone off course.
All seven users got to their destinations on the first try, and much more quickly than if they had had to proceed normally with no navigation. One subject, the paper notes, said “That was fun! When can I get one?”
Microsoft actually looked into something like this years ago, but the hardware just wasn’t there — HoloLens changes that. Even though it is clearly intended for use by sighted people, its capabilities naturally fill the requirements for a visual prosthesis like the one described here.
Interestingly, the researchers point out that this type of system was predicted more than 30 years ago, long before such systems were even close to possible:
“I strongly believe that we should take a more sophisticated approach, utilizing the power of artificial intelligence for processing large amounts of detailed visual information in order to substitute for the missing functions of the eye and much of the visual pre-processing performed by the brain,” wrote the clearly far-sighted C.C. Collins way back in 1985.
The potential for a system like this is huge, but this is just a prototype. As systems like HoloLens get lighter and more powerful, they’ll go from lab-bound oddities to everyday items — one can imagine the front desk at a hotel or mall stocking a few to give to vision-impaired folks who need to find their room or a certain store.
“By this point we expect that the reader already has proposals in mind for enhancing the cognitive prosthesis,” they write. “A hardware/software platform is now available to rapidly implement those ideas and test them with human subjects. We hope that this will inspire developments to enhance perception for both blind and sighted people, using augmented auditory reality to communicate things that we cannot see.”

Source: Gadgets – TechCrunch

Microsoft’s Xbox Adaptive Controller is an inspiring example of inclusive design

Every gamer with a disability faces unique challenges, one of which is the relative dearth of accessibility-focused peripherals for consoles. Microsoft is taking a big step toward fixing this with its Xbox Adaptive Controller, a device created to address the needs of gamers for whom ordinary gamepads aren’t an option.
The XAC, revealed officially at a recent event but also leaked a few days ago, is essentially a pair of gigantic programmable buttons and an oversized directional pad; 3.5mm ports on the back let a huge variety of assistive devices like blow tubes, pedals and Microsoft-made accessories plug in.
It’s not meant to be an all-in-one solution by any means; it’s more like a hub that lets gamers with disabilities make and adjust their own setups with a minimum of hassle. Whatever you’re capable of, whatever’s comfortable, whatever gear you already have, the XAC is meant to enable it.
I’d go into detail, but it would be impossible to do better than Microsoft’s extremely interesting and in-depth post introducing the XAC, which goes into the origins of the hardware, the personal stories of the testers and creators, and much more. It’s absolutely worth taking the time to read.
I look forward to hearing more about the system and how its users put it to use, and I’m glad to see inclusivity and accessibility being pursued in such a practical and carefully researched manner.

Source: Gadgets – TechCrunch

Get the latest TC stories read to you over the phone with BrailleVoice

For the visually impaired, there are lots of accessibility options if you want to browse the web — screen readers, podcast versions of articles and so on. But it can still be a pain to keep up with your favorite publications the way sighted app users do. BrailleVoice is a project that puts the news in a touch-tone phone interface, reading out the latest stories from your favorite publications (like this one) from anywhere you can get a signal.

It’s from SpaceNext, AKA Shan, whose page lists a variety of useful little apps he’s developed over the years; John wrote one up back in 2011. Several of them have an accessibility aspect to them, something that always piques my interest.

“Visually challenged users will find it difficult to navigate using apps,” he wrote in an email. “I thought with text to speech readily available… they would be able to make a call to a toll free number to listen to latest news from any site.”

All you do is dial 1-888-666-4013, then listen to the options on the menu. TechCrunch is the first outlet listed, so hit 1# and it’ll read out the headlines. Select one (of mine) and it’ll jump right in. That’s it! There are a couple of dozen sites listed right now, from LifeHacker (hit 15#) to the Times of India (hit 26#). You can also suggest new sites to add, presumably as long as they have some kind of RSS feed. (This should be a reminder of why you should keep your website or news service accessible in some similar manner.)
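
Shan hasn’t published how the service is built, but conceptually it’s a touch-tone menu laid over RSS: map each keypress to a feed, fetch the headlines and hand them to a text-to-speech engine. The sketch below is a guess at that flow; the menu mapping and the speak() stub are assumptions, not BrailleVoice’s actual code.

```python
# Rough, hypothetical sketch of a dial-in news reader: a numeric menu mapped
# to RSS feeds, with each headline handed to a text-to-speech stub.
# The menu numbers and speak() helper are invented; this is not BrailleVoice's code.
import feedparser  # third-party: pip install feedparser

MENU = {
    "1": "https://techcrunch.com/feed/",   # option 1# in the phone menu
    # ...further publications would be listed here
}

def speak(text: str) -> None:
    """Stand-in for the telephony text-to-speech engine."""
    print(f"TTS> {text}")

def handle_keypress(digits: str, max_items: int = 5) -> None:
    """Fetch the chosen feed and read out its latest headlines."""
    url = MENU.get(digits)
    if url is None:
        speak("Sorry, that option is not available.")
        return
    feed = feedparser.parse(url)
    for i, entry in enumerate(feed.entries[:max_items], start=1):
        speak(f"Headline {i}: {entry.title}")

handle_keypress("1")  # the caller pressed 1# for TechCrunch
```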

“More importantly,” he continued, “this works even without internet even in the remotest of places. You can listen to your favorite news site without having to spend a dime or worry about internet.”

Assuming you can get a voice signal and you’ve got minutes, anyway. I quite like the idea of someone walking into the nearest town, pulling out their old Nokia, dialing this up and keeping as up to date as the most news-addicted of us.

The text-to-speech engine is pretty rudimentary, but it’s better than what we all had a few years back, and it’ll only get better as improved engines like Google’s and Apple’s trickle down for general-purpose use. I’m going to ask them about that, actually.

It’s quite a basic service, but what more does it need to have, really? Shan is planning to integrate voice controls into the likes of Google Home and Alexa, so there’s that. But as it is, it may be enough to provide plenty of utility to the vision-impaired. Check out TextOnly too. I could use that for desktop.

Source: Mobile – TechCrunch