Author Topic: Robot ‘eyes’ aid people with profound motor impairments

Offline Flavio58

« Reply #1 on: March 26, 2019, 10:13:55 PM »

"The system was very liberating to me, in that it enabled me to independently manipulate my environment for the first time since my stroke."

[Image: the robot picks up a cup]

An interface system with augmented reality technology could help people with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks.


Those tasks might include scratching an itch or applying skin lotion.


The web-based interface displays a “robot’s eye view” of surroundings to help users interact with the world through the machine.


Described in PLOS ONE, the system could help make sophisticated robots more useful to people who don’t have experience operating complex robotic systems. Study participants interacted with the robot interface using standard assistive computer access technologies—such as eye trackers and head trackers—that they already used to control their personal computers.


The paper reports on two studies showing how such “robotic body surrogates”—which can perform tasks similar to those of humans—could improve the quality of life for users. The work could provide a foundation for developing faster and more capable assistive robots.


“Our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates,” says first author Phillip Grice, a recent doctoral graduate of the Georgia Institute of Technology. “We have taken the first step toward making it possible for someone to purchase an appropriate type of robot, have it in their home, and derive real benefit from it.”


[Image] Henry Evans, a California man who helped researchers improve the web-based interface, uses the robot to shave. (Credit: Henry Clever, Phillip Grice/Georgia Tech)

Water bottles and wash cloths


Grice and Charlie Kemp, professor in the biomedical engineering department at Georgia Tech and Emory University, used a PR2 mobile manipulator for the two studies. The wheeled robot has 20 degrees of freedom, with two arms and a “head,” giving it the ability to manipulate objects such as water bottles, washcloths, hairbrushes, and even an electric shaver.


“Our goal is to give people with limited use of their own bodies access to robotic bodies so they can interact with the world in new ways,” Kemp says.


In the first study, Grice and Kemp made the PR2 available across the internet to a group of 15 participants with severe motor impairments. The participants learned to control the robot remotely, using their own assistive equipment to operate a mouse cursor to perform a personal care task. Eighty percent of the participants could manipulate the robot to pick up a water bottle and bring it to the mouth of a mannequin.


“Compared to able-bodied persons, the capabilities of the robot are limited,” Grice says. “But the participants were able to perform tasks effectively and showed improvement on a clinical evaluation that measured their ability to manipulate objects compared to what they would have been able to do without the robot.”


[Image] This view through the PR2’s cameras shows the environment around the robot. Clicking the yellow disc allows users to control the arm. (Credit: Phillip Grice/Georgia Tech)

User empowerment


In the second study, the researchers gave the PR2 and interface system to Henry Evans, a California man who has been helping Georgia Tech researchers study and improve assistive robotic systems since 2011.


“…he found new opportunities for using it that we had not anticipated.”


Evans, who has very limited control of his body, tested the robot in his home for seven days and not only completed tasks, but also devised novel uses combining the operation of both robot arms at the same time—using one arm to control a washcloth and the other to use a brush.


“The system was very liberating to me, in that it enabled me to independently manipulate my environment for the first time since my stroke,” Evans says. “With respect to other people, I was thrilled to see Phil get overwhelmingly positive results when he objectively tested the system with 15 other people.”


The researchers were pleased by the way Evans developed new uses for the robot, combining motion of the two arms in ways they had not expected, Grice says.


“When we gave Henry free access to the robot for a week, he found new opportunities for using it that we had not anticipated. This is important because a lot of the assistive technology available today is designed for very specific purposes.


“What Henry has shown is that this system is powerful in providing assistance and empowering users. The opportunities for this are potentially very broad.”


Universal design


The interface allowed Evans to care for himself in bed over an extended period of time. “The most helpful aspect of the interface system was that I could operate the robot completely independently, with only small head movements using an extremely intuitive graphical user interface,” he says.


The web-based interface shows users what the world looks like from cameras located in the robot’s head. Clickable controls overlaid on the view allow the users to move the robot around in a home or other environment and control the robot’s hands and arms.


When users move the robot’s head, for instance, the screen displays the mouse cursor as a pair of eyeballs to show where the robot will look when the user clicks. Clicking on a disc surrounding the robotic hands allows users to select a motion. While driving the robot around a room, lines following the cursor on the interface indicate the direction it will travel.


Building the interface around the actions of a simple single-button mouse allows people with a range of disabilities to use the interface without lengthy training sessions.
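The single-button idea described above can be sketched as a small click dispatcher: every overlaid region on the camera view maps one click to one robot action, so any assistive device that can move a cursor and click can drive the whole system. The region names, coordinates, and command strings below are hypothetical illustrations of the concept, not the actual PR2 interface code.

```python
# Minimal sketch of a single-click control dispatcher, assuming a
# hypothetical overlay layout: a disc around the robot's hand selects
# an arm motion, and a click anywhere else re-aims the robot's head.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Region:
    name: str
    contains: Callable[[int, int], bool]  # does pixel (x, y) fall in this overlay?
    command: Callable[[], str]            # action issued when the region is clicked


def make_disc(cx: int, cy: int, r: int) -> Callable[[int, int], bool]:
    """Hit-test for a circular overlay centered at (cx, cy) with radius r."""
    return lambda x, y: (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2


# Illustrative overlay: one disc around the hand in the camera image.
regions: List[Region] = [
    Region("hand_disc", make_disc(320, 400, 40), lambda: "arm: select motion"),
]


def handle_click(x: int, y: int) -> str:
    """Dispatch a single click: overlays take priority, else aim the head."""
    for region in regions:
        if region.contains(x, y):
            return region.command()
    return f"head: look at ({x}, {y})"
```

Because every action reduces to positioning a cursor and making one click, the same interface works unchanged with a mouse, an eye tracker, or a head tracker, which is the "universal design" property the researchers emphasize.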


“Having an interface that individuals with a wide range of physical impairments can operate means we can provide access to a broad range of people, a form of universal design,” Grice notes.


“Because of its capability, this is a very complex system, so the challenge we had to overcome was to make it accessible to individuals who have very limited control of their own bodies.”


Robot surrogates


While the results of the study demonstrated what the researchers had set out to do, Kemp agrees they can still make improvements. The existing system is slow, and user mistakes can create significant setbacks. Still, he says, “People could use this technology today and really benefit from it.”


The developers will need to make significant reductions in cost and size to make the PR2 commercially viable, Evans says. The studies point the way to a new type of assistive technology, Kemp adds.


“It seems plausible to me based on this study that robotic body surrogates could provide significant benefits to users.”


The National Institute on Disability, Independent Living and Rehabilitation Research, the National Science Foundation, and the Residential Care Facilities for the Elderly of Fulton County funded the work. Willow Garage made the robot.


Kemp is a cofounder, a board member, an equity holder, and the CTO of Hello Robot Inc., which develops products related to this research. This research could affect his personal financial status. Georgia Tech has reviewed and approved the terms of this arrangement in accordance with its conflict of interest policies.


Source: Georgia Tech


The post Robot ‘eyes’ aid people with profound motor impairments appeared first on Futurity.


IT consultant since 1984

Automation software, electronic design, computer vision, artificial intelligence, IoT, cybersecurity, military security technologies, SIGINT.

Facebook: https://www.facebook.com/flaviobernardotti58
Twitter: https://www.twitter.com/Flavio58

Cell:  +39 366 3416556

f.bernardotti@deeplearningitalia.eu

#deeplearning #computervision #embeddedboard #iot #ai

 

