Abstract |
---|
Assistive living has gained increased attention in recent years as the elderly population grows, creating demand for technical solutions that reduce the cost of care. Learning to perform human activities of daily living through assistive technology (especially assistive robots) is becoming increasingly important in areas such as elderly care. This paper proposes an approach to learning to perform human activities using activity recognition based on information obtained from an RGB-D sensor. Key features obtained from clustering and classification of relevant aspects of an activity are used for learning. Existing approaches to activity recognition still have limitations that prevent them from going mainstream. This work is part of a project directed towards transfer learning of human activities to enhance human-robot interaction. The CAD-60 human activity data set is used to test and validate our method. |
Year | DOI | Venue
---|---|---
2017 | 10.1145/3056540.3076197 | PETRA

Keywords | Field | DocType
---|---|---
Activity recognition, Assistive robotics, Ambient assisted living, Feature extraction | Population, Activities of daily living, Activity recognition, Simulation, Computer science, Transfer of learning, Feature extraction, Human–computer interaction, Artificial intelligence, Cluster analysis, Robot, Robotics | Conference

Citations | PageRank | References
---|---|---
2 | 0.38 | 16
Authors |
---|
5 |
Name | Order | Citations | PageRank
---|---|---|---
David Ada Adama | 1 | 6 | 2.13 |
Lotfi A. Zadeh | 2 | 14527 | 3847.07
Caroline S. Langensiepen | 3 | 35 | 14.29 |
Kevin Lee | 4 | 37 | 3.39 |
Pedro Trindade | 5 | 31 | 3.85 |