| Abstract |
|---|
| Due to the evolution of motion capture devices, natural user interfaces have been applied in several areas, such as neuromotor rehabilitation supported by virtual environments. This paper presents a smartphone application that lets the user interact with a virtual environment and enables the captured data to be stored, processed, and used in machine learning models. The application submits each recording, together with information about the movement performed, to a remote database so that supervised machine learning can be applied. As a proof of concept, we used the application to capture a dataset of 232 instances divided into 8 movement classes. We then used this dataset to train models that classify these movements. The high accuracy of the models shows the feasibility of using body articulation data for a classification task after some data transformations. |
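The abstract describes classifying movement recordings (232 instances, 8 classes) from body articulation data with supervised learning. The paper does not specify the model or features, so the following is only an illustrative sketch with synthetic stand-in data: a minimal nearest-centroid classifier over flat joint-coordinate vectors, where the per-class centroids, feature count, and noise level are all assumptions, not values from the paper.

```python
import math
import random

random.seed(42)

NUM_CLASSES = 8    # number of movement classes, as in the paper's dataset
NUM_FEATURES = 9   # hypothetical: e.g. 3 tracked joints x (x, y, z)

# Synthetic stand-in for captured articulation data: each movement class
# clusters around its own (randomly chosen) prototype pose vector.
prototypes = {c: [random.uniform(-1, 1) for _ in range(NUM_FEATURES)]
              for c in range(NUM_CLASSES)}

def make_sample(label):
    return [m + random.gauss(0, 0.05) for m in prototypes[label]]

train = [(make_sample(c), c) for c in range(NUM_CLASSES) for _ in range(20)]
test = [(make_sample(c), c) for c in range(NUM_CLASSES) for _ in range(9)]

# "Training": estimate one centroid per class from the training recordings.
centroids = {}
for c in range(NUM_CLASSES):
    vecs = [v for v, label in train if label == c]
    centroids[c] = [sum(col) / len(vecs) for col in zip(*vecs)]

def predict(v):
    # Classify a recording by its nearest class centroid (Euclidean distance).
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda c: dist(v, centroids[c]))

accuracy = sum(predict(v) == label for v, label in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

In practice one would replace the synthetic vectors with the pose landmarks captured by the smartphone application (after the normalization the paper alludes to) and likely use a stronger off-the-shelf classifier, but the pipeline shape (feature vectors, labeled classes, train/evaluate split) stays the same.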
| Year | DOI | Venue |
|---|---|---|
| 2021 | 10.1145/3470482.3479618 | PROCEEDINGS OF THE 27TH BRAZILIAN SYMPOSIUM ON MULTIMEDIA AND THE WEB (WEBMEDIA '21) |

| Keywords | DocType | Citations |
|---|---|---|
| Computer vision, motion capture, augmented reality, supervised machine learning | Conference | 0 |

| PageRank | References | Authors |
|---|---|---|
| 0.34 | 0 | 7 |
Name | Order | Citations | PageRank |
---|---|---|---|
Luis G. S. Rodrigues | 1 | 0 | 0.34 |
Diego Roberto Colombo Dias | 2 | 16 | 6.10 |
Marcelo de Paiva Guimarães | 3 | 40 | 14.89 |
Alexandre Fonseca Brandão | 4 | 0 | 0.34 |
Leonardo C. Dutra da Rocha | 5 | 0 | 0.34 |
Rogério Luiz Iope | 6 | 0 | 0.68 |
José Remo Ferreira Brega | 7 | 23 | 7.91 |