Title
RadarCat: Radar Categorization for Input & Interaction
Abstract
In RadarCat we present a small, versatile radar-based system for material and object classification which enables new forms of everyday proximate interaction with digital devices. We demonstrate that we can train and classify different types of materials and objects, which we can then recognize in real time. Based on established research designs, we report on the results of three studies: first with 26 materials (including complex composite objects), next with 16 transparent materials (with different thicknesses and varying dyes), and finally with 10 body parts from 6 participants. Both leave-one-out and 10-fold cross-validation demonstrate that our approach of classifying radar signals with a random forest classifier is robust and accurate. We further demonstrate four working examples built on RadarCat, including a physical object dictionary, a painting and photo-editing application, body shortcuts, and automatic refill. We conclude with a discussion of our results and limitations, and outline future directions.
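The evaluation approach named in the abstract, a random forest classifier over radar signal features validated with both leave-one-out and 10-fold cross-validation, can be illustrated with a minimal scikit-learn sketch. The feature dimensionality, synthetic data, and classifier parameters below are placeholder assumptions for illustration only, not details taken from the paper.

```python
# Minimal sketch (not the authors' code) of random forest classification of
# per-capture radar feature vectors, evaluated with 10-fold and leave-one-out
# cross-validation. Data shapes and features are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, LeaveOneOut, StratifiedKFold

rng = np.random.default_rng(0)

# Placeholder data: 26 material classes, 20 captures each, 64 features per
# capture (e.g. summary statistics of the received radar signal per channel).
n_classes, n_per_class, n_features = 26, 20, 64
X = rng.normal(size=(n_classes * n_per_class, n_features))
y = np.repeat(np.arange(n_classes), n_per_class)

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# 10-fold cross-validation accuracy.
kfold_acc = cross_val_score(
    clf, X, y, cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0))
print(f"10-fold accuracy: {kfold_acc.mean():.3f}")

# Leave-one-out cross-validation (one held-out capture per fold).
loo_acc = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {loo_acc.mean():.3f}")
```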
Year
2016
DOI
10.1145/2984511.2984515
Venue
UIST
Keywords
Radar, Context-Aware Interaction, Machine Learning, Material Classification, Object Recognition, Ubiquitous Computing
Field
Computer science, Human–computer interaction, Artificial intelligence, Ubiquitous computing, Random forest, Photo editing, Radar, Categorization, Computer vision, Radar signals, Material classification, Multimedia, Cognitive neuroscience of visual object recognition
DocType
Conference
Citations
20
PageRank
1.34
References
22
Authors
5
Name                   Order   Citations   PageRank
Hui-Shyong Yeo         1       132         14.53
Gergely Flamich        2       20          1.34
Patrick Schrempf       3       20          1.34
David Harris-Birtill   4       28          3.30
A. Quigley             5       846         84.08