Title
NaLa-Search: A multimodal, interaction-based architecture for faceted search on linked open data
Abstract
Mobile devices are the technological basis of computational intelligent systems, yet traditional mobile application interfaces tend to rely solely on the touch modality. Such interfaces could improve human-computer interaction by combining diverse interaction modalities, such as vision, speech and touch. Moreover, much of the information on the Web is published under the Linked Data principles so that people and computers can share, use and reuse high-quality information; however, current tools for searching, browsing and visualising this kind of data are not fully developed. The goal of this research is to propose a novel architecture, NaLa-Search, for effectively exploring the Linked Open Data cloud. We present a mobile application that combines voice commands and touch to browse and search this semantic information through faceted search, a widely used interaction scheme for exploratory search that remains faithful to the richness of the data while being practical for real-world use. NaLa-Search was evaluated by real users from the clinical pharmacology domain, who searched and navigated the DrugBank dataset through voice commands. The evaluation results show that faceted search combined with multiple interaction modalities (e.g. speech and touch) can enhance users' interaction with semantic knowledge bases.
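To make the faceted-search idea concrete, the following is a minimal sketch of how user-selected facets might be translated into a SPARQL query against a Linked Open Data endpoint such as DrugBank. The property URIs and facet values below are illustrative assumptions, not the paper's actual vocabulary or implementation.

```python
# Hypothetical sketch: mapping facet selections to SPARQL triple patterns.
# The property URIs and values are made up for illustration; a real faceted
# browser would draw them from the dataset's ontology (e.g. DrugBank's).

def build_faceted_query(facets: dict) -> str:
    """Turn {facet_property_uri: value} selections into a SPARQL SELECT query."""
    patterns = [f'?drug <{prop}> "{value}" .' for prop, value in facets.items()]
    body = "\n  ".join(patterns)
    return "SELECT ?drug WHERE {\n  " + body + "\n}"

query = build_faceted_query({
    "http://example.org/category": "antibiotic",  # hypothetical facet
    "http://example.org/route": "oral",           # hypothetical facet
})
print(query)
```

Each selected facet simply adds one triple-pattern constraint, so refining or removing a facet regenerates the query; a voice command recognised by the application could drive the same facet dictionary that a touch gesture does.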
Year
2021
DOI
10.1177/0165551520930918
Venue
JOURNAL OF INFORMATION SCIENCE
Keywords
Faceted search, linked open data, voice command recognition
DocType
Journal
Volume
47
Issue
6
ISSN
0165-5515
Citations
0
PageRank
0.34
References
0
Authors
5