Title
[email protected]: An Interactive Multimodal Lifelog Retrieval with Query-to-Sample Attention-based Search Engine
Abstract
In this paper, we introduce an interactive multimodal lifelog retrieval system whose search engine is built using the attention mechanism. The algorithm on which the system relies is constructed from two observations: (1) most of the images belonging to one event probably contain cues (e.g., objects) that relate to the content of queries, and these cues contribute to the representation of the event; and (2) instances of one event can be associated with the content and context of that event. Hence, once we determine the seed (by leveraging the first observation), we can find all relevant instances (by utilizing the second observation). We also take advantage of querying by samples (e.g., images) by converting the text query to images using the attention-based mechanism. Thus, we can enrich the simple text query of users with more semantic meaning, towards having more accurate results as well as discovering hidden results that cannot be reached by using only text queries. The system is designed for both novice and expert users, with several filters that help users express their queries from general to particular descriptions and to polish their results.
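The seed-then-expand idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes precomputed image embeddings (standing in for the attention-derived query-to-sample representations), a known event id for each lifelog image, and cosine similarity for scoring; the function names (`cosine`, `retrieve`) and the toy data are hypothetical.

```python
# Minimal sketch of the seed-and-expand retrieval idea (assumptions:
# precomputed image embeddings, one event id per image, cosine scoring).
import numpy as np

def cosine(a, b):
    # cosine similarity between two embedding vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def retrieve(query_vec, image_vecs, event_ids, top_k=5):
    """Observation 1: pick seed images whose cues best match the query.
    Observation 2: expand each seed to every image of the same event."""
    scores = [cosine(query_vec, v) for v in image_vecs]
    seeds = np.argsort(scores)[::-1][:top_k]            # seed images
    hit_events = {event_ids[i] for i in seeds}           # events of the seeds
    # return all instances of the matched events, best-scored first
    hits = [i for i, e in enumerate(event_ids) if e in hit_events]
    return sorted(hits, key=lambda i: scores[i], reverse=True)

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
imgs = rng.normal(size=(10, 8))
events = [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
query = imgs[4] + 0.1 * rng.normal(size=8)   # a query resembling image 4
print(retrieve(query, imgs, events))          # returns image 4's event plus other seeds' events
```

In this sketch the expansion step is what surfaces the "hidden" results mentioned in the abstract: images of a matched event are returned even when their own similarity to the query is low.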
Year: 2020
DOI: 10.1145/3379172.3391722
Venue: ICMR '20: International Conference on Multimedia Retrieval, Dublin, Ireland, June 2020
DocType: Conference
ISBN: 978-1-4503-7136-0
Citations: 0
PageRank: 0.34
References: 0
Authors: 6
Name | Order | Citations | PageRank
Anh-Vu Mai-Nguyen | 1 | 1 | 0.69
Trong-Dat Phan | 2 | 1 | 2.38
Anh-Khoa Vo | 3 | 1 | 1.70
Van-Luon Tran | 4 | 1 | 0.69
Minh-Son Dao | 5 | 93 | 21.42
Koji Zettsu | 6 | 212 | 39.07