Title
SUPERB-SG: Enhanced Speech processing Universal PERformance Benchmark for Semantic and Generative Capabilities
Abstract
Transfer learning has proven to be crucial in advancing the state of speech and natural language processing research in recent years. In speech, a model pre-trained by self-supervised learning transfers remarkably well to multiple tasks. However, the lack of a consistent evaluation methodology limits a holistic understanding of the efficacy of such models. SUPERB was a step towards introducing a common benchmark to evaluate pre-trained models across various speech tasks. In this paper, we introduce SUPERB-SG, a new benchmark focused on evaluating the semantic and generative capabilities of pre-trained models by increasing task diversity and difficulty over SUPERB. We use a lightweight methodology to test the robustness of representations learned by pre-trained models under shifts in data domain and quality across different types of tasks. It entails freezing the pre-trained model's parameters and training only simple task-specific heads. The goal is to be inclusive of all researchers and to encourage efficient use of computational resources. We also show that the task diversity of SUPERB-SG coupled with limited task supervision is an effective recipe for evaluating the generalizability of model representations.
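The evaluation protocol described in the abstract (a frozen pre-trained upstream model with only a lightweight task-specific head being trained) can be pictured with a short sketch. The PyTorch module below is a minimal illustration under that assumption only; the names FrozenUpstreamWithHead and pretrained_encoder are hypothetical placeholders, not the actual SUPERB/s3prl implementation.

# Minimal sketch of the frozen-upstream protocol: the pre-trained model's
# parameters are frozen and only a simple task-specific head is trained.
# The upstream encoder passed in is a hypothetical stand-in.
import torch
import torch.nn as nn

class FrozenUpstreamWithHead(nn.Module):
    def __init__(self, upstream: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.upstream = upstream
        # Freeze every pre-trained parameter; only the head receives gradients.
        for p in self.upstream.parameters():
            p.requires_grad = False
        # Lightweight task-specific head (here: a single linear probe).
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, wav: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():                # upstream stays fixed
            feats = self.upstream(wav)       # assumed shape: (batch, time, feat_dim)
        pooled = feats.mean(dim=1)           # simple mean pooling over time
        return self.head(pooled)             # task logits

# Usage sketch: only the head's parameters are given to the optimizer.
# model = FrozenUpstreamWithHead(pretrained_encoder, feat_dim=768, num_classes=10)
# optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)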
Year
2022
DOI
10.18653/v1/2022.acl-long.580
Venue
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol 1: (Long Papers)
DocType
Conference
Volume
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Citations
2
PageRank
0.35
References
0
Authors
17
Name                    Order  Citations  PageRank
Hsiang-Sheng Tsai       1      2          0.35
Heng-Jui Chang          2      2          0.35
Wen-Chin Huang          3      2          0.35
Zili Huang              4      17         5.47
Kushal Lakhotia         5      15         1.02
Shu-wen Yang            6      17         1.38
Shuyan Dong             7      16         2.08
T. Liu                  8      34         9.67
Cheng-I Jeff Lai        9      15         1.02
Jiatong Shi             10     15         1.02
Xuankai Chang           11     24         4.34
Phil Hall               12     2          0.35
Hsuan-Jui Chen          13     2          0.69
Shang-Wen Li            14     15         2.71
Shinji Watanabe         15     1158       139.38
Abdel-rahman Mohamed    16     3772       266.13
Hung-Yi Lee             17     217        45.30