Title: Production federated keyword spotting via distillation, filtering, and joint federated-centralized training
Abstract:
We trained a keyword spotting model using federated learning on real user devices and observed significant quality improvements when the model was deployed for on-device inference. To compensate for data domains that are missing from on-device training caches, we employed joint federated-centralized training, and to learn in the absence of curated on-device labels, we formulated a confidence filtering strategy for federated distillation based on user-feedback signals. These techniques created models that significantly improved quality metrics in offline evaluations and user-experience metrics in live A/B experiments.
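The abstract's confidence filtering idea, i.e. keeping only teacher predictions that clear a confidence threshold before using them as distillation targets, can be illustrated with a minimal sketch. The function name, threshold value, and score layout below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of confidence-filtered distillation targets.
# All names and the 0.8 threshold are assumptions for illustration.

def filter_teacher_labels(teacher_scores, threshold=0.8):
    """Keep only examples whose top teacher score clears the threshold.

    teacher_scores: list of per-class score lists from a teacher model.
    Returns (example_index, argmax_class, top_score) triples that a
    student model could use as distillation targets.
    """
    kept = []
    for i, scores in enumerate(teacher_scores):
        top = max(scores)
        if top >= threshold:
            kept.append((i, scores.index(top), top))
    return kept

# Example: three utterances scored by a teacher keyword-spotting model
# over two classes ("keyword", "non-keyword"); the uncertain middle
# example is filtered out.
scores = [[0.95, 0.05], [0.55, 0.45], [0.10, 0.90]]
print(filter_teacher_labels(scores))  # → [(0, 0, 0.95), (2, 1, 0.9)]
```

In a federated setting, a filter like this would run on-device, so that only confidently-labeled examples contribute to the local distillation loss.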
Year: 2022
DOI: 10.21437/INTERSPEECH.2022-11050
Venue: Conference of the International Speech Communication Association (INTERSPEECH)
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 12
Name                   Order  Citations  PageRank
Andrew Hard            1      11         0.90
Kurt Partridge         2      20         3.80
Neng Chen              3      0          0.34
Sean Augenstein        4      0          0.68
Aishanee Shah          5      0          0.34
Hyun Jin Park          6      1          0.68
Alex Park              7      0          0.68
Sara Ng                8      0          1.01
Jessica Nguyen         9      0          0.34
Ignacio Lopez-Moreno   10     187        14.97
Rajiv Mathews          11     12         5.30
Françoise Beaufays     12     6          2.82