Title
Investigation of Japanese PnG BERT Language Model in Text-to-Speech Synthesis for Pitch Accent Language
Abstract
End-to-end text-to-speech synthesis (TTS) can generate highly natural synthetic speech from raw text. However, rendering correct pitch accents remains a challenging problem for end-to-end TTS. To tackle the challenge of rendering correct pitch accents in Japanese end-to-end TTS, we adopt PnG BERT, a self-supervised model pretrained in the character and phoneme domains for TTS. We investigate the effects of the features captured by PnG BERT on Japanese TTS by modifying the fine-tuning condition to determine which conditions are helpful for inferring pitch accents. We shift the content of the PnG BERT features from text-oriented to speech-oriented by changing the number of layers fine-tuned during TTS training. In addition, we teach PnG BERT pitch accent information by fine-tuning it with tone prediction as an additional downstream task. Our experimental results show that the features captured by PnG BERT during pretraining contain information helpful for inferring pitch accents, and that PnG BERT outperforms a baseline Tacotron on accent correctness in a listening test.
Year
2022
DOI
10.1109/JSTSP.2022.3190672
Venue
IEEE Journal of Selected Topics in Signal Processing
Keywords
PnG BERT, text-to-speech, Japanese, pitch accent, self-supervised learning
DocType
Journal
Volume
16
Issue
6
ISSN
1932-4553
Citations
0
PageRank
0.34
References
9
Authors
2
Name            Order   Citations   PageRank
Yusuke Yasuda   1       0           0.34
Tomoki Toda     2       1874        167.18