Title
Separating Long-Form Speech with Group-wise Permutation Invariant Training.
Abstract
Multi-talker conversational speech processing has attracted much interest for applications such as meeting transcription. Speech separation is often required to handle the overlapped speech commonly observed in conversations. Although the existing utterance-level permutation invariant training-based continuous speech separation approach has proven effective in various conditions, it lacks the ability to leverage the long-span relationship of utterances and is computationally inefficient due to the highly overlapped sliding windows. To overcome these drawbacks, we propose a novel training scheme named Group-PIT, which allows direct training of speech separation models on long-form speech with a low computational cost for label assignment. Two different speech separation approaches with Group-PIT are explored, including direct long-span speech separation and short-span speech separation with long-span tracking. Experiments on simulated meeting-style data demonstrate the effectiveness of our proposed approaches, especially in dealing with very long speech inputs.
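To make the baseline concrete, the following is a minimal sketch of the utterance-level permutation invariant training (PIT) criterion that the abstract builds on: the loss is computed for every permutation of the reference speakers, and the best (lowest-loss) assignment is kept. This is an illustrative sketch only, not the authors' implementation; Group-PIT changes how labels are assigned over long-form groups of utterances, but the underlying permutation search has this form. The function name and MSE criterion are assumptions for illustration.

```python
import itertools

import numpy as np


def pit_mse_loss(estimates: np.ndarray, references: np.ndarray) -> float:
    """Utterance-level PIT loss (illustrative sketch).

    Tries every permutation of the reference speakers and returns the
    minimum mean-squared error, so training is invariant to the order
    in which the model emits the separated speakers.

    estimates, references: arrays of shape (num_speakers, num_samples).
    """
    num_speakers = estimates.shape[0]
    best = np.inf
    for perm in itertools.permutations(range(num_speakers)):
        # MSE under this particular speaker assignment.
        mse = float(np.mean((estimates - references[list(perm)]) ** 2))
        best = min(best, mse)
    return best
```

Because the permutation search is factorial in the number of speakers, applying it per sliding window (as in the uPIT-based continuous separation baseline) is costly for long recordings, which motivates the cheaper group-level label assignment described above.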
Year: 2022
DOI: 10.21437/Interspeech.2022-10362
Venue: Conference of the International Speech Communication Association (INTERSPEECH)
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 11
Name                Order  Citations  PageRank
Wangyou Zhang       1      1          25.44
Zhuo Chen           2      153        24.33
Naoyuki Kanda       3      0          1.35
Shujie Liu          4      338        37.84
Jinyu Li            5      915        72.84
Sefik Emre Eskimez  6      0          1.69
Takuya Yoshioka     7      585        49.20
Xiong Xiao          8      281        34.97
Zhong Meng          9      33         14.95
Yanmin Qian         10     295        44.44
Furu Wei            11     1956       107.57