PUBLICATION
Many-Speakers Single Channel Speech Separation with Optimal Permutation Training
Interspeech
August 29, 2021
By: Shaked Dovrat, Eliya Nachmani, Lior Wolf
Abstract
Single channel speech separation has experienced great progress in the last few years. However, training neural speech separation for a large number of speakers (e.g., more than 10) is out of reach for current methods, which rely on Permutation Invariant Training (PIT). In this work, we present a permutation invariant training that employs the Hungarian algorithm in order to train with an O(C³) time complexity, where C is the number of speakers, compared to the O(C!) complexity of PIT-based methods. Furthermore, we present a modified architecture that can handle the increased number of speakers. Our approach separates up to 20 speakers and improves the previous results for large C by a wide margin.
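The complexity gain in the abstract can be illustrated with a short sketch, not the authors' released code: standard PIT scores all C! speaker permutations, whereas the Hungarian algorithm solves the same assignment in O(C³). The sketch below assumes PyTorch tensors of shape (batch, C, time) and negative SI-SNR as the pairwise cost (the paper's exact loss and model are not reproduced); SciPy's linear_sum_assignment supplies the Hungarian solver, and the function names are illustrative.

```python
# Minimal sketch of optimal-permutation training with the Hungarian
# algorithm, assuming PyTorch and SI-SNR as the pairwise objective.
import torch
from scipy.optimize import linear_sum_assignment

def si_snr(est, ref, eps=1e-8):
    """Scale-invariant SNR between two 1-D signals, in dB."""
    est, ref = est - est.mean(), ref - ref.mean()
    proj = (torch.dot(est, ref) / (ref.pow(2).sum() + eps)) * ref
    noise = est - proj
    return 10 * torch.log10(proj.pow(2).sum() / (noise.pow(2).sum() + eps))

def hungarian_permutation_loss(est, ref):
    """Loss under the best speaker permutation, found in O(C^3) per
    example instead of enumerating all C! permutations as in PIT.

    est, ref: tensors of shape (batch, C, time).
    """
    batch, C, _ = est.shape
    total = 0.0
    for b in range(batch):
        # C x C cost matrix: cost[i, j] = -SI-SNR(estimate i, target j).
        cost = torch.stack([
            torch.stack([-si_snr(est[b, i], ref[b, j]) for j in range(C)])
            for i in range(C)
        ])
        # Hungarian assignment runs on the detached costs; gradients
        # still flow through the selected entries of `cost`.
        rows, cols = linear_sum_assignment(cost.detach().cpu().numpy())
        total = total + cost[torch.as_tensor(rows), torch.as_tensor(cols)].sum()
    return total / (batch * C)
```

For C = 20 the difference is stark: exhaustive PIT would score 20! ≈ 2.4 × 10¹⁸ permutations per example, while the assignment above costs on the order of 20³ = 8,000 operations once the cost matrix is built.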
Areas
Artificial Intelligence
Natural Language Processing & Speech