Self-Attention Based Network For Punctuation Restoration

2018 24th International Conference on Pattern Recognition (ICPR)

Cited 20 | Viewed 13
Abstract
Inserting proper punctuation into Automatic Speech Recognizer (ASR) transcriptions is a challenging and promising task in real-time Spoken Language Translation (SLT). Traditional methods built on the sequence labelling framework are weak at handling joint punctuation. To tackle this problem, we propose a novel self-attention based network: a lightweight neural net that extracts hidden features based solely on self-attention, without any Recurrent Neural Nets (RNN) or Convolutional Neural Nets (CNN). We conduct extensive experiments on complex punctuation tasks. The experimental results show that the proposed model achieves significant improvements on the joint punctuation task while also outperforming traditional methods on the simple punctuation task.
Keywords
Automatic Speech Recognizer transcription, real-time Spoken Language Translation, self-attention based network, feature extraction, lightweight neural net, sequence labelling framework, proper punctuation, punctuation restoration
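The abstract frames punctuation restoration as per-token sequence labelling over a purely self-attention based feature extractor. The paper's own code is not included here; the following is a minimal sketch of that general idea using PyTorch's built-in Transformer encoder. The class name, label set, and all hyperparameters are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (not the authors' code): a self-attention-only tagger that
# labels each word of an unpunctuated ASR transcript with the punctuation
# mark that should follow it. All names and sizes are assumptions.
import math
import torch
import torch.nn as nn

PUNCT_LABELS = ["O", ",", ".", "?"]  # assumed label set: none, comma, period, question mark

class SelfAttentionPunctuator(nn.Module):
    def __init__(self, vocab_size, d_model=256, n_heads=4, n_layers=4, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Fixed sinusoidal positional encodings: with no RNN or CNN,
        # word-order information must be injected explicitly.
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, len(PUNCT_LABELS))

    def forward(self, token_ids, pad_mask=None):
        # token_ids: (batch, seq_len) word indices of the unpunctuated transcript
        x = self.embed(token_ids) + self.pe[: token_ids.size(1)]
        h = self.encoder(x, src_key_padding_mask=pad_mask)  # self-attention features
        return self.classifier(h)  # (batch, seq_len, num_labels) per-token logits

# Usage: predict one punctuation label per input token.
model = SelfAttentionPunctuator(vocab_size=10000)
tokens = torch.randint(0, 10000, (1, 8))
print(model(tokens).argmax(-1))  # label indices, shape (1, 8)
```

Note that the "joint punctuation" setting the abstract highlights only changes the label inventory (e.g. allowing combined marks as extra classes); the self-attention backbone above is unchanged.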