
Supervised Sequence Labelling with Recurrent Neural Networks

Studies in Computational Intelligence (2012)

University of Toronto

Cited 32881 | Views 1
Abstract
Supervised sequence labelling is a vital area of machine learning, encompassing tasks such as speech, handwriting and gesture recognition, protein secondary structure prediction and part-of-speech tagging.
Keywords
Support Vector Machines, Subcellular Localization, Machine Learning
Related Papers

Local Feature Based Online Mode Detection with Recurrent Neural Networks

2012 International Conference on Frontiers in Handwriting Recognition

Cited 32

Stabilize Sequence Learning with Recurrent Neural Networks by Forced Alignment

2013 12th International Conference on Document Analysis and Recognition

Cited 10

Temporal and Situational Context Modeling for Improved Dominance Recognition in Meetings

13th Annual Conference of the International Speech Communication Association (INTERSPEECH 2012), Vols 1-3

Cited 24

Multi-View Facial Action Unit Detection Based on CNN and BLSTM-RNN

IEEE International Conference on Automatic Face & Gesture Recognition 2017

Cited 52

Multi-Granularity Neural Sentence Model for Measuring Short Text Similarity

International Conference on Database Systems for Advanced Applications 2017

Cited 18

Review of Deep Learning-Based Semantic Segmentation

Laser & Optoelectronics Progress 2019

Cited 22

Efficiently Solving the Practical Vehicle Routing Problem: A Novel Joint Learning Approach

KDD '20: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2020

Cited 100

A Fault Diagnosis Method for Flexible Converter Valve Equipment Based on DSC-BIGRU-MA

Jianbao Guo, Hang Liu, Lei Feng, Lifeng Zu
Frontiers in Energy Research, 2024

Cited 0

Chat Paper

Key points: This paper explores supervised sequence labelling with recurrent neural networks (RNNs) and demonstrates their strong performance across multiple tasks.

Method: The authors present an RNN-based supervised learning framework, adapting the network architecture to suit different sequence labelling tasks.

Experiments: The authors run experiments on several standard datasets, covering speech recognition, handwriting recognition, gesture recognition and protein secondary structure prediction; the results demonstrate the effectiveness of the proposed approach.
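The core idea summarised above — an RNN that emits one label per timestep of an input sequence — can be sketched in a few lines. This is a minimal illustration with assumed toy dimensions and random, untrained weights; it shows only the forward pass of a simple Elman-style RNN, not the book's specific architectures (LSTM, bidirectional networks, or CTC).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions for illustration only)
T, n_in, n_hid, n_labels = 5, 3, 4, 2

# Random, untrained weights; a real labeller would learn these
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))
W_hy = rng.normal(scale=0.1, size=(n_labels, n_hid))

def label_sequence(x):
    """Return one label per timestep for an input sequence x of shape (T, n_in)."""
    h = np.zeros(n_hid)
    labels = []
    for t in range(len(x)):
        h = np.tanh(W_xh @ x[t] + W_hh @ h)   # recurrent hidden state
        scores = W_hy @ h                     # per-timestep label scores
        labels.append(int(np.argmax(scores))) # pick the highest-scoring label
    return labels

x = rng.normal(size=(T, n_in))
y = label_sequence(x)   # one label per timestep
```

Because the hidden state `h` carries information forward through time, the label at each timestep can depend on the whole input seen so far — the property that makes RNNs suited to the sequence labelling tasks listed in the abstract.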