LSTM Classification under Changes in Sequences Order

Edgar Ek-Chacón, Erik Molino-Minero-Re

Mexican International Conference on Artificial Intelligence (2020)

Abstract
Recurrent Neural Networks (RNNs) have been widely used for sequence analysis and classification. Generally, a sequence is a set of samples following a specific order, such as a time-based process or a structured dataset. This type of neural network is very efficient at exploring sequence patterns and other relevant features that highlight temporal behavior and dependencies. This is possible because information loops through the different stages of the network, allowing it to remember and track features across different segments of the data. In this work, we explore how an RNN based on Long Short-Term Memory (LSTM) units behaves in a classification problem when the dataset of sequences is organized with different orders and lengths. That is, the same information is presented to the network, but the order of the samples within the sequences and the length of the sequences differ in each experiment. To evaluate this effect, we used five datasets of 28 × 28-pixel grayscale images (MNIST, MNIST-C, notMNIST, FashionMNIST, and Sign Language MNIST). For every experiment, we segmented the images into different sizes and orders and built a set of sequences consisting of pixel vectors organized according to three different rules; in each case, we fixed the sequences to a specific length. The results show that good accuracies can be achieved for different sequence configurations. We considered the 28 × 28 configuration as the baseline for reference and found that, while it generally leads to high accuracies, for some datasets it is not the best one. We believe this study may be useful for video tagging and for general image description.
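As a rough illustration of the kind of setup the abstract describes, the sketch below reshapes MNIST images into sequences of pixel vectors and trains a small LSTM classifier on them. It is not the authors' code: TensorFlow/Keras, the LSTM size, and the reshaping helper are assumptions. The 28 × 28 sequence shape corresponds to the baseline configuration mentioned in the abstract; other shapes such as 14 × 56 would be hypothetical variants, not the paper's exact segmentation rules.

```python
# Minimal sketch (assumptions: TensorFlow/Keras, LSTM width, number of epochs).
# Each 28x28 image is treated as a sequence of `timesteps` pixel vectors of
# length `features`, mimicking the baseline configuration from the abstract.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

def to_sequences(images, timesteps, features):
    # Reshape each flattened image into a sequence of pixel vectors.
    # E.g., (N, 28, 28) -> (N, 14, 56) would be an alternative, hypothetical layout.
    return images.reshape(-1, timesteps, features)

timesteps, features = 28, 28  # baseline sequence configuration

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(timesteps, features)),
    tf.keras.layers.LSTM(128),                         # LSTM width assumed here
    tf.keras.layers.Dense(10, activation="softmax"),   # 10 classes for MNIST
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(to_sequences(x_train, timesteps, features), y_train,
          validation_data=(to_sequences(x_test, timesteps, features), y_test),
          epochs=5, batch_size=128)
```

Changing `timesteps` and `features` (keeping their product at 784) or reordering the pixels before reshaping would reproduce the kind of sequence-order and sequence-length variations the study compares.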
Keywords
Recurrent Neural Network, LSTM network, Sequences order