State-Frequency Memory Recurrent Neural Networks

Abstract

Modeling temporal sequences plays a fundamental role in various modern applications and has drawn increasing attention in the machine learning community. Among the efforts to improve the ability to represent temporal data, the Long Short-Term Memory (LSTM) has achieved great success in many areas. Although the LSTM can capture long-range dependencies in the time domain, it does not explicitly model pattern occurrences in the frequency domain, which play an important role in tracking and predicting data points over various time cycles. We propose the State-Frequency Memory (SFM), a novel recurrent architecture that separates dynamic patterns across different frequency components and models their impacts on the temporal contexts of input sequences. By jointly decomposing memorized dynamics into state-frequency components, the SFM offers a fine-grained analysis of temporal sequences, capturing the dependencies of uncovered patterns in both the time and frequency domains. Evaluations on several temporal modeling tasks demonstrate that the SFM yields competitive performance, in particular compared with state-of-the-art LSTM models.
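To make the state-frequency decomposition concrete, the following is a minimal NumPy sketch of one SFM-style cell update. It assumes the form the abstract suggests: a forget gate built as the outer product of a per-state gate and a per-frequency gate, gated input written onto a bank of Fourier components, and an output recombined from the amplitudes of the resulting state-frequency memory matrix. All parameter names (W_ste, omega, and so on) are illustrative choices for this sketch, not the authors' notation.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sfm_step(x_t, h_prev, S_re, S_im, t, p):
    """One SFM-style cell update over a D x K state-frequency memory (sketch)."""
    D, K = S_re.shape

    # Joint forget gate: outer product of a state gate (D,) and a
    # frequency gate (K,), giving one forget factor per memory entry.
    f_ste = sigmoid(p["W_ste"] @ h_prev + p["V_ste"] @ x_t + p["b_ste"])
    f_fre = sigmoid(p["W_fre"] @ h_prev + p["V_fre"] @ x_t + p["b_fre"])
    f = np.outer(f_ste, f_fre)                       # (D, K)

    # LSTM-like input gate and candidate modulation over the D states.
    g = sigmoid(p["W_g"] @ h_prev + p["V_g"] @ x_t + p["b_g"])
    c = np.tanh(p["W_c"] @ h_prev + p["V_c"] @ x_t + p["b_c"])

    # Write the gated input onto K Fourier components; the memory is
    # kept as real/imaginary parts of a complex D x K matrix.
    phase = p["omega"] * t                           # (K,)
    S_re = f * S_re + np.outer(g * c, np.cos(phase))
    S_im = f * S_im + np.outer(g * c, np.sin(phase))

    # Amplitude of each state-frequency component.
    A = np.sqrt(S_re ** 2 + S_im ** 2)               # (D, K)

    # Recombine the K frequency columns into the output, each through
    # its own gated projection.
    h = np.zeros(D)
    for k in range(K):
        o_k = sigmoid(p["U_o"] @ A[:, k] + p["V_o"] @ x_t + p["b_o"])
        h += o_k * np.tanh(p["W_z"] @ A[:, k] + p["b_z"])
    return h, S_re, S_im

# Smoke test with random parameters: N inputs, D states, K frequencies.
rng = np.random.default_rng(0)
N, D, K = 3, 4, 5
def m(*shape):
    return 0.1 * rng.standard_normal(shape)
p = {
    "W_ste": m(D, D), "V_ste": m(D, N), "b_ste": np.zeros(D),
    "W_fre": m(K, D), "V_fre": m(K, N), "b_fre": np.zeros(K),
    "W_g": m(D, D), "V_g": m(D, N), "b_g": np.zeros(D),
    "W_c": m(D, D), "V_c": m(D, N), "b_c": np.zeros(D),
    "U_o": m(D, D), "V_o": m(D, N), "b_o": np.zeros(D),
    "W_z": m(D, D), "b_z": np.zeros(D),
    "omega": 2.0 * np.pi * np.arange(1, K + 1) / K,  # fixed frequency bank
}
h, S_re, S_im = np.zeros(D), np.zeros((D, K)), np.zeros((D, K))
for t, x_t in enumerate(rng.standard_normal((10, N)), start=1):
    h, S_re, S_im = sfm_step(x_t, h, S_re, S_im, t, p)
print(h)

The D x K complex memory (S_re, S_im) is what distinguishes this sketch from an LSTM's single D-dimensional cell state: each hidden state carries K frequency channels that evolve at rates set by omega, which is how the model can track dynamics over different time cycles.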

Publication Date

1-1-2017

Publication Title

34th International Conference on Machine Learning, ICML 2017

Volume

4

Number of Pages

2482-2491

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

Scopus ID

85048428910 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/85048428910
