Gated Complex Recurrent Neural Networks

In Proceedings of the Conference on Neural Information Processing Systems (NeurIPS), 2018

Abstract

Complex numbers have long been favoured for digital signal processing, yet complex representations rarely appear in deep learning architectures. RNNs, widely used to process time series and sequence information, could greatly benefit from complex representations. We present a novel complex gated recurrent cell. When used together with norm-preserving state transition matrices, our complex gated RNN exhibits excellent stability and convergence properties. We demonstrate competitive performance of our complex gated RNN on the synthetic memory and adding tasks, as well as on the real-world task of human motion prediction.
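
The abstract names two ingredients: a gated complex recurrent cell and norm-preserving state transition matrices. The following NumPy sketch illustrates how such a cell could be assembled; the modReLU activation, the sigmoid-style gate on the complex pre-activation, the Cayley-transform parameterisation of the unitary matrix, and all function and parameter names here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def mod_sigmoid(z, alpha=1.0, beta=1.0):
    """Real-valued gate in (0, 1) computed from a complex pre-activation
    via a weighted sum of its real and imaginary parts (one common
    choice; the paper's gate formulation may differ)."""
    return 1.0 / (1.0 + np.exp(-(alpha * z.real + beta * z.imag)))

def mod_relu(z, bias):
    """modReLU: shrink the modulus by a learned bias, keep the phase."""
    mag = np.abs(z)
    return (np.maximum(mag + bias, 0.0) / (mag + 1e-8)) * z

def unitary_from_skew(A):
    """Norm-preserving (unitary) state transition matrix, obtained by
    the Cayley transform of the skew-Hermitian matrix S = A - A^H."""
    n = A.shape[0]
    S = A - A.conj().T                    # skew-Hermitian
    I = np.eye(n)
    return np.linalg.solve(I + S, I - S)  # (I + S)^{-1}(I - S) is unitary

def gated_complex_step(h, x, params):
    """One step of a hypothetical gated complex recurrent cell: a
    real-valued update gate blends the previous state with a candidate
    produced by a unitary recurrence and a modReLU nonlinearity."""
    W, V, Wg, Vg, b, bg = params
    U = unitary_from_skew(W)               # norm-preserving recurrence
    g = mod_sigmoid(Wg @ h + Vg @ x + bg)  # update gate, real in (0, 1)
    c = mod_relu(U @ h + V @ x, b)         # complex candidate state
    return g * c + (1.0 - g) * h
```

Because every eigenvalue of the unitary matrix lies on the unit circle, repeated application of the recurrence neither explodes nor vanishes the hidden-state norm, which is the stability property the abstract refers to.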

Bibtex

@INPROCEEDINGS{wolter-2018-nips,
     author = {Wolter, Moritz and Yao, Angela},
      title = {Gated Complex Recurrent Neural Networks},
  booktitle = {Conference on Neural Information Processing Systems},
       year = {2018},
   abstract = {Complex numbers have long been favoured for digital signal processing, yet complex representations
               rarely appear in deep learning architectures. RNNs, widely used to process time series and sequence
               information, could greatly benefit from complex representations. We present a novel complex gated
               recurrent cell. When used together with norm-preserving state transition matrices, our complex gated
               RNN exhibits excellent stability and convergence properties. We demonstrate competitive performance
               of our complex gated RNN on the synthetic memory and adding tasks, as well as on the real-world task
               of human motion prediction.}
}