Online Access
http://arxiv.org/abs/1708.06004

Abstract
We review Boltzmann machines extended for time-series. These models often have a recurrent structure, and back-propagation through time (BPTT) is used to learn their parameters. The per-step computational complexity of BPTT in online learning, however, grows linearly with the length of the preceding time-series (i.e., the learning rule is not local in time), which limits the applicability of BPTT to online learning. We then review dynamic Boltzmann machines (DyBMs), whose learning rule is local in time. The DyBM's learning rule relates to spike-timing dependent plasticity (STDP), which has been postulated and experimentally confirmed for biological neural networks.

Comment: 32 pages. The topics covered in this paper are presented in Part III of the IJCAI-17 tutorial on energy-based machine learning. https://researcher.watson.ibm.com/researcher/view_group.php?id=7834
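The complexity contrast in the abstract can be illustrated with a toy sketch (not from the paper): for a scalar linear recurrence h_t = w*h_{t-1} + x_t, the gradient of h_T with respect to w can be computed either by unrolling the whole history (BPTT, cost growing with T per step) or by a forward eligibility-trace update in the style of real-time recurrent learning, which is local in time and does constant work per step. The function names are illustrative only.

```python
# Toy sketch, not from the paper: per-step cost of online BPTT vs. a
# time-local (eligibility-trace) update for h_t = w*h_{t-1} + x_t, h_0 = 0.

def bptt_grad(w, xs):
    """dh_T/dw by unrolling the whole history: work grows with len(xs)."""
    h = 0.0
    hs = [h]                          # store h_0, h_1, ..., h_T
    for x in xs:
        h = w * h + x
        hs.append(h)
    # dh_T/dw = sum_{t=1}^{T} w^{T-t} * h_{t-1}
    g = 0.0
    for k in range(len(xs)):          # backward pass over the entire past
        g += (w ** (len(xs) - 1 - k)) * hs[k]
    return g

def local_grad(w, xs):
    """Same gradient via the forward recursion e_t = w*e_{t-1} + h_{t-1}.
    Constant work per step: local in time, as in the DyBM's rule."""
    h, e = 0.0, 0.0
    for x in xs:
        e = w * e + h                 # eligibility trace summarizes the past
        h = w * h + x
    return e

xs = [1.0, 0.5, -0.3, 2.0]
w = 0.9
assert abs(bptt_grad(w, xs) - local_grad(w, xs)) < 1e-12
```

Both routines return the same gradient; the difference is that `bptt_grad` must revisit the stored history at every step, while `local_grad` carries a single trace forward.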
Date
2017-08-20

Type
text

Identifier
oai:arXiv.org:1708.06004
http://arxiv.org/abs/1708.06004