Opposite transfer functions and backpropagation through time

Mario Ventresca, Hamid R. Tizhoosh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Backpropagation through time is a very popular training algorithm for discrete-time recurrent neural networks. However, the learning process requires substantial computational time to achieve high accuracy. While many approaches have been proposed that alter the learning algorithm, this paper presents a computationally inexpensive method based on the concept of opposite transfer functions to improve learning in the backpropagation through time algorithm. Specifically, we will show an improvement in accuracy and stability, as well as an acceleration in learning. We will utilize three common benchmarks to provide experimental evidence of these improvements.
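The abstract does not reproduce the algorithm itself, but the underlying notion of an opposite transfer function can be illustrated for the logistic sigmoid, whose opposite is obtained by reflecting the input. This is a minimal sketch based on the opposition-based learning literature; the function names below and the strategy for when to swap a neuron's transfer function during training are assumptions, not the authors' specification.

```python
import math

def sigmoid(x):
    """Standard logistic transfer function."""
    return 1.0 / (1.0 + math.exp(-x))

def opposite_sigmoid(x):
    """Opposite of the logistic sigmoid (illustrative definition).

    For the logistic function the reflection of the input coincides
    with the reflection of the output over its range [0, 1]:
        sigmoid(-x) == 1 - sigmoid(x)
    """
    return sigmoid(-x)

# The defining symmetry: a sigmoid and its opposite sum to 1,
# the width of the sigmoid's output range.
for x in (-3.0, -0.5, 0.0, 1.2, 4.0):
    assert abs(sigmoid(x) + opposite_sigmoid(x) - 1.0) < 1e-12
```

During training, a network could tentatively evaluate a neuron with its opposite transfer function and keep whichever variant yields the lower error; since the opposite is a closed-form reflection, this check adds little computational cost, which is consistent with the paper's claim of an inexpensive modification.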

Original language: English (US)
Title of host publication: Proceedings of the 2007 IEEE Symposium on Foundations of Computational Intelligence, FOCI 2007
Pages: 570-577
Number of pages: 8
DOIs
State: Published - 2007
Event: 2007 IEEE Symposium on Foundations of Computational Intelligence, FOCI 2007 - Honolulu, HI, United States
Duration: Apr 1, 2007 – Apr 5, 2007

Publication series

Name: Proceedings of the 2007 IEEE Symposium on Foundations of Computational Intelligence, FOCI 2007

Conference

Conference: 2007 IEEE Symposium on Foundations of Computational Intelligence, FOCI 2007
Country/Territory: United States
City: Honolulu, HI
Period: 4/1/07 – 4/5/07

Keywords

  • Backpropagation through time
  • Opposite transfer functions
  • Opposition-based learning

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • General Mathematics
