Deterministic Nonmonotone Training
Author: Chun-Cheng Peng
Available to order
73.17 €
regular price: 81.30 €
About the book
In this book, we propose novel deterministic RNN training algorithms that adopt a nonmonotone approach. This allows the learning behaviour to deteriorate in some iterations, while the overall learning performance still improves over time. The nonmonotone RNN training methods, which take their theoretical basis from the theory of deterministic nonlinear optimisation, aim to explore the search space more thoroughly and to enhance the convergence behaviour of gradient-based methods. They generate nonmonotone behaviour by incorporating conditions that employ forcing functions, which measure the sufficiency of error reduction, and an adaptive window, whose size is set by locally estimating the morphology of the error surface. The book develops nonmonotone first- and second-order methods and discusses their convergence properties. The proposed algorithms are applied to training RNNs of various sizes and architectures, namely Feed-Forward Time-Delay networks, Elman networks and Nonlinear Autoregressive with Exogenous Inputs (NARX) networks, on symbolic sequence processing problems. Numerical results show that the proposed nonmonotone learning algorithms train these networks more effectively.
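The description does not spell out the book's exact acceptance conditions, so the following is only a minimal sketch of the general idea: a GLL-style nonmonotone Armijo test in Python/NumPy, where a step is accepted if it achieves sufficient error reduction relative to the worst error in a recent window rather than relative to the current error. The function names, the fixed window size and the toy quadratic error surface are illustrative assumptions, not the algorithms developed in the book.

```python
import numpy as np

def nonmonotone_armijo_step(f, grad_f, x, d, history, window=10,
                            c=1e-4, alpha0=1.0, shrink=0.5, max_backtracks=30):
    """One nonmonotone Armijo backtracking step (GLL-style sketch).

    Instead of requiring descent relative to f(x), the step only needs
    sufficient decrease relative to the maximum error over the last
    `window` iterations, so individual iterations may get worse while
    the overall trend improves.
    """
    g = grad_f(x)
    slope = float(g @ d)              # directional derivative; negative for a descent direction
    ref = max(history[-window:])      # reference value taken from the window of past errors
    alpha = alpha0
    for _ in range(max_backtracks):
        x_new = x + alpha * d
        # sufficient-decrease term c * alpha * g^T d plays the role of a forcing function
        if f(x_new) <= ref + c * alpha * slope:
            return x_new, alpha
        alpha *= shrink               # backtrack and try a smaller step
    return x, 0.0                     # no acceptable step found

# toy usage on a quadratic error surface
f = lambda w: float(w @ w)
grad = lambda w: 2.0 * w
w = np.array([3.0, -2.0])
history = [f(w)]
for _ in range(20):
    w, _ = nonmonotone_armijo_step(f, grad, w, -grad(w), history)
    history.append(f(w))
print(history[-1])
```

In the book the window is adaptive, with its size informed by a local estimate of the error-surface morphology; the fixed `window=10` above is only a stand-in for that mechanism.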
- Publisher: LAP LAMBERT Academic Publishing
- Year of publication: 2011
- Format: Paperback
- Dimensions: 220 x 150 mm
- Language: English
- ISBN: 9783846599532