Speedup of Network Training Process by Eliminating the Overshoots of Outputs
Abstract
Overshoots between the expected and actual outputs during network training slow down training and degrade accuracy. In this paper, an improved training method that eliminates overshoots is proposed on the basis of traditional network training algorithms, and a criterion for detecting and eliminating overshoot is given. Traditional methods take gradient descent as the sole training criterion and neglect the side effects caused by overshoots. Here, the overshoot definition (OD) is combined with gradient descent: according to the overshoot criterion, local linearization and weighted-mean methods are used to adjust the network parameters. A numerical experiment based on the new training strategy verifies the proposed algorithm. The results show that it eliminates overshoots effectively and greatly improves the training performance of the network.
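The abstract does not give the authors' exact update rule, but the idea it describes can be sketched on a minimal one-parameter model: run ordinary gradient descent, detect an overshoot when an update makes the output cross the expected value, and then correct the step using a local linearization of the output combined with a weighted mean. The model `y = w * x`, the 0.5 blending weight, and the function names below are illustrative assumptions, not the paper's method.

```python
def train(x, y_target, w0=0.0, lr=0.5, steps=50):
    """Gradient descent with a hypothetical overshoot correction.

    Fits the toy model out = w * x to y_target under squared loss.
    If a gradient step makes the output cross the expected value
    (an 'overshoot' in the paper's sense), the step is blended with
    a locally linearized step that lands exactly on the target.
    Illustrative sketch only, not the authors' exact algorithm.
    """
    w = w0
    for _ in range(steps):
        err = w * x - y_target          # signed output error
        grad = err * x                  # d/dw of 0.5 * err**2
        w_new = w - lr * grad           # plain gradient-descent step
        err_new = w_new * x - y_target
        if err * err_new < 0:
            # Overshoot: the output crossed the expected value.
            # Local linearization: out(w) = w * x, so the weight that
            # exactly reproduces the target is y_target / x.
            w_lin = y_target / x
            # Weighted mean of the overshooting step and the
            # linearized step (equal weights assumed here).
            w_new = 0.5 * (w_new + w_lin)
        w = w_new
    return w

w = train(x=2.0, y_target=3.0)
print(w * 2.0)  # actual output converges to the expected output 3.0
```

With `lr=0.5` the plain gradient step here always overshoots, so the correction fires on every iteration and the weight converges geometrically toward the target instead of oscillating.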
Origin: Files produced by the author(s)