Large datasets: a mixed method to adapt and improve their learning by neural networks used in regression contexts
Abstract
The purpose of this work is to further study the relevance of accelerating Monte-Carlo calculations for external gamma-ray radiotherapy through feed-forward neural networks. We previously presented a parallel incremental algorithm that builds neural networks of reduced size while providing high-quality approximations of the dose deposit~\cite{Vecpar08b}. Our parallel algorithm consists of an optimized decomposition of the initial learning dataset (also called the learning domain) into as many subsets as there are available processors. However, although this decomposition produces subsets of similar signal complexity, their sizes may differ considerably, which still implies potential differences in their learning times. This paper presents an efficient data-extraction method that allows good and balanced training without any loss of signal information. As will be shown, the resulting irregular decomposition yields a significant improvement in the learning time of the global network.
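To make the idea of a balanced decomposition concrete, the sketch below is a minimal illustration, not the authors' algorithm: the chunking of a one-dimensional learning domain, the uniform subsampling strategy, and the function name `balanced_subsets` are all assumptions introduced here for illustration only. It shows one simple way to cut a dataset into per-processor subsets of comparable size.

```python
import numpy as np

def balanced_subsets(x, y, n_workers, target_size=None, seed=0):
    """Hypothetical sketch: split a regression dataset into n_workers subsets.

    The learning domain is cut into contiguous chunks along x, then each
    chunk is randomly subsampled (without replacement) to a common target
    size so that every worker receives a similarly sized training set.
    """
    rng = np.random.default_rng(seed)
    order = np.argsort(x)                      # sort sample indices along the domain
    chunks = np.array_split(order, n_workers)  # contiguous pieces of the domain
    if target_size is None:
        target_size = min(len(c) for c in chunks)
    subsets = []
    for c in chunks:
        keep = rng.choice(c, size=min(target_size, len(c)), replace=False)
        subsets.append((x[keep], y[keep]))
    return subsets

# Usage: a synthetic dose-like signal over a 1-D domain, split for 4 workers.
x = np.random.rand(10_000)
y = np.exp(-5 * x) * np.sin(20 * x)            # stand-in for a dose curve
parts = balanced_subsets(x, y, n_workers=4)
print([len(px) for px, _ in parts])            # roughly equal subset sizes
```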