%0 Conference Proceedings
%T Data Fine-Pruning: A Simple Way to Accelerate Neural Network Training
%+ University of Warwick [Coventry]
%+ Shenzhen University
%A Li, Junyu
%A He, Ligang
%A Ren, Shenyuan
%A Mao, Rui
%< peer-reviewed
%( Lecture Notes in Computer Science
%B 15th IFIP International Conference on Network and Parallel Computing (NPC)
%C Muroran, Japan
%Y Feng Zhang
%Y Jidong Zhai
%Y Marc Snir
%Y Hai Jin
%Y Hironori Kasahara
%Y Mateo Valero
%I Springer International Publishing
%3 Network and Parallel Computing
%V LNCS-11276
%P 114-125
%8 2018-11-29
%D 2018
%R 10.1007/978-3-030-05677-3_10
%K Deep Neural Network
%K Data pruning
%K SGD
%K Acceleration
%Z Computer Science [cs]
%Z Conference papers
%X Training is the most time-consuming stage before a neural network can be deployed to applications. In this paper, we investigate the loss trends of the training data during the training process. We find that, given a fixed set of hyper-parameters, pruning specific types of training data can reduce the time consumed by training while maintaining the accuracy of the neural network. We develop a data fine-pruning approach that monitors and analyses the loss trends of training instances in real time and, based on the analysis, temporarily prunes specific instances during training. Furthermore, we formulate the reduction in training time achieved by applying our data fine-pruning approach. Extensive experiments with different neural networks are conducted to verify the effectiveness of our method. The experimental results show that applying the data fine-pruning approach reduces training time by around 14.29% while maintaining the accuracy of the neural network.
%G English
%Z TC 10
%Z WG 10.3
%2 https://inria.hal.science/hal-02279554/document
%2 https://inria.hal.science/hal-02279554/file/477597_1_En_10_Chapter.pdf
%L hal-02279554
%U https://inria.hal.science/hal-02279554
%~ IFIP-LNCS
%~ IFIP
%~ IFIP-TC
%~ IFIP-TC10
%~ IFIP-NPC
%~ IFIP-WG10-3
%~ IFIP-LNCS-11276
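
To make the abstract's method concrete, below is a minimal sketch in PyTorch-style Python of loss-trend-based temporary data pruning. This is not the authors' code: the pruning criterion (skipping instances whose recent losses all fall below a threshold), the WINDOW, COOLDOWN, and THRESHOLD parameters, and all helper names (IndexedDataset, train_epoch) are illustrative assumptions; the abstract states only that per-instance losses are monitored in real time and specific instances are temporarily pruned during training.

import torch
import torch.nn as nn
from collections import defaultdict, deque
from torch.utils.data import Dataset

WINDOW = 3        # assumption: epochs of loss history kept per instance
COOLDOWN = 2      # assumption: a pruned instance re-enters training this many epochs later
THRESHOLD = 0.05  # assumption: loss level below which an instance counts as well-learned

class IndexedDataset(Dataset):
    """Wraps a dataset so every item also carries its global index,
    letting the training loop track per-instance losses."""
    def __init__(self, base):
        self.base = base
    def __len__(self):
        return len(self.base)
    def __getitem__(self, i):
        x, y = self.base[i]
        return i, x, y

loss_history = defaultdict(lambda: deque(maxlen=WINDOW))
pruned_until = defaultdict(int)  # instance id -> epoch at which it re-enters training

def train_epoch(model, optimizer, loader, epoch):
    criterion = nn.CrossEntropyLoss(reduction="none")  # one loss per instance
    for idx, x, y in loader:
        # Drop instances that are currently pruned from this mini-batch.
        keep = torch.tensor([pruned_until[int(i)] <= epoch for i in idx])
        if not keep.any():
            continue
        idx, x, y = idx[keep], x[keep], y[keep]
        optimizer.zero_grad()
        losses = criterion(model(x), y)
        losses.mean().backward()
        optimizer.step()
        # Update each instance's loss trend; temporarily prune instances
        # whose recent losses all fall below the threshold.
        for i, l in zip(idx.tolist(), losses.tolist()):
            loss_history[i].append(l)
            if len(loss_history[i]) == WINDOW and max(loss_history[i]) < THRESHOLD:
                pruned_until[i] = epoch + COOLDOWN

In use, one would wrap the training set in IndexedDataset, build a DataLoader over it, and call train_epoch once per epoch; the skipped forward and backward passes on temporarily pruned instances are where a time saving of the kind reported in the abstract would come from.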