%0 Conference Proceedings
%T QIM: Quantifying Hyperparameter Importance for Deep Learning
%+ Beihang University (BUAA)
%+ Shenzhen Institute of Advanced Technology [Shenzhen] (SIAT)
%A Jia, Dan
%A Wang, Rui
%A Xu, Chengzhong
%A Yu, Zhibin
%Z Part 5: Data Processing and Big Data
%< peer-reviewed
%( Lecture Notes in Computer Science
%B 13th IFIP International Conference on Network and Parallel Computing (NPC)
%C Xi'an, China
%Y Guang R. Gao
%Y Depei Qian
%Y Xinbo Gao
%Y Barbara Chapman
%Y Wenguang Chen
%I Springer International Publishing
%3 Network and Parallel Computing
%V LNCS-9966
%P 180-188
%8 2016-10-28
%D 2016
%R 10.1007/978-3-319-47099-3_15
%K Deep learning
%K Plackett-Burman design
%K Hyperparameter
%Z Computer Science [cs]
%Z Conference papers
%X Recently, Deep Learning (DL) has attracted considerable attention because it has achieved breakthroughs in many areas such as image processing and face identification. The performance of DL models critically depends on hyperparameter settings. However, existing approaches that quantify the importance of these hyperparameters are time-consuming. In this paper, we propose a fast approach to quantify the importance of DL hyperparameters, called QIM. It leverages Plackett-Burman design to collect as little data as possible while still correctly quantifying hyperparameter importance. We conducted experiments on the popular deep learning framework – Caffe – with different datasets to evaluate QIM. The results show that QIM can rank the importance of the DL hyperparameters correctly at very low cost.
%G English
%Z TC 10
%Z WG 10.3
%2 https://inria.hal.science/hal-01648007/document
%2 https://inria.hal.science/hal-01648007/file/432484_1_En_15_Chapter.pdf
%L hal-01648007
%U https://inria.hal.science/hal-01648007
%~ IFIP-LNCS
%~ IFIP
%~ IFIP-TC
%~ IFIP-TC10
%~ IFIP-NPC
%~ IFIP-WG10-3
%~ IFIP-LNCS-9966