QIM: Quantifying Hyperparameter Importance for Deep Learning - Network and Parallel Computing (NPC 2016)
Conference paper, 2016


Abstract

Recently, Deep Learning (DL) has attracted intense interest because of its breakthroughs in many areas, such as image processing and face identification. The performance of DL models critically depends on hyperparameter settings. However, existing approaches that quantify the importance of these hyperparameters are time-consuming. In this paper, we propose a fast approach to quantify the importance of DL hyperparameters, called QIM. It leverages the Plackett-Burman design to collect as little data as possible while still correctly quantifying hyperparameter importance. We conducted experiments on the popular deep learning framework Caffe with different datasets to evaluate QIM. The results show that QIM can rank the importance of the DL hyperparameters correctly at very low cost.
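The abstract's key idea, screening many hyperparameters with few runs via a Plackett-Burman design, can be illustrated with a small sketch. This is not the authors' code: the 8-run design below is the standard textbook construction, and the hyperparameter names and the toy "accuracy" function are assumptions standing in for real training runs.

```python
# Illustrative sketch of Plackett-Burman screening for hyperparameter
# importance. In QIM each design row would correspond to one real
# training run (e.g. in Caffe); here a toy response function is used.

def plackett_burman_8():
    """Classic 8-run Plackett-Burman design for up to 7 two-level factors:
    cyclic shifts of a generator row, plus a final all-low run."""
    gen = [+1, +1, +1, -1, +1, -1, -1]
    rows = [gen[-i:] + gen[:-i] for i in range(7)]
    rows.append([-1] * 7)
    return rows

def main_effects(design, responses, n_factors):
    """Main effect of factor j = mean(response | level +1) - mean(response | level -1)."""
    effects = []
    for j in range(n_factors):
        hi = [r for row, r in zip(design, responses) if row[j] == +1]
        lo = [r for row, r in zip(design, responses) if row[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Three hypothetical DL hyperparameters, each at a "low"/"high" level.
levels = {
    "learning_rate": (0.001, 0.1),
    "batch_size":    (32, 256),
    "momentum":      (0.5, 0.9),
}
names = list(levels)
design = plackett_burman_8()

def toy_accuracy(cfg):
    # Stand-in for a real training run; made-up linear response.
    return 0.9 - 3.0 * cfg["learning_rate"] - 0.0001 * cfg["batch_size"]

responses = []
for row in design:
    cfg = {n: levels[n][1] if row[j] == +1 else levels[n][0]
           for j, n in enumerate(names)}
    responses.append(toy_accuracy(cfg))

effects = main_effects(design, responses, len(names))
# Rank hyperparameters by absolute main effect: larger |effect| = more important.
ranking = sorted(zip(names, map(abs, effects)), key=lambda t: -t[1])
print(ranking)
```

Because the design's columns are orthogonal and balanced, each main effect can be estimated independently from only 8 runs, instead of the 2^k runs a full factorial over k hyperparameters would need.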

Dates and versions

hal-01648007, version 1 (24-11-2017)

Cite

Dan Jia, Rui Wang, Chengzhong Xu, Zhibin Yu. QIM: Quantifying Hyperparameter Importance for Deep Learning. 13th IFIP International Conference on Network and Parallel Computing (NPC), Oct 2016, Xi'an, China. pp.180-188, ⟨10.1007/978-3-319-47099-3_15⟩. ⟨hal-01648007⟩