%0 Conference Proceedings
%T On the Invariance of the SELU Activation Function on Algorithm and Hyperparameter Selection in Neural Network Recommenders
%+ University of the Aegean
%A Sakketou, Flora
%A Ampazis, Nicholas
%Z Part 13: Recommendation Systems
%< peer-reviewed
%( IFIP Advances in Information and Communication Technology
%B 15th IFIP International Conference on Artificial Intelligence Applications and Innovations (AIAI)
%C Hersonissos, Greece
%Y John MacIntyre
%Y Ilias Maglogiannis
%Y Lazaros Iliadis
%Y Elias Pimenidis
%I Springer International Publishing
%3 Artificial Intelligence Applications and Innovations
%V AICT-559
%P 673-685
%8 2019-05-24
%D 2019
%R 10.1007/978-3-030-19823-7_56
%K Recommender systems
%K Neural networks
%K Activation functions
%Z Computer Science [cs]
%X In a number of recent studies, the Scaled Exponential Linear Unit (SELU) activation function has been shown to automatically regularize network parameters and to make learning robust due to its self-normalizing properties. In this paper we explore the use of SELU in training different neural network architectures for recommender systems and validate that it indeed outperforms other activation functions for these types of problems. More interestingly, however, we show that SELU also exhibits performance invariance with regard to the selection of the optimization algorithm and its corresponding hyperparameters. This is clearly demonstrated by experiments involving several activation functions and optimization algorithms for training different neural network architectures on standard recommender systems benchmark datasets.
%G English
%Z TC 12
%Z WG 12.5
%2 https://inria.hal.science/hal-02331305/document
%2 https://inria.hal.science/hal-02331305/file/483292_1_En_56_Chapter.pdf
%L hal-02331305
%U https://inria.hal.science/hal-02331305
%~ IFIP
%~ IFIP-AICT
%~ IFIP-TC
%~ IFIP-WG
%~ IFIP-TC12
%~ IFIP-AIAI
%~ IFIP-WG12-5
%~ IFIP-AICT-559
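For context on the abstract above, a minimal NumPy sketch of the standard SELU definition is included here; the constants and functional form come from Klambauer et al. (2017), not from this record, and the snippet is only an illustration of the activation the paper studies.

import numpy as np

# Standard SELU constants from Klambauer et al. (2017); not taken from this record.
SELU_ALPHA = 1.6732632423543772
SELU_LAMBDA = 1.0507009873554805

def selu(x: np.ndarray) -> np.ndarray:
    # lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise;
    # this scaling drives activations toward zero mean and unit variance
    # (the self-normalizing property referred to in the abstract).
    return SELU_LAMBDA * np.where(x > 0.0, x, SELU_ALPHA * np.expm1(x))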