TRIZ, a Systematic Approach to Create Quantum Activation Function for Deep Learning’s Hidden Layers, in Order to Make AI Explainable with Quantum Computer
Abstract
The Artificial Intelligence (AI) market is growing very fast all over the world, and Deep Learning (DL) technologies require ever more data and processing speed in healthcare, agriculture, automotive, security, and several other industries. However, despite this rapid market expansion, numerous related challenges remain to be tackled. The purpose of this paper is to show, through a structured approach derived from the ARIZ algorithm and principles coined by the TrizStartup movement [1], some of the problems the deep learning industry is trying to handle. The paper then shows how TRIZ can be applied to solve one of these AI problems. Because of the hidden layers of DL neural networks (NN), their outputs are not reliable: if an error occurs, how could a human reproduce the issue? Humans are not able to identify which neurons were activated and therefore cannot analyse the root causes of the malfunction. This reliability issue raised by hidden layers, set against the skyrocketing AI market and its demands for data and processing speed, generates an innovation problem. To resolve it, inventive principle 35 (parameter changes) with the Altshuller Matrix, the little people modelling method, Su-Field analysis, and the seventy-six standard solutions have been used to generate quantum functions. The result described in this paper is a new function called QuantumReLU (QReLU), created on a quantum computer to extend the classical ReLU activation function. It is then possible to fire neurons with the QReLU activation function and to use quantum states to identify activated neurons within the hidden layers. Thus, the TRIZ systematic approach led to switching neural network algorithms from classical to quantum computers, and therefore to building deep learning NNs on a quantum computer based on the new QReLU activation function.
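
The abstract does not give the paper's actual QReLU construction; the sketch below is only a minimal, hypothetical illustration of the stated idea, assuming a classical ReLU output is encoded into a single-qubit state via an RY rotation so that the stored quantum state can later be read out to tell which neurons fired. The function names qrelu_state and fired, the tanh squashing of the activation into a rotation angle, and the numpy-only single-qubit simulation are illustrative assumptions, not the authors' definition.

import numpy as np

def relu(x: float) -> float:
    """Classical ReLU: pass positive pre-activations, zero out the rest."""
    return max(0.0, x)

def qrelu_state(x: float, scale: float = 1.0) -> np.ndarray:
    """Hypothetical QReLU-style encoding (illustrative only).

    The ReLU output is mapped to a rotation angle and encoded as a
    single-qubit state |psi> = RY(theta)|0>, so the amplitude of |1>
    records how strongly the neuron fired.
    """
    a = relu(x)
    theta = np.pi * np.tanh(scale * a)      # squash activation into [0, pi)
    ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                   [np.sin(theta / 2),  np.cos(theta / 2)]])
    return ry @ np.array([1.0, 0.0])        # rotate the |0> state

def fired(state: np.ndarray, threshold: float = 0.5) -> bool:
    """Inspect the stored state: the neuron counts as activated if P(|1>) > threshold."""
    return float(np.abs(state[1]) ** 2) > threshold

if __name__ == "__main__":
    for x in (-1.0, 0.2, 2.0):
        psi = qrelu_state(x)
        print(f"x={x:+.1f}  P(|1>)={abs(psi[1])**2:.3f}  fired={fired(psi)}")

In this toy encoding the tanh keeps the rotation angle bounded, so P(|1>) stays in [0, 1) and a negative pre-activation leaves the qubit in |0>; inspecting the per-neuron states after a forward pass would, under these assumptions, reveal which hidden-layer neurons were activated.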
Domains
Computer Science [cs]
Origin: Files produced by the author(s)