[4Yin2-23] Extracting the Equation Using Knowledge Distillation of a Neural Network
Keywords: Deep Learning, Knowledge Distillation
Deep learning uses multilayer neural networks with large numbers of parameters to model complex, high-dimensional problems that are difficult for conventional techniques. Conventional equations such as physical laws, on the other hand, capture complex real-world phenomena with simple expressions described by only a few parameters. This resembles the knowledge distillation framework of deep learning, in which a new model is trained to mimic a previously trained model. Based on this idea, this paper investigates whether what a multilayer neural network has learned can be distilled into a smaller model with a small number of parameters, whose structure resembles that of an existing equation, using data sets generated from equations such as physical laws.
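The idea can be illustrated with a minimal sketch, not taken from the paper: the teacher below is a stand-in for a trained network (here a hand-written function for the free-fall law plus a small perturbation), and the "student" is a tiny model whose structure matches the target equation, fitted by least squares to the teacher's outputs rather than to raw data. The function names and the chosen law are illustrative assumptions.

```python
import numpy as np

# Hypothetical teacher: stands in for a trained multilayer network that has
# implicitly learned a physical law, here free fall y = 0.5 * g * t^2,
# with a small perturbation representing model imperfection.
def teacher(t):
    return 0.5 * 9.8 * t**2 + 0.01 * np.sin(5 * t)

# Query the teacher on a grid of inputs (its "soft targets").
t = np.linspace(0.1, 2.0, 200)
y_teacher = teacher(t)

# Student: a one-parameter model with the structure of the equation,
# y = a * t^2. Distillation here reduces to least squares against the
# teacher's outputs: a = sum(y * t^2) / sum(t^4).
a = np.sum(y_teacher * t**2) / np.sum(t**4)

# The distilled parameter should recover 0.5 * g = 4.9.
print(f"distilled a = {a:.3f}")
```

The point of the sketch is that the student never sees ground-truth data, only the teacher's predictions, yet the small equation-shaped model recovers an interpretable parameter.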