Software Symposium Proceedings
Online ISSN : 2758-8572

Testing-based formal verification for ReLU-based neural networks
Haiyi Liu, Shaoying Liu, Ai Liu

Pages 44-50

Abstract
Neural networks are now widely deployed, so ensuring their reliability has become an active research topic. Verifying them faces two difficulties: first, it is hard to state pre-conditions and post-conditions for a neural network; second, as in traditional software, the number of execution paths explodes. Fortunately, the output range of a neural network is comparatively easy to specify, and once an input is given, the activation order of the neurons in the network is determined. Based on these facts, we propose DeepTBFV, a method for pre-trained neural networks that uses a testing-based formal verification algorithm to derive, from a post-condition, the pre-condition of the network on a specified path. The goal is to verify and explain the behavior of the neural network.
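
The following is a minimal sketch, not the authors' DeepTBFV implementation, illustrating the idea the abstract describes: for a ReLU network, a test input fixes an activation pattern, the network becomes affine on that execution path, and a post-condition on the output can be pushed back into a linear pre-condition on the input together with the path constraints. The toy weights, the 2-2-1 architecture, and the post-condition f(x)[0] >= 0 below are illustrative assumptions.

```python
import numpy as np

# Toy 2-2-1 ReLU network (assumed for illustration): f(x) = W2 * relu(W1 x + b1) + b2
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])
b1 = np.array([0.0, -1.0])
W2 = np.array([[1.0, 1.0]])
b2 = np.array([0.5])

def activation_pattern(x):
    """Which hidden neurons are active (ReLU passes its input) for this test input."""
    return (W1 @ x + b1) > 0

def path_precondition(pattern, threshold):
    """Derive a linear pre-condition on x for the post-condition f(x)[0] >= threshold
    along the execution path fixed by `pattern`.

    With the pattern fixed, the network is affine: f(x) = A x + c, and the path
    itself is a conjunction of sign constraints on the hidden pre-activations."""
    D = np.diag(pattern.astype(float))   # keeps active neurons, zeroes inactive ones
    A = W2 @ D @ W1                      # effective linear map on this path
    c = W2 @ D @ b1 + b2                 # effective offset on this path
    # Path constraints: active neurons require W1[i].x + b1[i] > 0, inactive ones <= 0.
    path = [(W1[i], b1[i], '>' if pattern[i] else '<=') for i in range(len(pattern))]
    # Pre-condition derived from the post-condition: A[0].x + c[0] >= threshold.
    return (A[0], c[0], threshold), path

x_test = np.array([1.0, 1.0])
pattern = activation_pattern(x_test)
(row, offset, thr), path = path_precondition(pattern, threshold=0.0)
print("activation pattern:", pattern)
print(f"pre-condition: {row} . x + {offset:.2f} >= {thr}")
for w, b, sign in path:
    print(f"path constraint: {w} . x + {b:.2f} {sign} 0")
```

In this sketch the test input only selects the path; the derived pre-condition and path constraints then characterize every input that follows the same path and satisfies the post-condition, which is the sense in which testing supports formal verification here.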
© 2023 Author