Neural Network Surgery: Injecting Data Patterns into Pre-trained Models with Minimal Instance-wise Side Effects

Abstract

Side effects during neural network tuning are typically measured by overall accuracy changes. However, we find that even when overall accuracy is similar, existing tuning methods induce non-negligible instance-wise side effects. Motivated by neuroscientific evidence and theoretical results, we demonstrate that side effects can be controlled by the number of changed parameters, and we thus propose to conduct neural network surgery by modifying only a limited number of parameters. Neural network surgery can be realized with diverse techniques, and we investigate three lines of methods. Experimental results on representative tuning problems validate the effectiveness of the surgery approach. Among them, the dynamic selecting method achieves the best overall performance: it not only satisfies the tuning goal but also induces fewer instance-wise side effects while changing only 10^-5 of the parameters.
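The core idea of restricting a tuning step to a small set of parameters can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's actual method: here the parameters to change are selected by gradient magnitude (one simple criterion), and the function `surgery_step` and all values are invented for illustration.

```python
def surgery_step(params, grads, lr=0.1, k=2):
    """Apply a gradient step to only the k parameters with the
    largest gradient magnitude; all other parameters are untouched."""
    # rank parameter indices by |gradient|, keep the top k
    top = sorted(range(len(grads)), key=lambda i: abs(grads[i]), reverse=True)[:k]
    top = set(top)
    return [p - lr * g if i in top else p
            for i, (p, g) in enumerate(zip(params, grads))]

params = [1.0, 2.0, 3.0, 4.0]
grads = [0.5, -0.01, 0.02, -0.8]
updated = surgery_step(params, grads, lr=0.1, k=2)
# only the parameters at indices 0 and 3 are modified
```

In a real model, the same masking idea would be applied to the full parameter vector, so that the fraction of changed parameters (e.g. 10^-5) is an explicit budget rather than a byproduct of tuning.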

Publication
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics, NAACL 2021 (to appear)
Xuancheng Ren
