Mathematical Sciences Department Discrete Math Seminar, Nikhil Gangaram, WPI
Title: Training a 3-Node Neural Network is NP-Complete
Abstract: We consider a 2-layer, 3-node, n-input neural network whose nodes compute linear threshold functions of their inputs. We show that it is NP-complete to decide whether there exist weights and thresholds for the three nodes of this network so that it will produce output consistent with a given set of training examples. We extend the result to other simple networks. This result suggests that those looking for perfect training algorithms cannot escape inherent computational difficulties just by considering only simple or very regular networks. It also suggests the importance, given a training problem, of finding an appropriate network and input encoding for that problem. It is left as an open problem to extend our result to nodes with non-linear functions such as sigmoids.
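The network described in the abstract is small enough to sketch directly. Below is an illustrative Python sketch (not the speaker's code; all names and the XOR example are assumptions): two hidden linear threshold units feed one output threshold unit, and a helper checks whether a given choice of weights and thresholds is consistent with a set of training examples. The NP-complete question is whether *any* such choice exists; checking a candidate, as here, is easy.

```python
def threshold(weights, theta, x):
    """Linear threshold unit: output 1 if w.x >= theta, else 0."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= theta else 0

def three_node_net(w1, t1, w2, t2, w_out, t_out, x):
    """2-layer, 3-node network: two hidden threshold units, one output unit."""
    h = (threshold(w1, t1, x), threshold(w2, t2, x))
    return threshold(w_out, t_out, h)

def consistent(params, examples):
    """Check whether the network reproduces every labeled training example."""
    w1, t1, w2, t2, w_out, t_out = params
    return all(three_node_net(w1, t1, w2, t2, w_out, t_out, x) == y
               for x, y in examples)

# Hypothetical example: XOR on 2 inputs. Hidden unit 1 computes OR,
# hidden unit 2 computes NAND, and the output unit ANDs them together.
params = ([1, 1], 1, [-1, -1], -1, [1, 1], 2)
xor_examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(consistent(params, xor_examples))  # True
```

The hardness result says that, in general, no efficient algorithm can find such a `params` tuple for an arbitrary training set unless P = NP, even for this tiny architecture.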
Friday, April 26th, 2024
12:00 PM – 12:50 PM
Olin Hall 109