A0721
Title: Variable and architecture selection in neural networks
Authors: Andrew McInerney - University of Limerick (Ireland) [presenting]
Kevin Burke - University of Limerick (Ireland)
Abstract: Feedforward neural networks can be viewed as non-linear regression models, where covariates enter the model through a combination of weighted summations and non-linear functions. Although these models have similarities to the models typically used in statistical modelling, the majority of neural network research has been conducted outside of the field of statistics. This has resulted in a lack of statistically based methodology, and, in particular, there has been little emphasis on model parsimony. Determining the input layer structure is analogous to variable selection, while determining the structure of the hidden layer(s) relates to model complexity. However, the calculation of an associated likelihood function opens the door to information-criteria-based variable and architecture selection. A novel top-down model selection method is proposed using the Bayesian information criterion for feedforward neural networks, wherein the optimal weights for one model are carried over to the next. Simulation studies are used to evaluate the performance of the proposed method, and an application to real data is investigated.
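
The comparison of candidate networks described above can be sketched as follows. This is an illustrative outline only, not the authors' implementation: it assumes a single-hidden-layer network with p inputs, q hidden units, one output, and Gaussian errors, so the BIC can be written (up to additive constants) as n*log(RSS/n) + k*log(n), where k counts the free parameters. The function names and the parameter count formula are assumptions made for the sketch.

```python
import math

def n_params(p, q):
    """Free parameters of a single-hidden-layer feedforward net
    (assumed architecture: p inputs -> q hidden units -> 1 output):
    hidden weights and biases q*(p+1), output weights and bias q+1,
    plus one error-variance parameter."""
    return q * (p + 1) + (q + 1) + 1

def gaussian_bic(rss, n, k):
    """BIC under Gaussian errors, up to additive constants:
    n*log(RSS/n) + k*log(n). Lower is better."""
    return n * math.log(rss / n) + k * math.log(n)

def prefer_smaller(rss_full, rss_reduced, n, k_full, k_reduced):
    """Top-down step: keep the reduced model (e.g. one covariate or
    hidden unit removed) if its BIC does not exceed the full model's.
    In the proposed method the fitted weights of one model would be
    carried over as starting values for the next fit; that warm-start
    is not shown here."""
    return gaussian_bic(rss_reduced, n, k_reduced) <= gaussian_bic(rss_full, n, k_full)

# Toy illustration with made-up RSS values: dropping a covariate
# (p: 4 -> 3) barely changes the fit, so the smaller model wins on BIC.
n = 200
k_full = n_params(p=4, q=3)      # 4 inputs, 3 hidden units
k_reduced = n_params(p=3, q=3)   # one covariate removed
print(prefer_smaller(rss_full=50.0, rss_reduced=51.0, n=n,
                     k_full=k_full, k_reduced=k_reduced))
```

The penalty term k*log(n) is what drives parsimony here: a reduced model is retained whenever its loss of fit (increase in RSS) is outweighed by the parameters it saves.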