Abstract Selecting the best sparse linear regression model is a challenging optimization problem. It is usually solved through a sequence of separate steps, alternating between choosing a subset of features and finding a best-fit regression on that subset. In this seminar a new approach, based on a mixed-integer nonlinear optimization formulation, will be presented. The proposed method has the advantage of treating model selection and parameter estimation as a single optimization problem. Numerical experiments confirm the effectiveness of this approach, both in the quality of the resulting models and in CPU time.
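To make the underlying combinatorial problem concrete, the sketch below solves best-subset regression by brute-force enumeration: for every feature subset of size at most k, it fits ordinary least squares and keeps the subset with the smallest residual sum of squares. This is only an illustration of the problem the seminar addresses, not the proposed method; the mixed-integer formulation discussed in the talk solves the same selection-and-fit problem without exhaustive enumeration. All names here (`best_subset_regression`, the synthetic data) are hypothetical.

```python
import itertools
import numpy as np

def best_subset_regression(X, y, k):
    """Exhaustively minimize ||y - X[:, S] @ b||^2 over all subsets S, |S| <= k.

    Returns (rss, subset, coefficients). Exponential in the number of
    features p, which is why smarter formulations are of interest.
    """
    n, p = X.shape
    best_rss, best_subset, best_beta = np.inf, (), None
    for size in range(1, k + 1):
        for S in itertools.combinations(range(p), size):
            cols = X[:, S]
            # Least-squares fit restricted to the columns in S
            beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = float(np.sum((y - cols @ beta) ** 2))
            if rss < best_rss:
                best_rss, best_subset, best_beta = rss, S, beta
    return best_rss, best_subset, best_beta

# Synthetic example: the response depends only on features 0 and 2
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2]
rss, S, beta = best_subset_regression(X, y, k=2)
```

Because selection and fitting happen inside one search, the subset returned is globally optimal for the given k, which is exactly the guarantee a joint optimization formulation aims to retain at larger scale.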