By Gaudioso, Manlio; Gorgone, Enrico; Labbé, Martine; Rodriguez Chia, Antonio Manuel
Reference: Computers & Operations Research, 87, pp. 137-145
Publication: Published, 2017-11
Peer-reviewed article
Abstract: We discuss a Lagrangian-relaxation-based heuristic for feature selection in the Support Vector Machine (SVM) framework for binary classification. In particular, we embed into our objective function a weighted combination of the L1 and L0 norms of the normal to the separating hyperplane. The result is a Mixed Binary Linear Programming problem that is well suited to a Lagrangian relaxation approach. Based on a property of the optimal multiplier setting, we apply a well-established nonsmooth optimization ascent algorithm to solve the resulting Lagrangian dual. At every ascent step, the proposed approach yields, at low computational cost, both a lower bound on the optimal objective value and a feasible solution. We present the results of our numerical experiments on benchmark datasets.
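To make the modelling idea concrete, the following is a minimal LaTeX sketch of the kind of mixed-binary linear formulation the abstract describes. The notation is assumed for illustration only (hinge-loss slacks \xi_i, L1 surrogate variables u_j, binary indicators z_j, big-M bound M, and weights C_1, C_2); it is not necessarily the paper's exact model.

% Illustrative sketch (assumed notation, not necessarily the paper's exact model):
% a feature-selection SVM with a weighted combination of the L1 and L0 norms of w,
% linearised with surrogate variables u_j, binary indicators z_j and a big-M bound.
\begin{align*}
\min_{w,\, b,\, \xi,\, u,\, z} \quad
  & \sum_{i=1}^{m} \xi_i \;+\; C_1 \sum_{j=1}^{n} u_j \;+\; C_2 \sum_{j=1}^{n} z_j \\
\text{s.t.} \quad
  & y_i \left( w^{\top} x_i + b \right) \ge 1 - \xi_i, && i = 1, \dots, m, \\
  & -u_j \le w_j \le u_j, && j = 1, \dots, n, \\
  & -M z_j \le w_j \le M z_j, && j = 1, \dots, n, \\
  & \xi_i \ge 0, \quad u_j \ge 0, \quad z_j \in \{0, 1\}.
\end{align*}

In a formulation of this kind, dualising a suitable subset of the constraints with nonnegative multipliers yields a Lagrangian dual that can be maximised by a nonsmooth ascent scheme, with each dual evaluation providing a lower bound and a primal point from which a feasible solution can be recovered cheaply; this is the pairing of lower bounds and feasible solutions referred to in the abstract, though which constraints are relaxed here is an assumption.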