Global resolution of the support vector machine regression parameters selection problem with LPCC
Saved in:
Main Authors: | , , |
Format: | article |
Language: | EN |
Published: | Elsevier, 2015 |
Subjects: | |
Online Access: | https://doaj.org/article/12eb93a85e024a6791ee2a350bf105c4 |
Summary: | Support vector machine regression is a robust data fitting method that minimizes the sum of deducted residuals of the regression, and is thus less sensitive to changes in the data near the regression hyperplane. Two design parameters, the insensitive tube size (εe) and the weight assigned to the regression error trading off against the normed support vector (Ce), are selected by the user to obtain better forecasts. The global training and validation parameter selection procedure for support vector machine regression can be formulated as a bi-level optimization model, which is equivalently reformulated as a linear program with linear complementarity constraints (LPCC). We propose a rectangle search global optimization algorithm to solve this LPCC. The algorithm exhausts the invariancy regions on the parameter plane (the (Ce, εe)-plane) without explicitly identifying the edges of the regions. The algorithm is tested on synthetic and real-world support vector machine regression problems with up to hundreds of data points, and its efficiency is compared against several other approaches. The resulting globally optimal parameters serve as an important benchmark for any other choice of parameters. |
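The two design parameters described in the summary enter the standard ε-insensitive support vector regression objective: residuals smaller than the tube size ε cost nothing, and C weights the total tube-excess error against the squared norm of the weight vector. A minimal sketch of that objective (this is the textbook SVR formulation the abstract refers to, not the paper's LPCC reformulation; the function names are illustrative):

```python
def eps_insensitive_loss(residual, eps):
    """Deducted residual: errors inside the tube of size eps cost nothing."""
    return max(0.0, abs(residual) - eps)

def svr_objective(w_norm_sq, residuals, C, eps):
    """Primal SVR objective: 0.5*||w||^2 + C * sum of eps-insensitive losses.

    C trades off the regression error against the norm of the weight vector;
    the paper's (Ce, eps_e) play exactly these two roles.
    """
    return 0.5 * w_norm_sq + C * sum(eps_insensitive_loss(r, eps)
                                     for r in residuals)

# A residual of 0.5 inside a tube of size 1.0 incurs no loss:
print(eps_insensitive_loss(0.5, 1.0))   # 0.0
# A residual of 2.0 with eps = 0.5 costs the excess, 1.5:
print(eps_insensitive_loss(2.0, 0.5))   # 1.5
```

Selecting (C, ε) to minimize validation error of the model trained with those parameters is the bi-level problem the paper reformulates as an LPCC: the inner training problem's optimality conditions become the complementarity constraints.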