Functional form estimation using oblique projection matrices for LS-SVM regression models
- Authors:
- Resource type:
- Publication date:
- 2019
- Institution:
- Universidad del Rosario
- Repository:
- Repositorio EdocUR - U. Rosario
- Language:
- eng
- OAI identifier:
- oai:repository.urosario.edu.co:10336/22835
- Online access:
- https://doi.org/10.1371/journal.pone.0217967
https://repository.urosario.edu.co/handle/10336/22835
- Keywords:
- Analysis of variance
Article
Decomposition
Human
Least-squares analysis
Support vector machine
Algorithms
Artificial intelligence
Machine learning
Statistical models
- Rights
- License
- Open (Full Text)
Summary: Kernel regression models have been used as non-parametric methods for fitting experimental data. However, because of their non-parametric nature they are so-called 'black box' models: the relation between the input variables and the output, which depends on the kernel selection, is unknown. In this paper we propose a new methodology to retrieve the relation between each input regressor variable and the output in a least-squares support vector machine (LS-SVM) regression model. The method is based on oblique subspace projections (ObSP), which decouple the influence of the input regressors on the output by placing the undesired variables in the null space of the projection matrix. These functional relations are represented by the nonlinear transformation of the input regressors, and their subspaces are estimated using appropriate kernel evaluations. We exploit the properties of ObSP to decompose the output of the obtained regression model as a sum of the partial nonlinear contributions and interaction effects of the input variables; we call this methodology Nonlinear ObSP (NObSP). We compare the performance of the proposed algorithm with the component selection and smoothing operator (COSSO) for smoothing-spline ANOVA models, using as benchmarks two toy examples and a real-life regression problem on the concrete strength dataset from the UCI machine learning repository. We show that NObSP outperforms COSSO, producing stable estimates of the functional relations between the input regressors and the output without the use of prior knowledge. This methodology can be used to understand the functional relations between the inputs and the output of a regression model, restoring the physical interpretability of regression models. © 2019 Caicedo et al.
This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
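The core idea described in the abstract, decoupling the contribution of one group of regressors by sending the undesired variables to the null space of an oblique projector, can be sketched in a linear toy case. The snippet below is a minimal illustration with numpy, not the paper's actual NObSP implementation (which operates on kernel-induced feature subspaces); all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: the output y mixes contributions from two groups of regressors.
# Columns of V span the subspace of the variables of interest; columns of W
# span the subspace of the "undesired" variables.
V = rng.standard_normal((10, 2))
W = rng.standard_normal((10, 3))
a = rng.standard_normal(2)
b = rng.standard_normal(3)
y = V @ a + W @ b

# Orthogonal projector onto the orthogonal complement of range(W).
Q_W = np.eye(10) - W @ np.linalg.pinv(W)

# Oblique projector onto range(V) along range(W):
# E fixes vectors in range(V) and annihilates vectors in range(W).
E = V @ np.linalg.solve(V.T @ Q_W @ V, V.T @ Q_W)

# Applying E to y recovers the partial contribution V @ a,
# with the component lying in range(W) removed.
print(np.allclose(E @ y, V @ a))  # True
```

Applied to each regressor's subspace in turn, such projectors decompose the model output into a sum of partial contributions, which is what gives the decomposition its interpretability.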