
An Efficient Sparse Twin Parametric Insensitive Support Vector Regression Model

Qu S.; De Leone R.
2025-01-01

Abstract

This paper proposes a novel sparse twin parametric insensitive support vector regression (STPISVR) model, designed to enhance sparsity and improve generalization performance. Similar to twin parametric insensitive support vector regression (TPISVR), STPISVR constructs a pair of nonparallel parametric insensitive bound functions to indirectly determine the regression function. The optimization problems are reformulated as two sparse linear programming problems (LPPs), rather than traditional quadratic programming problems (QPPs). The two LPPs are originally derived from initial L1-norm regularization terms imposed on their respective dual variables, which are simplified to constants via the Karush–Kuhn–Tucker (KKT) conditions and consequently disappear. This simplification reduces model complexity, while the constraints constructed through the KKT conditions, particularly their geometric properties, effectively ensure sparsity. Moreover, a two-stage hybrid tuning strategy, combining grid search for coarse parameter-space exploration with Bayesian optimization for fine-grained convergence, is proposed to precisely select the optimal parameters, reducing tuning time and improving accuracy compared to a single-method strategy. Experimental results on synthetic and benchmark datasets demonstrate that STPISVR significantly reduces the number of support vectors (SVs), thereby improving prediction speed and achieving a favorable trade-off among prediction accuracy, sparsity, and computational efficiency. Overall, STPISVR enhances generalization ability, promotes sparsity, and improves prediction efficiency, making it a competitive tool for regression tasks, especially for handling complex data structures.
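The two-stage hybrid tuning strategy described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the validation-error surface `loss`, the parameter names `C` and `gamma`, their search ranges, and the random local search used here as a simplified stand-in for Bayesian optimization are all assumptions made for demonstration.

```python
# Sketch of a two-stage hybrid hyperparameter search:
# stage 1 explores a coarse grid; stage 2 refines locally around the
# coarse optimum (a simplified surrogate for Bayesian optimization).
import itertools
import random

def loss(C, gamma):
    # Hypothetical smooth validation-error surface with its minimum
    # near (1.3, 0.07); stands in for STPISVR validation error.
    return (C - 1.3) ** 2 + 10 * (gamma - 0.07) ** 2

def coarse_grid(C_values, gamma_values):
    # Stage 1: exhaustive evaluation over a coarse parameter grid.
    return min(itertools.product(C_values, gamma_values),
               key=lambda p: loss(*p))

def refine(C0, gamma0, n_iter=200, seed=0):
    # Stage 2: random local search in a small box around the coarse
    # optimum; never returns a point worse than its starting point.
    rng = random.Random(seed)
    best, best_loss = (C0, gamma0), loss(C0, gamma0)
    for _ in range(n_iter):
        cand = (C0 + rng.uniform(-0.5, 0.5),
                gamma0 + rng.uniform(-0.05, 0.05))
        if loss(*cand) < best_loss:
            best, best_loss = cand, loss(*cand)
    return best

C0, g0 = coarse_grid([0.1, 1.0, 10.0], [0.01, 0.1, 1.0])
C1, g1 = refine(C0, g0)
assert loss(C1, g1) <= loss(C0, g0)  # refinement never degrades the coarse result
```

The split mirrors the rationale given in the abstract: the coarse stage cheaply narrows the search to a promising region, so the expensive fine-grained stage only has to converge locally.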
Keywords

generalization performance; Karush–Kuhn–Tucker (KKT) conditions; linear programming problem (LPP); prediction speed; sparse twin parametric insensitive support vector regression (STPISVR); sparsity

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11581/496904
Note: the displayed metadata have not been validated by the university.

Citations
  • Scopus: 1
  • Web of Science: 1