An Efficient Sparse Twin Parametric Insensitive Support Vector Regression Model
Qu S.; De Leone R.
2025-01-01
Abstract
This paper proposes a novel sparse twin parametric insensitive support vector regression (STPISVR) model, designed to enhance sparsity and improve generalization performance. Similar to twin parametric insensitive support vector regression (TPISVR), STPISVR constructs a pair of nonparallel parametric insensitive bound functions to indirectly determine the regression function. The optimization problems are reformulated as two sparse linear programming problems (LPPs), rather than traditional quadratic programming problems (QPPs). The two LPPs are originally derived from initial L1-norm regularization terms imposed on their respective dual variables, which are simplified to constants via the Karush–Kuhn–Tucker (KKT) conditions and consequently disappear. This simplification reduces model complexity, while the constraints constructed through the KKT conditions, particularly their geometric properties, effectively ensure sparsity. Moreover, a two-stage hybrid tuning strategy, combining grid search for coarse parameter space exploration and Bayesian optimization for fine-grained convergence, is proposed to precisely select the optimal parameters, reducing tuning time and improving accuracy compared to a single-method strategy. Experimental results on synthetic and benchmark datasets demonstrate that STPISVR significantly reduces the number of support vectors (SVs), thereby improving prediction speed and achieving a favorable trade-off among prediction accuracy, sparsity, and computational efficiency. Overall, STPISVR enhances generalization ability, promotes sparsity, and improves prediction efficiency, making it a competitive tool for regression tasks, especially in handling complex data structures.
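The abstract does not reproduce the two LPPs themselves, but the general mechanism it invokes (an L1-norm objective rewritten as a linear program whose vertex solutions are sparse) can be illustrated with a standard basis-pursuit toy problem. The sketch below is not the paper's formulation: the matrix A, the vector b, and the sparsity level are illustrative assumptions chosen only to show why solving an LPP tends to yield sparse solutions.

```python
# Minimal sketch (NOT the STPISVR formulation): how an L1-norm objective
# becomes a linear program and why its solutions are sparse.
# We solve  min ||x||_1  s.t.  A x = b  via the standard split x = u - v,
# u, v >= 0, giving the LP:  min 1^T (u + v)  s.t.  A (u - v) = b.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 20, 40                          # underdetermined system: 20 equations, 40 unknowns
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, 3, replace=False)] = rng.standard_normal(3)  # a 3-sparse target
b = A @ x_true

c = np.ones(2 * n)                     # objective: sum_i (u_i + v_i) = ||x||_1
A_eq = np.hstack([A, -A])              # encodes A u - A v = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
x = res.x[:n] - res.x[n:]
print("nonzeros in recovered x:", np.count_nonzero(np.abs(x) > 1e-8))
```

Because the LP optimum sits at a vertex of the feasible polyhedron, at most m of the 2n variables are nonzero, which is the geometric source of sparsity the abstract alludes to when it credits the KKT-derived constraints and their geometric properties.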

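The two-stage hybrid tuning strategy can likewise be sketched generically: a coarse grid search locates a promising region, then Bayesian optimization refines within it. The parameter names (C, gamma), their log-scaled ranges, and the toy objective below are illustrative assumptions, not the paper's actual settings; the placeholder cv_error would be replaced by the real cross-validation routine for the trained regressor.

```python
# Hypothetical sketch of a two-stage hybrid tuning strategy:
# stage 1 = coarse grid search, stage 2 = Bayesian optimization
# restricted to a box around the best grid point.
import numpy as np
from itertools import product
from skopt import gp_minimize          # scikit-optimize's Gaussian-process optimizer
from skopt.space import Real

def cv_error(params):
    """Placeholder objective: cross-validation error at (log10 C, log10 gamma).
    Replace with the actual model training/validation procedure."""
    log_C, log_gamma = params
    return (log_C - 1.0) ** 2 + (log_gamma + 2.0) ** 2   # toy surrogate

# Stage 1: coarse grid search over wide log-scaled ranges.
coarse_grid = list(product(np.linspace(-3, 3, 7), np.linspace(-4, 2, 7)))
best = min(coarse_grid, key=cv_error)

# Stage 2: Bayesian optimization in a narrow box around the grid optimum.
space = [Real(best[0] - 1.0, best[0] + 1.0, name="log_C"),
         Real(best[1] - 1.0, best[1] + 1.0, name="log_gamma")]
result = gp_minimize(cv_error, space, n_calls=30, random_state=0)
print("refined parameters:", result.x, "objective:", result.fun)
```

The design rationale matches the abstract's claim: the cheap grid pass prunes the search space so the sample-efficient but per-evaluation-expensive Bayesian stage spends its budget near the optimum, reducing total tuning time relative to either method alone.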

