References
[1] WANG E, WANG Z Y, WU Q, et al. One novel class of Bézier smooth semi-supervised support vector machines for classification. Neural Computing and Applications, 2021, 33: 9975-9991.
[2] TIAN Y J, JU X C, QI Z Q, et al. Improved twin support vector machine. Science China: Mathematics, 2014, 57(2): 417-432.
[3] JOACHIMS T. Making large-scale SVM learning practical. SCHOLKOPF B, BURGES C J C, SMOLA A J (eds). Advances in Kernel Methods: Support Vector Learning. Cambridge, MA, USA: MIT Press, 1999: 169-184.
[4] FAN R E, CHANG K W, HSIEH C J, et al. LIBLINEAR: A library for large linear classification. The Journal of Machine Learning Research, 2008, 9: 1871-1874.
[5] LI G Q, YANG L X, WU Z Y, et al. D.C. programming for sparse proximal support vector machines. Information Sciences, 2021, 547: 187-201.
[6] TIAN Z D. Backtracking search optimization algorithm-based least square support vector machine and its applications. Engineering Applications of Artificial Intelligence, 2020, 94: Article 103801.
[7] GANAIE M A, TANVEER M. LSTSVM classifier with enhanced features from pre-trained functional link network. Applied Soft Computing, 2020, 93: Article 106305.
[8] AN R, XU Y T, LIU X H. A rough margin-based multi-task ν-twin support vector machine for pattern classification. Applied Soft Computing, 2021, 112: Article 107769.
[9] YANG Z J, XU Y T. Laplacian twin parametric-margin support vector machine for semi-supervised classification. Neurocomputing, 2016, 171: 325-334.
[10] RASTOGI R, SHARMA S, CHANDRA S. Robust parametric twin support vector machine for pattern classification. Neural Processing Letters, 2018, 47(1): 293-323.
[11] GANAIE M A, TANVEER M. Fuzzy least squares projection twin support vector machines for class imbalance learning. Applied Soft Computing, 2021, 113(Part B): Article 107933.
[12] TANG L, TIAN Y J, YANG C Y, et al. Ramp-loss nonparallel support vector regression: Robust, sparse and scalable approximation. Knowledge-Based Systems, 2018, 147: 55-67.
[13] JIMENEZ-CASTAÑO C, ALVAREZ-MEZA A, OROZCO-GUTIERREZ A. Enhanced automatic twin support vector machine for imbalanced data classification. Pattern Recognition, 2020, 107: Article 107442.
[14] CHEN W J, SHAO Y H, LI C N, et al. ν-projection twin support vector machine for pattern classification. Neurocomputing, 2020, 376: 10-24.
[15] MELLO A R, STEMMER M R, KOERICH A L. Incremental and decremental fuzzy bounded twin support vector machine. Information Sciences, 2020, 526: 20-38.
[16] DING S F, HUA X P. Recursive least squares projection twin support vector machines. Neurocomputing, 2014, 130: 3-9.
[17] LUO J R, QIAO H, ZHANG B. Learning with smooth hinge losses. Neurocomputing, 2021, 463: 379-387.
[18] LU L Y, LIN Q, PEI H M, et al. The aLS-SVM based multi-task learning classifiers. Applied Intelligence, 2018, 48(8): 2393-2407.
[19] HUANG X L, SHI L, SUYKENS J A K. Sequential minimal optimization for SVM with pinball loss. Neurocomputing, 2015, 149(Part C): 1596-1603.
[20] TANVEER M, SHARMA A, SUGANTHAN P N. General twin support vector machine with pinball loss function. Information Sciences, 2019, 494(C): 311-327.
[21] ANAND P, RASTOGI R, CHANDRA S. A new asymmetric ε-insensitive pinball loss function based support vector quantile regression model. Applied Soft Computing, 2020, 94: Article 106473.
[22] HUANG X L, SHI L, SUYKENS J A K. Asymmetric least squares support vector machine classifiers. Computational Statistics and Data Analysis, 2014, 70(2): 395-405.
[23] WANG H R, ZHANG Q. Twin K-class support vector classification with pinball loss. Applied Soft Computing, 2021, 113(Part A): Article 107929.
[24] YE Y F, SHAO Y H, LI C N, et al. Online support vector quantile regression for the dynamic time series with heavy-tailed noise. Applied Soft Computing, 2021, 110(Part C): Article 107560.
[25] OVERSTALL A M. Properties of Fisher information gain for Bayesian design of experiments. Journal of Statistical Planning and Inference, 2022, 218: 138-146.
[26] ARLOT S, CELISSE A. A survey of cross-validation procedures for model selection. Statistics Surveys, 2010, 4: 40-79.