The Journal of China Universities of Posts and Telecommunications ›› 2023, Vol. 30 ›› Issue (1): 1-16. DOI: 10.19682/j.cnki.1005-8885.2023.2001

• Artificial intelligence •

Least squares twin support vector machine with asymmetric squared loss

Wu Qing, Li Feiyan, Zhang Hengchang, Fan Jiulun, Gao Xiaofeng   

  1. School of Automation, Xi'an University of Posts and Telecommunications, Xi'an 710121, China
    2. School of Telecommunication and Information Engineering, Xi'an University of Posts and Telecommunications, Xi'an 710121, China
  • Received: 2021-07-12  Revised: 2022-11-22  Accepted: 2023-02-13  Online: 2023-02-28  Published: 2023-02-28
  • Contact: Wu Qing, E-mail: xiyouwuq@126.com
  • Supported by:
    The authors thank the anonymous reviewers for their constructive comments and suggestions. This work was supported
    in part by the National Natural Science Foundation of China (51875457), the Natural Science Foundation of Shaanxi
    Province of China (2021JQ-701), the Key Research Project of Shaanxi Province (2022GY-050, 2022GY-028) and the Xi'an
    Science and Technology Plan Project (2020KJRC0109).

Abstract: For classification problems, the traditional least squares twin support vector machine (LSTSVM) generates two nonparallel hyperplanes directly by solving two systems of linear equations instead of a pair of quadratic programming problems (QPPs), which makes LSTSVM much faster than the original twin support vector machine (TSVM). However, the standard LSTSVM adopts a quadratic loss measured by the minimal distance, which makes it sensitive to noise and unstable under re-sampling. To overcome this problem, the expectile distance is introduced to measure the margin between classes, and an LSTSVM with asymmetric squared loss (aLSTSVM) is proposed. Compared with the original LSTSVM with the quadratic loss, the proposed aLSTSVM not only achieves comparable computational accuracy, but also exhibits desirable properties such as noise insensitivity, scatter minimization and re-sampling stability. Numerical experiments on synthetic datasets, normally distributed clustered (NDC) datasets and University of California, Irvine (UCI) datasets with different noises confirm the effectiveness and validity of the proposed algorithm.
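To illustrate the loss that underlies the abstract's expectile idea, the following is a minimal Python sketch of one common form of the asymmetric squared (expectile) loss, which penalizes positive and negative residuals with different weights. The parameter name tau and the exact weighting convention are illustrative assumptions, not taken from the paper itself.

    import numpy as np

    def asymmetric_squared_loss(r, tau=0.5):
        """Asymmetric squared (expectile) loss, a common form:
        tau * r**2 for r >= 0, and (1 - tau) * r**2 for r < 0.
        With tau = 0.5 it reduces to the ordinary quadratic loss."""
        r = np.asarray(r, dtype=float)
        weights = np.where(r >= 0, tau, 1.0 - tau)
        return weights * r ** 2

    # Example: residuals of samples from a candidate hyperplane.
    # With tau = 0.8, positive residuals are penalized more heavily,
    # so large one-sided outliers influence the fit asymmetrically.
    residuals = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(asymmetric_squared_loss(residuals, tau=0.8))

Choosing tau away from 0.5 is what distinguishes the expectile-based margin from the ordinary least squares margin: the fitted hyperplane is pulled toward an expectile of the residual distribution rather than its mean, which is the intuition behind the claimed noise insensitivity.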

Key words: classification, least squares twin support vector machine, asymmetric loss, noise insensitivity
