Abstract
In this paper, we propose a new cost function, the step loss, for support vector machine classifiers, based on a finer distinction between the instances. It takes into account the position of the samples with respect to the margin. More precisely, we divide the instances into four categories: i) instances correctly classified that lie outside the margin, ii) instances correctly classified that lie within the margin, iii) instances misclassified that lie within the margin, and iv) instances misclassified that lie outside the margin. The step loss assigns a constant cost to each group of instances. It is therefore more general than the hard-margin cost, which divides the instances into only two categories. It is also more robust to outliers than the soft margin, because the instances of the fourth group incur a constant cost, in contrast to the hinge loss, where misclassified instances incur a cost growing linearly with the violation. It is more accurate than the Ramp loss, which hardly distinguishes between instances correctly classified within the margin and instances misclassified within the margin. Theoretically, we prove that the SVM model equipped with the step loss function retains the nice property of kernelization.
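The four-category scheme above can be sketched as a function of the signed margin t = y·f(x). This is a minimal illustration, not the paper's implementation: the specific constants c1–c4 and the treatment of boundary values are assumptions; the paper only requires that each category receive some constant cost.

```python
# Hypothetical sketch of the step loss described in the abstract.
# The per-category costs (0, 1, 2, 3) are illustrative placeholders,
# not values taken from the paper.

def step_loss(y, f_x, costs=(0.0, 1.0, 2.0, 3.0)):
    """Constant cost per category, from the signed margin t = y * f(x).

    Categories (boundary handling is an assumption):
      i)   t >= 1      -> correctly classified, outside the margin
      ii)  0 <= t < 1  -> correctly classified, within the margin
      iii) -1 < t < 0  -> misclassified, within the margin
      iv)  t <= -1     -> misclassified, outside the margin
    """
    t = y * f_x
    if t >= 1:
        return costs[0]
    if t >= 0:
        return costs[1]
    if t > -1:
        return costs[2]
    return costs[3]
```

Because the cost is piecewise constant, a far-away misclassified outlier (category iv) contributes no more than any other category-iv point, which is the robustness argument made against the linearly growing hinge loss.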
| Original language | English |
|---|---|
| Pages (from-to) | 9-15 |
| Number of pages | 7 |
| Journal | Procedia Computer Science |
| Volume | 141 |
| DOIs | |
| Publication status | Published - 2018 |
| Externally published | Yes |
| Event | 9th International Conference on Emerging Ubiquitous Systems and Pervasive Networks, EUSPN 2018 - Leuven, Belgium Duration: 5 Nov 2018 → 8 Nov 2018 |
Keywords
- Classification
- Integer programming
- Loss function
- Machine learning
- SVM
Fingerprint
Dive into the research topics of 'A step loss function based SVM classifier for binary classification'.