arXiv:2504.04371v2 Announce Type: replace
Abstract: Traditional Support Vector Machine (SVM) classification is carried out by finding the max-margin classifier for the training data that divides the margin space into two equal sub-spaces. This study demonstrates the limitations of performing Support Vector Classification in non-Euclidean spaces by establishing that the underlying principle of max-margin classification and the Karush-Kuhn-Tucker (KKT) boundary conditions are valid only in Euclidean vector spaces, while in non-Euclidean spaces the principle of maximum margin is a function of intra-class data covariance. The study establishes a methodology for performing Support Vector Classification in non-Euclidean spaces by incorporating data covariance into the optimization problem via the transformation matrix obtained from the Cholesky decomposition of the respective class covariance matrices, and shows that the resulting classifier separates the margin space in the ratio of the respective class population covariances. The study proposes an algorithm to iteratively estimate the population covariance-adjusted SVM classifier in non-Euclidean space from the sample covariance matrices of the training data. The effectiveness of this SVM classification approach is demonstrated by applying the classifier to multiple datasets and comparing its performance with traditional SVM kernels and whitening algorithms. The Cholesky-SVM model shows marked improvement in accuracy, precision, F1 score, and ROC performance compared to linear and other kernel SVMs.
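The abstract's core transformation can be illustrated with a minimal sketch. This is not the paper's exact algorithm (which adjusts the SVM optimization per class and iterates on the covariance estimate); it only shows the related building block the abstract names: taking the Cholesky factor `L` of a sample covariance matrix (`S = L @ L.T`) and applying `L^{-1}` to the data so the transformed covariance is approximately the identity. All variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D class sample with strong intra-class covariance.
cov = np.array([[4.0, 1.8],
                [1.8, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=5000)

# Sample covariance and its Cholesky factor: S ≈ L @ L.T.
S = np.cov(X, rowvar=False)
L = np.linalg.cholesky(S)

# Transform each point x -> L^{-1} x; the transformed sample
# covariance is then approximately the identity matrix.
X_t = np.linalg.solve(L, X.T).T
print(np.round(np.cov(X_t, rowvar=False), 2))
```

In the paper's setting this per-class transformation would feed into the margin optimization rather than being a one-shot preprocessing step, and the covariance estimate is refined iteratively from the training data.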