Turkish Journal of Electrical Engineering and Computer Sciences




In one-class classification (OCC) tasks, only samples of the target class (the class of interest, CoI) are well defined during training, while samples of all other classes are entirely absent. In OCC algorithms, high dimensional data adds computational overhead on top of the intrinsic curse of dimensionality. Conventional dimensionality reduction (DR) techniques are ill suited to target class learning because they neglect the unique statistical properties of the CoI samples. In this context, the present research proposes a novel target class guided DR technique that extracts an eigen knowledge grid containing the most promising eigenvectors of the variance-covariance matrix of the CoI samples. In this process, eigenvectors with the lowest and highest eigenvalues are rejected via statistical analysis: high variance may split the target class itself, whereas low variance contributes no significant information. The identified eigen knowledge grid is then used to transform high dimensional samples into a lower dimensional eigen subspace. The proposed approach, named learning target class eigen subspace (LTS-ES), ensures strong separation of the target class from the other classes. To demonstrate the effectiveness of the transformed lower dimensional eigen subspace, a one-class support vector machine (OCSVM) is evaluated on a wide variety of benchmark datasets under four feature representations: the original feature space, features transformed via eigenvectors covering approximately 80%-90% cumulative variance, features transformed via the knowledge grid, and features transformed via eigenvectors covering approximately 50% cumulative variance. Finally, a new performance measure, called the stability factor, is introduced to validate the robustness of the proposed approach.
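The selection step described above (eigendecompose the CoI variance-covariance matrix, then discard both the highest- and lowest-variance eigenvectors before projecting) can be sketched as below. This is a minimal illustration, not the paper's exact criterion: the percentile cutoffs `lo_q`/`hi_q` and the function names are hypothetical stand-ins for the statistical analysis the authors describe.

```python
import numpy as np

def eigen_knowledge_grid(X_target, lo_q=0.10, hi_q=0.90):
    """Build an eigen knowledge grid from class-of-interest (CoI) samples:
    eigendecompose the variance-covariance matrix and keep only the
    mid-variance eigenvectors. The percentile cutoffs lo_q/hi_q are
    illustrative assumptions, not the paper's actual thresholds."""
    mean = X_target.mean(axis=0)
    cov = np.cov(X_target - mean, rowvar=False)   # variance-covariance of CoI
    vals, vecs = np.linalg.eigh(cov)              # eigenvalues in ascending order
    lo, hi = np.quantile(vals, [lo_q, hi_q])
    keep = (vals > lo) & (vals < hi)              # reject extreme-variance axes
    return mean, vecs[:, keep]                    # CoI mean + knowledge grid

def to_eigen_subspace(X, mean, grid):
    """Project (possibly high dimensional) samples onto the eigen subspace."""
    return (X - mean) @ grid
```

The projected samples would then feed a one-class learner such as OCSVM in place of the original feature space.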


Keywords: One-class classification, target class, dimensionality reduction, class-of-interest, eigen knowledge grid, one-class support vector machine
