Turkish Journal of Electrical Engineering and Computer Sciences

DOI

10.3906/elk-2001-62

Abstract

In supervised classification, obtaining nonlinear separating functions from an algorithm is crucial for prediction accuracy. This paper analyzes the polyhedral conic functions (PCF) algorithm, which generates nonlinear separating functions by solving only simple subproblems. A revised version of the algorithm is then developed that achieves better generalization and faster training while maintaining the simplicity and high prediction accuracy of the original PCF algorithm. This is accomplished by making the following modifications to the subproblem: extension of the objective function with a regularization term, relaxation of a hard constraint set, and introduction of a new error term. Experimental results show that the modifications provide 12% better generalization on average and up to 10x faster training. This paper also contributes to the literature by providing, for the first time, detailed comparisons of other classification algorithms that use polyhedral conic functions.
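As background (not the paper's own code), the following minimal Python sketch illustrates the standard polyhedral conic function form g(x) = w^T(x - c) + xi * ||x - c||_1 - gamma from the PCF literature, together with the min-based decision rule typically used with PCF classifiers; the parameter values are made up for illustration and the learning step (solving the subproblems) is not shown.

```python
import numpy as np

# A single polyhedral conic function (PCF):
#     g(x) = w^T (x - c) + xi * ||x - c||_1 - gamma
# where c is a "center" point, w a direction vector, xi >= 0 controls the
# cone opening, and gamma is an offset. These names are illustrative.

def pcf_value(x, w, c, xi, gamma):
    """Evaluate one polyhedral conic function at point x."""
    d = x - c
    return w @ d + xi * np.sum(np.abs(d)) - gamma

def classify(x, pcfs):
    """Assign x to the target class if at least one learned PCF is
    non-positive at x, i.e. min_k g_k(x) <= 0 (the usual PCF rule)."""
    return min(pcf_value(x, *p) for p in pcfs) <= 0.0

# Illustrative (made-up) parameters for two PCFs in 2-D: (w, c, xi, gamma).
pcfs = [
    (np.array([0.5, -0.2]), np.array([1.0, 1.0]), 1.0, 2.0),
    (np.array([-0.1, 0.3]), np.array([4.0, 0.0]), 0.8, 1.5),
]
print(classify(np.array([1.2, 0.9]), pcfs))    # True: near the first center
print(classify(np.array([10.0, 10.0]), pcfs))  # False: far from both centers
```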

Keywords

Classification, conic functions, machine learning, optimization

First Page

2735

Last Page

2749
