Turkish Journal of Electrical Engineering and Computer Sciences




In supervised classification, obtaining nonlinear separating functions from an algorithm is crucial for prediction accuracy. This paper analyzes the polyhedral conic functions (PCF) algorithm, which generates nonlinear separating functions by solving only simple subproblems. A revised version of the algorithm is then developed that achieves better generalization and faster training while maintaining the simplicity and high prediction accuracy of the original PCF algorithm. This is accomplished by making the following modifications to the subproblem: extending the objective function with a regularization term, relaxing a hard constraint set, and introducing a new error term. Experimental results show that the modifications provide 12% better generalization on average and up to 10x faster training. This paper also contributes to the literature by providing, for the first time, detailed comparisons of the other classification algorithms that use polyhedral conic functions.
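The abstract refers to the separating functions produced by the PCF algorithm. As a minimal sketch of the general polyhedral conic function form known from the PCF literature, a PCF with apex c combines a linear term, a weighted L1 "cone" term, and an offset; the parameter values below are hypothetical and chosen only for illustration, not taken from the paper:

```python
import numpy as np

def pcf(x, w, xi, gamma, c):
    """Evaluate a polyhedral conic function at point x.

    General form from the PCF literature:
        g(x) = w . (x - c) + xi * ||x - c||_1 - gamma
    Points with g(x) <= 0 fall inside the polyhedral cone
    (i.e. are assigned to the target class).
    """
    d = np.asarray(x, dtype=float) - np.asarray(c, dtype=float)
    return float(np.dot(w, d) + xi * np.linalg.norm(d, ord=1) - gamma)

# Illustrative (hypothetical) parameters: a cone with apex at the
# origin that separates points near the apex from points far away.
w = np.array([1.0, 0.0])
xi, gamma = 1.0, 2.0
c = np.array([0.0, 0.0])

print(pcf([1.0, 1.0], w, xi, gamma, c))  # 1.0 + 2.0 - 2.0 = 1.0  -> outside
print(pcf([0.1, 0.1], w, xi, gamma, c))  # 0.1 + 0.2 - 2.0 = -1.7 -> inside
```

In the full algorithm, the parameters w, xi, and gamma of each such function are obtained by solving a simple (linear programming) subproblem per class, and the revised version modifies that subproblem as described above.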


Classification, conic functions, machine learning, optimization
