Turkish Journal of Physics
DOI
-
Abstract
Studies by various authors suggest that higher-order networks can be more powerful, and are biologically more plausible, than the more traditional multilayer networks. These architectures make explicit use of nonlinear interactions between input variables in the form of higher-order units or product units. If it is known a priori that the problem to be implemented possesses a given set of invariances, as in translation-, rotation-, and scale-invariant pattern recognition, those invariances can be encoded in the network, thereby eliminating all higher-order terms that are incompatible with them. In general, however, a serious drawback is that the complexity of learning grows exponentially with the input size. This paper reviews higher-order networks and introduces an implicit representation in which the learning complexity is determined mainly by the number of higher-order terms to be learned and grows only linearly with the input size.
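As a rough illustration, and not part of the paper itself, the following Python sketch shows what a second-order ("sigma-pi") unit computes and why the number of weights, and hence the learning complexity, grows combinatorially when higher-order terms are represented explicitly. All names and weight values here are hypothetical.

    # Minimal sketch (assumed, not from the paper): a second-order sigma-pi unit.
    import itertools
    import math

    def sigma_pi_output(x, w1, w2, bias=0.0):
        """x: inputs; w1: first-order weights; w2: dict of pairwise (product-term) weights."""
        net = bias + sum(wi * xi for wi, xi in zip(w1, x))
        for (i, j), wij in w2.items():
            net += wij * x[i] * x[j]          # explicit second-order product term
        return 1.0 / (1.0 + math.exp(-net))   # sigmoid activation

    # For n inputs, a full second-order unit already needs n + n*(n-1)/2 weights;
    # including still higher orders makes the weight count grow combinatorially,
    # which is the learning-complexity problem the abstract refers to.
    n = 4
    x = [1.0, 0.0, 1.0, 1.0]
    w1 = [0.1] * n
    w2 = {pair: 0.05 for pair in itertools.combinations(range(n), 2)}
    print(sigma_pi_output(x, w1, w2))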
First Page
39
Last Page
46
Recommended Citation
GÜLER, Marifi (1999) "Neural Classifiers for Learning Higher-Order Correlations," Turkish Journal of Physics: Vol. 23: No. 1, Article 5. Available at: https://journals.tubitak.gov.tr/physics/vol23/iss1/5