Contractive rectifier networks for nonlinear maximum margin classification
© 2015 IEEE. To find the optimal nonlinear separating boundary with maximum margin in the input data space, this paper proposes Contractive Rectifier Networks (CRNs), in which the hidden-layer transformations are restricted to be contraction mappings. The contractive constraints ensure that the separating margin achieved in the input space is larger than or equal to the separating margin in the output layer. Training the proposed CRNs is formulated as a linear support vector machine (SVM) in the output layer combined with two or more contractive hidden layers, and effective algorithms are proposed to address the optimization challenges arising from the contraction constraints. Experimental results on the MNIST, CIFAR-10, CIFAR-100 and MIT-67 datasets demonstrate that the proposed contractive rectifier networks consistently outperform their conventional, unconstrained rectifier network counterparts.
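To illustrate the core idea, a hidden layer is a contraction (non-expansive) mapping whenever its weight matrix has spectral norm at most 1, since the ReLU nonlinearity is itself 1-Lipschitz. The sketch below is a minimal illustration of this property, assuming a simple spectral-norm rescaling to enforce the constraint; the paper's actual training algorithm is not reproduced here.

```python
import numpy as np

def contractive_relu_layer(W, b, x):
    """Rectifier layer whose weights are rescaled so the spectral norm
    is at most 1. Because ReLU is 1-Lipschitz, the layer as a whole
    cannot expand distances between inputs (hypothetical illustration,
    not the paper's training procedure)."""
    s = np.linalg.norm(W, 2)       # largest singular value of W
    W_c = W / max(s, 1.0)          # enforce ||W_c||_2 <= 1
    return np.maximum(0.0, W_c @ x + b)

rng = np.random.default_rng(0)
W = rng.standard_normal((5, 3))
b = rng.standard_normal(5)
x1, x2 = rng.standard_normal(3), rng.standard_normal(3)

# Distances in the output space never exceed distances in the input space,
# which is what guarantees the input-space margin dominates the output margin.
d_in = np.linalg.norm(x1 - x2)
d_out = np.linalg.norm(contractive_relu_layer(W, b, x1)
                       - contractive_relu_layer(W, b, x2))
assert d_out <= d_in + 1e-12
```

Stacking several such layers preserves the property, since a composition of contractions is again a contraction.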
Related items (matched by title, author, creator and subject):
An, Senjian; Boussaid, F.; Bennamoun, M. (2015) This paper investigates how hidden layers of deep rectifier networks are capable of transforming two or more pattern sets to be linearly separable while preserving the distances with a guaranteed degree, and proves the ...
An, Senjian; Ke, Q.; Bennamoun, M.; Boussaid, F.; Sohel, F. (2015) © Springer International Publishing Switzerland 2015. In this paper we introduce sign constrained rectifier networks (SCRN), demonstrate their universal classification power and illustrate their applications to pattern ...
Lau, M.; Lim, Hann (2017) © 2017 IEEE. Deep Belief Network (DBN) is made up of stacked Restricted Boltzmann Machine layers associated with global weight fine-tuning for pattern recognition. However, DBN suffers from the vanishing gradient problem due ...