
    espace - Curtin’s institutional repository


    Stochastic diagonal approximate greatest descent in convolutional neural networks

    Access Status
    Fulltext not available
    Authors
    Tan, H.
    Lim, Hann
    Harno, H.
    Date
    2017
    Type
    Conference Paper
    
    Citation
    Tan, H. and Lim, H. and Harno, H. 2017. Stochastic diagonal approximate greatest descent in convolutional neural networks, pp. 451-454.
    Source Title
    Proceedings of the 2017 IEEE International Conference on Signal and Image Processing Applications, ICSIPA 2017
    DOI
    10.1109/ICSIPA.2017.8120653
    ISBN
    9781509055593
    School
    Curtin Malaysia
    URI
    http://hdl.handle.net/20.500.11937/65863
    Collection
    • Curtin Research Publications
    Abstract

    © 2017 IEEE. Deep structures of Convolutional Neural Networks (CNNs) have recently gained intense attention due to their good performance in object recognition. One of the crucial components of a CNN is the mechanism for learning weight parameters through backpropagation. In this paper, stochastic diagonal Approximate Greatest Descent (SDAGD) is proposed to train the weight parameters of a CNN. SDAGD adopts the concept of a multistage control system and a diagonal Hessian approximation for weight optimization. It can be described as a two-phase optimization. In phase 1, when the initial guess is far from the solution, SDAGD constructs local search regions and determines the step length of the next iteration at the boundary of the search region. When the solution lies in the final search region, SDAGD shifts to phase 2, approximating Newton's method to obtain fast weight convergence. Computing only the diagonal approximation of the Hessian incurs less computational cost than a full Hessian calculation. Experiments showed that the SDAGD learning algorithm achieved a misclassification rate of 8.85% on the MNIST dataset.
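    The two-phase scheme described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the phase-switch test (comparing the gradient norm to the search-region radius) and the absolute-value damping of the diagonal Hessian are assumptions made here for the sake of a runnable example.

    ```python
    import numpy as np

    def sdagd_step(w, grad, hess_diag, radius, eps=1e-8):
        """One SDAGD-style update (illustrative sketch).

        Phase 1 (far from a minimum): step to the boundary of a local
        spherical search region, i.e. a step of fixed length `radius`
        along the negative gradient direction.
        Phase 2 (near a minimum): approximate a Newton step using only
        the diagonal of the Hessian, which is cheap to compute and store
        compared with the full Hessian.
        """
        grad_norm = np.linalg.norm(grad)
        if grad_norm > radius:
            # Phase 1: the best point on the search-region boundary.
            return w - radius * grad / (grad_norm + eps)
        # Phase 2: diagonal Newton step for fast local convergence.
        return w - grad / (np.abs(hess_diag) + eps)

    # Usage on a simple quadratic f(w) = 0.5 * sum(d_i * w_i^2),
    # whose gradient is d * w and whose Hessian diagonal is d.
    w = np.array([3.0, -2.0])
    d = np.array([2.0, 0.5])
    for _ in range(50):
        w = sdagd_step(w, d * w, d, radius=1.0)
    # converges to the minimum at w = 0
    print(np.linalg.norm(w))
    ```

    On this quadratic the iterates first march inward with constant-length phase-1 steps, then the diagonal Newton step of phase 2 lands essentially on the minimizer in one update, mirroring the fast final convergence the abstract claims.
    
    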

    Related items

    Showing items related by title, author, creator and subject.

    • Integer Least-squares Theory for the GNSS Compass
      Teunissen, Peter (2010)
      Global navigation satellite system (GNSS) carrier phase integer ambiguity resolution is the key to high-precision positioning and attitude determination. In this contribution, we develop new integer least-squares (ILS) ...
    • Stochastic diagonal approximate greatest descent in neural networks
      Tan, H.; Lim, Hann; Harno, H. (2017)
      © 2017 IEEE. Optimization is important in neural networks to iteratively update weights for pattern classification. Existing optimization techniques suffer from suboptimal local minima and slow convergence rate. In this ...
    • Radial effect in stochastic diagonal approximate greatest descent
      Tan, H.; Lim, Hann; Harno, H. (2017)
      © 2017 IEEE. Stochastic Diagonal Approximate Greatest Descent (SDAGD) is proposed to manage the optimization in two stages, (a) apply a radial boundary to estimate step length when the weights are far from solution, (b) ...
