Show simple item record

dc.contributor.author: An, Senjian
dc.contributor.author: Liu, Wan-Quan
dc.contributor.author: Venkatesh, Svetha
dc.date.accessioned: 2017-01-30T15:04:43Z
dc.date.available: 2017-01-30T15:04:43Z
dc.date.created: 2009-03-05T00:58:23Z
dc.date.issued: 2007
dc.identifier.citation: An, Senjian and Liu, Wan-Quan and Venkatesh, Svetha. 2007. Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression. Pattern Recognition. 40 (8): pp. 2154-2162.
dc.identifier.uri: http://hdl.handle.net/20.500.11937/43097
dc.identifier.doi: 10.1016/j.patcog.2006.12.015
dc.description.abstract

Given n training examples, training a least squares support vector machine (LS-SVM) or kernel ridge regression (KRR) model corresponds to solving a linear system of dimension n. In cross-validating LS-SVM or KRR, the training examples are split into two disjoint subsets l times: a subset of m examples is used for validation and the remaining (n-m) examples are used for training the classifier, so l linear systems of dimension (n-m) must be solved. We propose a novel method for cross-validation (CV) of LS-SVM or KRR in which, instead of solving l linear systems of dimension (n-m), we compute the inverse of a single n-dimensional square matrix and solve l linear systems of dimension m, thereby reducing the complexity when l is large and/or m is small. Typical multi-fold, leave-one-out (LOO-CV) and leave-many-out cross-validations are considered. For the five-fold CV used in practice, with five repetitions over randomly drawn slices, the proposed algorithm is approximately four times as efficient as the naive implementation. For large data sets, we propose to evaluate the CV approximately by applying the well-known incomplete Cholesky decomposition technique; the complexity of these approximate algorithms scales linearly with the data size if the rank of the associated kernel matrix is much smaller than n. Simulations demonstrate the performance of LS-SVM and the efficiency of the proposed algorithm in comparison with the naive and some existing implementations of multi-fold and LOO-CV.

dc.publisher: Elsevier Science Inc
dc.title: Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression
dc.type: Journal Article
dcterms.source.volume: 40
dcterms.source.number: 8
dcterms.source.startPage: 2154
dcterms.source.endPage: 2162
dcterms.source.issn: 00313203
dcterms.source.title: Pattern Recognition
curtin.note: The link to the journal’s home page is: http://www.elsevier.com/wps/find/journaldescription.cws_home/328/description#description

curtin.note: Copyright © 2007 Elsevier Ltd. All rights reserved.

curtin.accessStatus: Fulltext not available
curtin.faculty: School of Electrical Engineering and Computing
curtin.faculty: Department of Computing
curtin.faculty: Faculty of Science and Engineering

