Combining classifiers in a tree structure
Date: 2008
Abstract
This paper presents a method for combining classifiers in a tree structure, where each node of the tree contains a single hypothesis trained in its respective region of the feature space. All base classifiers are then combined using a weighted average. Majority vote and Newton-Raphson numerical optimization are used for fitting the coefficients in the additive model. Two loss functions (quadratic and boosting-like exponential) as well as new splitting criteria for inducing the tree are examined within the proposed framework. The idea of combining classifiers in a tree structure is then compared with other iteratively built classifiers: AdaBoost.MH and MART (multiple additive regression trees). The experiments were conducted using well-known databases from the UCI Repository and the ELENA project. © 2008 IEEE.
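The abstract only sketches the construction, so the following is a minimal, illustrative Python sketch of the general idea: recursively partition the feature space, train a base hypothesis in each node's region, and predict with a weighted average over the hypotheses whose regions contain the sample. The class name ClassifierTree, the variance/median splitting rule, and the depth-based weights are assumptions made for illustration only; the paper's own splitting criteria, loss functions, and the majority-vote or Newton-Raphson fitting of the combination coefficients are not reproduced here.

```python
# Minimal sketch (not the authors' implementation): a tree of classifiers in which
# each node trains a base hypothesis on its region of the feature space and the
# final prediction is a weighted average over the nodes whose regions contain the
# sample. The median/variance split rule and the 1/(depth+1) weights are
# illustrative assumptions; the paper fits the combination coefficients with a
# majority vote or Newton-Raphson optimization instead.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier


class ClassifierTree:
    def __init__(self, base_estimator=None, max_depth=2, min_samples=20):
        self.base_estimator = base_estimator or DecisionTreeClassifier(max_depth=3)
        self.max_depth = max_depth
        self.min_samples = min_samples

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.classes_ = np.unique(y)
        self.nodes_ = []                      # (path constraints, model, weight)
        self._grow(X, y, constraints=[], depth=0)
        return self

    def _region_mask(self, X, constraints):
        # A region is the conjunction of the axis-parallel splits along the path.
        mask = np.ones(len(X), dtype=bool)
        for feature, value, go_left in constraints:
            mask &= (X[:, feature] <= value) if go_left else (X[:, feature] > value)
        return mask

    def _grow(self, X, y, constraints, depth):
        mask = self._region_mask(X, constraints)
        if mask.sum() < self.min_samples or len(np.unique(y[mask])) < 2:
            return
        # Train this node's hypothesis only on the samples falling in its region.
        model = clone(self.base_estimator).fit(X[mask], y[mask])
        self.nodes_.append((list(constraints), model, 1.0 / (depth + 1)))
        if depth >= self.max_depth:
            return
        # Illustrative split: median of the highest-variance feature in the region.
        feature = int(np.argmax(X[mask].var(axis=0)))
        value = float(np.median(X[mask][:, feature]))
        self._grow(X, y, constraints + [(feature, value, True)], depth + 1)
        self._grow(X, y, constraints + [(feature, value, False)], depth + 1)

    def predict_proba(self, X):
        X = np.asarray(X, dtype=float)
        scores = np.zeros((len(X), len(self.classes_)))
        weights = np.zeros(len(X))
        for constraints, model, w in self.nodes_:
            mask = self._region_mask(X, constraints)
            if not mask.any():
                continue
            proba = model.predict_proba(X[mask])
            for j, c in enumerate(model.classes_):   # align node classes globally
                k = int(np.where(self.classes_ == c)[0][0])
                scores[mask, k] += w * proba[:, j]
            weights[mask] += w
        return scores / np.clip(weights, 1e-12, None)[:, None]

    def predict(self, X):
        return self.classes_[np.argmax(self.predict_proba(X), axis=1)]


if __name__ == "__main__":
    # Toy usage on a standard scikit-learn dataset.
    from sklearn.datasets import load_iris
    X, y = load_iris(return_X_y=True)
    print(ClassifierTree(max_depth=2).fit(X, y).predict(X[:5]))
```

In this sketch the root classifier always contributes to every prediction, so the weighted average is always well defined; deeper, more local hypotheses contribute smaller (assumed) weights, whereas in the paper those coefficients are fitted from the data.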
Related items
Showing items related by title, author, creator and subject.
- Rabiei, Minou (2011). In hydrocarbon production, more often than not, oil is produced commingled with water. As long as the water production rate is below the economic level of water/oil ratio (WOR), no water shutoff treatment is needed. ...
- Rabiei, M.; Gupta, Ritu (2013). In hydrocarbon production, a certain amount of water production is inevitable and sometimes even necessary. Problems arise when the water rate exceeds the WOR (water/oil ratio) economic level, producing no or little oil with ...
- Chow, Chi Ngok (2010). The largest wool exporter in the world is Australia, where wool, a major export, is worth over AUD $2 billion per year and constitutes about 17 per cent of all agricultural exports. Most Australian wool is sold by ...