Title
Boosted Random Forest
Abstract
Machine learning is used in many fields, and demand for practical implementations is increasing. Among machine learning methods, Random Forest is a multi-class classifier that achieves high classification performance through bagging and random feature selection, and it supports fast training and classification. However, as an ensemble method, Random Forest classifies by a majority vote over many decision trees, so a large number of trees must be built. Classification performance improves as the number of trees grows, which increases memory consumption, and degrades when the number of trees is reduced. This makes the algorithm poorly suited to small-scale hardware such as embedded systems. We therefore propose Boosted Random Forest, which introduces a boosting algorithm into the Random Forest training procedure to produce smaller ensembles of high-performance decision trees. In evaluations on databases from the UCI Machine Learning Repository, Boosted Random Forest achieved classification performance equal to or better than that of an ordinary Random Forest while reducing memory use by 47%, making it suitable for implementing Random Forests on embedded hardware with limited memory.
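As a rough illustration of the idea summarized in the abstract (training each tree on boosting-derived sample weights while keeping Random Forest's random feature selection, then weighting the trees' votes), the following Python sketch combines scikit-learn decision trees with SAMME-style multi-class boosting. It is a minimal sketch of the general concept, not the authors' algorithm; the function names boosted_forest and predict, the SAMME weight update, and all parameter choices are assumptions made for illustration.

```python
# Illustrative sketch only: boosting-weighted trees that still use a random
# feature subset at every split (max_features="sqrt"), as in Random Forest.
# This is NOT the paper's exact algorithm; names and parameters are hypothetical.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier


def boosted_forest(X, y, n_trees=10, max_depth=5, seed=0):
    """Train a small ensemble of sample-weighted, feature-randomized trees
    (SAMME-style multi-class boosting); returns (trees, tree_weights)."""
    rng = np.random.RandomState(seed)
    n, n_classes = len(y), len(np.unique(y))
    w = np.full(n, 1.0 / n)               # boosting weights over training samples
    trees, alphas = [], []
    for _ in range(n_trees):
        tree = DecisionTreeClassifier(
            max_depth=max_depth,
            max_features="sqrt",           # random feature selection per split
            random_state=rng,
        )
        tree.fit(X, y, sample_weight=w)
        pred = tree.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 1.0 - 1.0 / n_classes:   # no better than chance: stop early
            break
        err = max(err, 1e-10)
        alpha = np.log((1.0 - err) / err) + np.log(n_classes - 1.0)  # SAMME weight
        w *= np.exp(alpha * (pred != y))   # emphasize misclassified samples
        w /= w.sum()
        trees.append(tree)
        alphas.append(alpha)
    return trees, np.array(alphas)


def predict(trees, alphas, X, n_classes):
    """Combine the trees by a weighted vote."""
    votes = np.zeros((len(X), n_classes))
    for tree, a in zip(trees, alphas):
        votes[np.arange(len(X)), tree.predict(X)] += a
    return votes.argmax(axis=1)


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    trees, alphas = boosted_forest(X, y, n_trees=10)
    acc = np.mean(predict(trees, alphas, X, n_classes=3) == y)
    print(f"{len(trees)} trees, training accuracy {acc:.3f}")
```

In this sketch the sample weighting and the per-tree vote weights are what would let a smaller ensemble retain accuracy; the random feature subsets preserve the diversity that Random Forest relies on.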
Year
2015
DOI
10.1587/transinf.2014OPP0004
Venue
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS
Keywords
Boosting, Random Forest, machine learning, pattern recognition
Field
Data mining, Decision tree, Feature selection, Computer science, Artificial intelligence, Random forest, Ensemble learning, Alternating decision tree, Decision stump, Pattern recognition, Boosting (machine learning), Machine learning, Gradient boosting
DocType
Journal
Volume
E98D
Issue
9
ISSN
1745-1361
Citations
6
PageRank
0.57
References
8
Authors
5
Name                 Order  Citations  PageRank
Yohei Mishina        1      6          0.90
Ryuei Murata         2      6          0.57
Yuji Yamauchi        3      43         10.45
Takayoshi Yamashita  4      377        46.83
Fujiyoshi            5      730        101.43