Title: RuleCOSI: Rule extraction for interpreting classification tree ensembles
Period: 2018 – Present
Machine learning is now widely used in practical applications that require predictive analytics. New methods are constantly presented in the field, incrementally improving on the performance of older models. However, the improvement in predictive performance usually comes with an increase in model complexity, making the decision mechanisms of the models difficult for humans to understand. The purpose of this research project was therefore to increase the interpretability of tree ensembles for classification, as shown in Figure 1.
Interpretability is the degree to which a human can understand the cause of a decision (Miller, 2017).
For this purpose, RuleCOSI (Rule COmbination and SImplification), a novel heuristic method that extracts, combines, and simplifies decision rules from ensembles, was presented. The initial algorithm was published in an academic paper in 2019 (Obregon et al., 2019). My research has evolved since then, and it became the main topic of my doctoral dissertation in 2020. I recently published the extended version, RuleCOSI+, in the top-ranked journal Information Fusion (Obregon & Jung, 2022). In this short post I introduce the main characteristics of the most recent version of the algorithm and show a small example result.
The algorithm has three basic steps, as depicted in Figure 2.
The first step is to extract a ruleset from each of the trees forming the ensemble. This is done with a simple procedure in which one rule is created from each path from the root node of the tree to a leaf node.
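To make this first step concrete, here is a minimal sketch of root-to-leaf rule extraction using scikit-learn's tree internals. This is my own illustration, not RuleCOSI's actual extraction code, and the helper name `tree_to_rules` is made up:

```python
# Sketch of step 1: turn each tree of an ensemble into a ruleset by
# walking every root-to-leaf path (illustrative; not RuleCOSI's code).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

def tree_to_rules(estimator, feature_names):
    """Return one (conditions, predicted_class) rule per leaf."""
    t = estimator.tree_
    rules = []

    def walk(node, conditions):
        if t.children_left[node] == -1:  # leaf: emit the accumulated rule
            rules.append((tuple(conditions), int(t.value[node].argmax())))
            return
        name = feature_names[t.feature[node]]
        thr = t.threshold[node]
        walk(t.children_left[node], conditions + [f"{name} <= {thr:.3f}"])
        walk(t.children_right[node], conditions + [f"{name} > {thr:.3f}"])

    walk(0, [])
    return rules

data = load_breast_cancer()
forest = RandomForestClassifier(
    n_estimators=5, max_depth=3, random_state=0
).fit(data.data, data.target)

# One ruleset per tree in the ensemble
rulesets = [tree_to_rules(est, data.feature_names) for est in forest.estimators_]
print(len(rulesets))  # → 5
```

Each extracted rule is simply the conjunction of the split conditions along one path, labeled with the majority class of the leaf.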
The second step is to combine the rules so that the combined ruleset covers the feature space of all the original rules. The final step is to generalize and simplify the combined rules based on their pessimistic error.
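To give an idea of what a pessimistic error estimate looks like, here is a sketch of the C4.5-style upper confidence bound on a rule's error rate; pruning a condition is accepted only if this bound does not grow. The exact estimator used inside RuleCOSI+ may differ from this one:

```python
import math

def pessimistic_error(errors, n, z=0.674):
    """C4.5-style upper confidence bound on the true error rate of a
    rule that misclassifies `errors` of the `n` examples it covers.
    z=0.674 corresponds to C4.5's default 25% confidence level."""
    f = errors / n  # observed error rate
    num = f + z**2 / (2 * n) + z * math.sqrt(
        f / n - f**2 / n + z**2 / (4 * n**2)
    )
    return num / (1 + z**2 / n)

# Same observed error rate (10%), but the rule covering more examples
# gets a tighter (smaller) pessimistic bound:
print(pessimistic_error(2, 20))
print(pessimistic_error(10, 100))
```

The bound always exceeds the observed error rate, and it shrinks as a rule covers more examples, which is what makes it useful for deciding whether a generalized (shorter) rule is acceptable.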
The output of the algorithm is a single set of decision rules that is much simpler than, and performs similarly to, the original tree ensemble.
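The resulting ruleset classifies like an ordered rule list: rules are checked top to bottom, the first rule whose conditions all hold fires, and a final default rule catches everything else. A minimal sketch of that classification logic, with made-up rules and a made-up `predict` helper for illustration:

```python
# Sketch of how an ordered ruleset classifies an example: first matching
# rule fires; the empty-condition rule at the end is the default.
import operator

OPS = {"<=": operator.le, ">": operator.gt}

# Each rule: (list of (feature_index, op, threshold), predicted_class).
# These rules are invented for illustration only.
ruleset = [
    ([(0, ">", 0.5), (2, "<=", 1.3)], 1),
    ([(1, "<=", 0.2)], 1),
    ([], 0),  # default rule: empty condition list always matches
]

def predict(x, rules):
    for conditions, label in rules:
        if all(OPS[op](x[i], thr) for i, op, thr in conditions):
            return label
    raise ValueError("ruleset has no default rule")

print(predict([0.9, 0.8, 1.0], ruleset))  # first rule fires → 1
print(predict([0.1, 0.8, 2.0], ruleset))  # only the default fires → 0
```

Because at most one rule fires per example, the prediction can be explained by showing the human reader a single short conjunction of conditions.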
RuleCOSI+ is able to handle two types of tree ensembles: boosting and bagging. The Python library works with several implementations of these ensemble types, such as Random Forests, XGBoost, CatBoost, and LightGBM.
Here I present an example using the UCI steel plates faults dataset. The dataset contains 27 indicators that approximately describe the geometric shape of each defect and its outline. The task is to classify the type of surface defect. Because RuleCOSI+ can only work with binary classification problems, for this example I considered only the dirtiness fault type.
The first step is to train a tree ensemble. In this case I trained an XGBoost ensemble with 50 trees. The F-measure of this model is 0.9958. The first 10 trees are shown in Figure 4.
After applying RuleCOSI+ to this ensemble, the simplified ruleset has just 7 rules, with an F-measure of 0.9926. The generated rules are presented in Figure 5.
Tree ensembles are widely used for improving classification performance in many domains, including fault detection in manufacturing. However, the complexity of these ensembles makes them very hard for humans to interpret. The results of RuleCOSI+ were satisfactory in improving the interpretability of tree ensembles without decreasing their classification performance.
Obregon, J., Kim, A., & Jung, J. Y. (2019). RuleCOSI: Combination and simplification of production rules from boosted decision trees for imbalanced classification. Expert Systems with Applications, 126, 64–82.
Obregon, J., & Jung, J. Y. (2022). RuleCOSI+: Rule extraction for interpreting classification tree ensembles. Information Fusion, in press.