CART-ELC: Oblique Decision Tree Induction via Exhaustive Search (2505.05402v1)

Published 8 May 2025 in cs.LG, cs.AI, and cs.DS

Abstract: Oblique decision trees have attracted attention due to their potential for improved classification performance over traditional axis-aligned decision trees. However, methods that rely on exhaustive search to find oblique splits face computational challenges. As a result, they have not been widely explored. We introduce a novel algorithm, Classification and Regression Tree - Exhaustive Linear Combinations (CART-ELC), for inducing oblique decision trees that performs an exhaustive search on a restricted set of hyperplanes. We then investigate the algorithm's computational complexity and its predictive capabilities. Our results demonstrate that CART-ELC consistently achieves competitive performance on small datasets, often yielding statistically significant improvements in classification accuracy relative to existing decision tree induction algorithms, while frequently producing shallower, simpler, and thus more interpretable trees.

Summary

Decision tree algorithms are a cornerstone of interpretable machine learning thanks to their transparent and simple model representation. Traditional methods, however, are constrained to axis-aligned splits and often fail to exploit relationships between features that would improve classification. This paper introduces CART-ELC, a novel algorithm that induces oblique decision trees through an exhaustive search over a restricted set of hyperplanes, addressing shortcomings of both axis-aligned decision trees and existing oblique tree methods.

The CART-ELC Algorithm

CART-ELC advances on axis-aligned induction by splitting on linear combinations of features, which tends to yield shallower and more interpretable trees. The algorithm performs an exhaustive search over hyperplanes governed by a single hyperparameter r: it considers the unique hyperplanes that pass through at least r samples and have at most r non-zero coefficients. Two cases arise. When r = m (the dimensionality of the feature space), the search enumerates the hyperplanes determined by combinations of m samples. When r < m, the search is further restricted to r-feature subsets, which mitigates the computational cost.
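To make the search concrete, below is a minimal Python sketch of the r < m case. It is an illustration under stated assumptions, not the paper's implementation: the Gini impurity criterion, the w · x = 1 normalization of the hyperplane (which cannot represent hyperplanes through the origin), and the helper names (gini_of_split, best_oblique_split) are all choices made for this example.

```python
# Illustrative sketch of CART-ELC-style exhaustive split search (r < m case).
# Assumes integer class labels in y and Gini impurity as the split criterion.
from itertools import combinations
import numpy as np

def gini_of_split(y, mask):
    """Weighted Gini impurity of the binary partition of labels y by mask."""
    def gini(labels):
        if labels.size == 0:
            return 0.0
        p = np.bincount(labels) / labels.size
        return 1.0 - np.sum(p ** 2)
    left, right = y[mask], y[~mask]
    n = y.size
    return (left.size / n) * gini(left) + (right.size / n) * gini(right)

def best_oblique_split(X, y, r):
    """Exhaustively search hyperplanes through r samples in r-feature subspaces."""
    n, m = X.shape
    best = (np.inf, None, None)  # (impurity, feature subset, weight vector)
    for feats in combinations(range(m), r):
        sub = X[:, list(feats)]
        for idx in combinations(range(n), r):
            pts = sub[list(idx)]
            # Solve pts @ w = 1 so the hyperplane w . x = 1 passes through
            # the r chosen samples; skip degenerate (singular) systems.
            try:
                w = np.linalg.solve(pts, np.ones(r))
            except np.linalg.LinAlgError:
                continue
            mask = sub @ w <= 1.0  # samples on or below the hyperplane go left
            score = gini_of_split(y, mask)
            if score < best[0]:
                best = (score, feats, w)
    return best
```

The nested loops over feature subsets and sample subsets mirror the combinatorial terms in the complexity bound discussed next.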

Computational Complexity and Empirical Validation

The paper analyzes the computational complexity of CART-ELC, acknowledging its non-trivial overhead on large datasets. The exhaustive search, dominated by combinatorial hyperplane enumeration and the associated matrix operations, runs in $\Theta(\binom{n}{r} \cdot \binom{m}{r} \cdot r(r^2 + n))$ time. CART-ELC is evaluated on several classification datasets, including the star/galaxy and iris datasets, where it achieves competitive accuracy and smaller trees compared to established algorithms such as HHCART, OC1, and CART.
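To get a feel for the bound, here is a quick back-of-the-envelope count at iris scale, assuming the bound is read as elementary operations for a single split search:

```python
# Rough operation count from the stated complexity bound.
# Iris-scale data: n = 150 samples, m = 4 features, with r = 2.
from math import comb

n, m, r = 150, 4, 2
ops = comb(n, r) * comb(m, r) * r * (r ** 2 + n)
print(f"{ops:,}")  # 20,651,400 -> ~20.7 million operations per split search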

Empirical Results and Insights

The empirical results show strong classification accuracy on small datasets, where CART-ELC often outperforms its peers. The algorithm consistently produced interpretable trees with fewer nodes, underscoring its utility when model complexity and predictive power must be balanced. For several datasets, the accuracy improvements are statistically significant, as assessed with Welch's t-tests and quantified with Cohen's d effect sizes.
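As an illustration of that testing procedure, the sketch below computes Welch's t-test and Cohen's d for two sets of per-fold accuracies. The accuracy values and the per-fold setup are invented placeholders, not numbers from the paper.

```python
# Welch's t-test plus Cohen's d over per-fold accuracies (placeholder data).
import numpy as np
from scipy import stats

acc_elc = np.array([0.96, 0.95, 0.97, 0.94, 0.96])   # hypothetical CART-ELC folds
acc_cart = np.array([0.92, 0.93, 0.91, 0.94, 0.92])  # hypothetical CART folds

# Welch's t-test: does not assume equal variances between the two samples.
t, p = stats.ttest_ind(acc_elc, acc_cart, equal_var=False)

# Cohen's d using the pooled standard deviation.
pooled_sd = np.sqrt((acc_elc.var(ddof=1) + acc_cart.var(ddof=1)) / 2)
d = (acc_elc.mean() - acc_cart.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```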

Limitations and Future Directions

While promising, the paper acknowledges that CART-ELC's computational demands are prohibitive on large datasets and suggests directions for reducing them. Future work might focus on smarter hyperplane subset selection or on integration with ensemble techniques such as random forests and boosting, aiming to cut computational overhead while preserving predictive performance. A broader evaluation beyond the datasets analyzed here could also clarify the algorithm's versatility across domains.

Conclusion

CART-ELC offers a new approach to oblique decision tree induction that addresses key limitations of axis-aligned methods and extends existing oblique algorithms. Its exhaustive search strategy delivers strong predictive performance together with interpretable model structure. Further refinement, particularly around scalability, could extend its reach to larger datasets and broader applications.
