AI Simulator Platform

Decision Tree Visual Lab

Interactive axis-aligned splitting simulator for binary classification.

A decision tree recursively splits data into smaller regions. In this simulator, each split is axis-aligned and uses either x or y with a threshold.

Each internal node tests a rule such as x <= 22. Leaf nodes output class probabilities and a final class label.
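As a minimal sketch (the names `Node` and `predict` are illustrative, not the simulator's actual code), a tree of such rules can be represented and evaluated like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    # Internal node: split feature ("x" or "y") and threshold, e.g. x <= 22
    feature: Optional[str] = None
    threshold: Optional[float] = None
    left: Optional["Node"] = None   # samples where feature <= threshold
    right: Optional["Node"] = None  # samples where feature > threshold
    # Leaf node: class probabilities and the majority class label
    proba: Optional[dict] = None
    label: Optional[str] = None

def predict(node: Node, x: float, y: float) -> str:
    """Walk from the root to a leaf by answering each split rule."""
    while node.label is None:
        value = x if node.feature == "x" else y
        node = node.left if value <= node.threshold else node.right
    return node.label
```

For example, a depth-1 tree with the single rule x <= 22 sends points with small x to the left leaf and everything else to the right leaf.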

For class proportions p_k, Gini impurity is:

$$ G = 1 - \sum_k p_k^2 $$
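A direct translation of this formula (a sketch; the function name is illustrative):

```python
def gini(counts):
    """Gini impurity G = 1 - sum_k p_k^2, given per-class sample counts."""
    n = sum(counts)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in counts)
```

A 50/50 mix gives `gini([5, 5])` = 0.5 (maximal for two classes), while a pure node gives `gini([10, 0])` = 0.0.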

The model evaluates candidate splits and picks the one that minimizes the weighted average impurity of the resulting child nodes.
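An exhaustive search over axis-aligned thresholds might look like the following sketch (assuming points are `(x, y, label)` tuples; this is one common way to score splits, not necessarily the simulator's exact code):

```python
def best_split(points):
    """Return (score, axis, threshold) for the split that minimizes
    the weighted Gini impurity of the two child nodes."""
    def gini(labels):
        n = len(labels)
        if n == 0:
            return 0.0
        return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

    best = None
    n = len(points)
    for axis in (0, 1):                       # 0 = x, 1 = y
        for t in sorted({p[axis] for p in points}):
            left = [p[2] for p in points if p[axis] <= t]
            right = [p[2] for p in points if p[axis] > t]
            if not left or not right:         # skip degenerate splits
                continue
            # Weighted average impurity of the children
            score = len(left) / n * gini(left) + len(right) / n * gini(right)
            if best is None or score < best[0]:
                best = (score, "xy"[axis], t)
    return best
```

On four points, two of class A at x <= 2 and two of class B at x >= 10, the search finds the perfect split x <= 2 with a weighted impurity of 0.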

  • Maximum depth limits tree complexity.
  • Minimum samples avoids unstable micro-splits.
  • Pure leaves (all one class) stop naturally.
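The three stopping rules above combine into a single predicate; here is a sketch, where `max_depth` and `min_samples` stand in for the simulator's settings:

```python
def should_stop(labels, depth, max_depth=3, min_samples=2):
    """Stop growing a branch when the node is pure, too deep, or too small."""
    pure = len(set(labels)) <= 1            # all samples share one class
    too_deep = depth >= max_depth           # depth limit reached
    too_small = len(labels) < min_samples   # avoid unstable micro-splits
    return pure or too_deep or too_small
```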

Shallower trees usually generalize better, while deeper trees can overfit local noise.

  1. Add points for class A and class B, or load a preset demo pattern.
  2. Train with different max depth / min samples settings.
  3. Observe region boundaries, text tree rules, and split logs.
  4. Compare simple vs complex trees for interpretability and fit quality.

Click the canvas to add samples on grid cells, then train to generate split rules and decision regions.
