Custom Visual Guide

Decision Tree Chart

Sam McKay

CEO & Founder

Schematic tree-shaped diagram for determining statistical probability using recursive partitioning

Decision trees are among the most common and easily understood decision support tools.

Decision tree learning automatically finds the important decision criteria to consider and presents them in an intuitive, explicit visual form.

This visual implements recursive partitioning, a popular and widely used technique for decision tree construction. Each leaf of the tree is labeled with a class and a probability distribution over the classes. In addition, cross-validation is used to estimate the statistical performance of the decision tree.

If the target variable is categorical or has only a few possible values, a “Classification Tree” is constructed, whereas if the target variable is numeric the visual produces a “Regression Tree”.

You can control the algorithm parameters and the visual attributes to suit your needs.

Here is how it works:

  • Define the “Target Variable” (exactly one column) and the “Input Variables” (two or more columns)
  • Control the basic properties of the tree, such as the maximum depth and the minimum number of observations per leaf
  • Advanced users can also control the recursive-partitioning and cross-validation parameters
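The steps above map directly onto R's rpart interface, which the visual's dependencies suggest it uses internally. Here is a minimal sketch of an equivalent tree built by hand; the data set (iris) and the parameter values are illustrative assumptions, not the visual's actual defaults:

```r
# Grow a decision tree by recursive partitioning with rpart,
# then plot it with rpart.plot (both are listed dependencies).
library(rpart)
library(rpart.plot)

fit <- rpart(
  Species ~ .,               # exactly one target; all other columns as inputs
  data    = iris,
  method  = "class",         # categorical target -> classification tree
                             # ("anova" is used for a numeric target)
  control = rpart.control(
    maxdepth  = 4,           # maximum depth of the tree
    minbucket = 7,           # minimum number of observations per leaf
    xval      = 10           # 10-fold cross-validation for error estimates
  )
)

printcp(fit)                 # cross-validated error by tree complexity
rpart.plot(fit)              # leaves show the class and class probabilities
```

Changing `maxdepth`, `minbucket`, and `xval` here corresponds to adjusting the basic and advanced parameters exposed by the visual.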

R package dependencies (auto-installed): rpart, rpart.plot, RColorBrewer

This is an open-source visual. The code is available on GitHub.
