SALFORD PREDICTIVE MODELER

Features List


Salford Predictive Modeler® 8 General Features

  • Modeling Engine: CART® decision trees
  • Modeling Engine: TreeNet® gradient boosting
  • Modeling Engine: Random Forests® tree ensemble
  • Modeling Engine: MARS® nonlinear regression splines
  • Modeling Engine: GPS regularized regression (LASSO, Elastic Net, Ridge, etc.; see the sketch after this list)
  • Modeling Engine: RuleLearner, combining TreeNet’s accuracy with the interpretability of regression
  • Modeling Engine: ISLE model compression
  • 70+ pre-packaged automation routines for enhanced model building and experimentation
  • Tools that relieve grunt work, allowing the analyst to focus on the creative aspects of model development
  • Ability to open Minitab Worksheet (.MTW) files
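
The GPS engine itself is proprietary to SPM, but the family of techniques it implements (LASSO, Ridge, Elastic Net) is widely available. The snippet below is a minimal, illustrative sketch using the open-source scikit-learn library and synthetic data; the library, dataset, and parameter choices are assumptions for illustration only, not part of SPM.

    # Illustrative open-source analog of LASSO / Ridge / Elastic Net
    # regularized regression (scikit-learn, not SPM's GPS engine).
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import ElasticNetCV, LassoCV, RidgeCV
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=500, n_features=20, n_informative=5,
                           noise=10.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Each model selects its own regularization strength by cross validation.
    models = {
        "LASSO": LassoCV(cv=5),
        "Ridge": RidgeCV(alphas=np.logspace(-3, 3, 13)),
        "Elastic Net": ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5),
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        print(f"{name}: test R^2 = {model.score(X_test, y_test):.3f}")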

CART® Features

  • Hotspot detection to discover the most important parts of the tree and the corresponding tree rules
  • Variable importance measures to understand the most important variables in the tree
  • Deploy the model and generate predictions in real time or in batch
  • User-defined splits at any point in the tree
  • Differential lift (also called “uplift” or “incremental response”) modeling for assessing the efficacy of a treatment
  • Automation tools for model tuning and other experiments, including:
    • Automatic recursive feature elimination for advanced variable selection
    • Experiment with the prior probabilities to obtain a model that achieves better accuracy for the more important class (an analogous class-weighting experiment is sketched after this list)
    • Perform repeated cross validation
    • Build CART models on bootstrap samples
    • Build two linked models, where the first one predicts a binary event while the second one predicts a numeric value
    • Discover the impact of different learning and testing partitions
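
SPM's CART engine and its automation are proprietary. Purely as a hedged illustration of the kind of experiment listed above (re-weighting classes in the spirit of adjusting priors, plus repeated cross validation), the sketch below uses a generic scikit-learn decision tree on a sample dataset; the library, data, and settings are illustrative assumptions, not SPM functionality.

    # Illustrative sketch: a generic decision tree with class re-weighting and
    # repeated cross validation (scikit-learn, not SPM's CART engine).
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)

    # Emulate "experimenting with priors" by re-weighting the classes.
    for weight in [None, "balanced", {0: 3, 1: 1}]:
        tree = DecisionTreeClassifier(max_depth=4, class_weight=weight,
                                      random_state=0)
        scores = cross_val_score(tree, X, y, cv=cv, scoring="balanced_accuracy")
        print(f"class_weight={weight}: mean balanced accuracy = {scores.mean():.3f}")

    # Variable importance from a single fitted tree.
    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
    print("Largest importances:", sorted(tree.feature_importances_, reverse=True)[:5])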

MARS® Features

  • Graphically understand how variables affect the model response (a simple response-curve analog is sketched after this list)
  • Determine the importance of a variable or set of interacting variables
  • Deploy the model and generate predictions in real time or in batch
  • Automation tools for model tuning and other experiments, including:
    • Automatic recursive feature elimination for advanced variable selection
    • Automatically assess the impact of allowing interactions in the model
    • Easily find the best minimum span value
    • Perform repeated cross validation
    • Discover the impact of different learning and testing partitions
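
MARS's adaptive selection of basis functions and knots is specific to SPM's engine. As a rough, hedged analog of spline-based nonlinear regression, the sketch below fits a fixed-knot spline model with scikit-learn (version 1.0 or later is assumed for SplineTransformer) and traces how one predictor drives the fitted response; everything here is illustrative, not SPM.

    # Rough open-source analog of spline-based nonlinear regression
    # (fixed knots via scikit-learn; MARS itself selects knots adaptively).
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import SplineTransformer

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(400, 2))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.2, size=400)

    model = make_pipeline(SplineTransformer(degree=3, n_knots=8), LinearRegression())
    model.fit(X, y)

    # Trace the response for x0 on a grid while holding x1 at its mean,
    # a simple way to "see" how one variable affects the model response.
    grid = np.linspace(-3, 3, 7)
    other = np.full_like(grid, X[:, 1].mean())
    for x0, yhat in zip(grid, model.predict(np.column_stack([grid, other]))):
        print(f"x0 = {x0:+.1f} -> predicted response {yhat:+.3f}")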

TreeNet® Features

  • Graphically understand how variables affect the model response with partial dependency plots
  • Regression loss functions: least squares, least absolute deviation, quantile, Huber-M, Cox survival, Gamma, Negative Binomial, Poisson, and Tweedie
  • Classification loss functions: binary or multinomial
  • Differential lift (also called “uplift” or “incremental response”) modeling
  • Column subsampling to improve model performance and reduce runtime
  • Regularized Gradient Boosting (RGBOOST) to increase accuracy
  • RuleLearner: build interpretable regression models by combining TreeNet gradient boosting and regularized regression (LASSO, Elastic Net, Ridge, etc.)
  • ISLE: Build smaller, more efficient gradient boosting models using regularized regression (LASSO, Elastic Net, Ridge, etc.)
  • Variable Interaction Discovery Control
    • Determine definitively whether or not interactions of any degree need to be included
    • Control the interactions allowed or disallowed in the model with Minitab’s patented interaction control language
  • Discover the most important interactions in the model
  • Calibration tools for rare-event modeling
  • Automation tools for model tuning and other experiments, including:
    • Automatic recursive feature elimination for advanced variable selection
    • Experiment with different learn rates automatically (an analogous learn-rate experiment is sketched after this list)
    • Control the extent of interactions occurring in the model
    • Build two linked models, where the first one predicts a binary event while the second one predicts a numeric value
    • Find the best parameters in your regularized gradient boosting model
    • Perform a stochastic search for the core gradient boosting parameters
    • Discover the impact of different learning and testing partitions
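
TreeNet itself is proprietary, but stochastic gradient boosting in general is easy to experiment with. The hedged sketch below varies the learn rate and row subsampling of a generic scikit-learn gradient boosting model on synthetic data and then reports variable importances; the library, data, and parameter values are illustrative assumptions rather than TreeNet settings.

    # Illustrative stochastic gradient boosting experiment
    # (scikit-learn, not SPM's TreeNet engine).
    import numpy as np
    from sklearn.datasets import make_friedman1
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    X, y = make_friedman1(n_samples=1000, noise=1.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Try several learn rates with 50% row subsampling (stochastic boosting).
    for lr in [0.01, 0.05, 0.1]:
        gbm = GradientBoostingRegressor(n_estimators=500, learning_rate=lr,
                                        max_depth=3, subsample=0.5,
                                        random_state=0)
        gbm.fit(X_train, y_train)
        print(f"learning_rate={lr}: test R^2 = {gbm.score(X_test, y_test):.3f}")

    # Relative variable importances from the last model fitted above.
    print("Feature importances:", np.round(gbm.feature_importances_, 3))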

Random Forests® Features

  • Use for classification, regression, or clustering
  • Outlier detection
  • Proximity heat map and multi-dimensional scaling for graphically determining clusters in classification problems (binary or multinomial)
  • Parallel Coordinates Plot for a better understanding of what levels of predictor values lead to a particular class assignment
  • Unsupervised learning: Random Forests builds the proximity matrix, and hierarchical clustering techniques are then applied
  • Variable importance measures to understand the most important variables in the model
  • Deploy the model and generate predictions in real time or in batch
  • Automation tools for model tuning and other experiments, including:
    • Automatic recursive feature elimination for advanced variable selection
    • Easily fine-tune the random subset size taken at each split in each tree (see the sketch after this list)
    • Assess the impact of different bootstrap sample sizes
    • Discover the impact of different learning and testing partitions
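
As a hedged, open-source illustration of tuning the per-split feature subset (and of out-of-bag assessment and variable importance), the sketch below uses scikit-learn's generic random forest on a sample dataset; it is not SPM's Random Forests engine, and all names and settings are assumptions for illustration.

    # Illustrative tuning of the per-split feature subset size in a generic
    # random forest (scikit-learn, not SPM's Random Forests engine).
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier

    X, y = load_breast_cancer(return_X_y=True)

    # Out-of-bag accuracy as a quick yardstick while varying the subset size.
    for max_features in [2, 4, "sqrt", None]:
        rf = RandomForestClassifier(n_estimators=300, max_features=max_features,
                                    oob_score=True, random_state=0, n_jobs=-1)
        rf.fit(X, y)
        print(f"max_features={max_features}: OOB accuracy = {rf.oob_score_:.3f}")

    # Variable importances from the last forest fitted above.
    print("Largest importances:", np.round(np.sort(rf.feature_importances_)[::-1][:5], 3))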

CART®

SPM’s CART® modeling engine is the ultimate classification tree that has revolutionized the field of advanced analytics and inaugurated the current era of data science.


MARS®

The MARS® modeling engine is ideal for users who prefer results in a form similar to traditional regression while capturing essential nonlinearities and interactions.


TreeNet®

TreeNet® Gradient Boosting is SPM’s most flexible and powerful data mining tool, capable of consistently generating extremely accurate models.


Random Forests®

Random Forests® is a modeling engine that leverages the power of multiple alternative analyses, randomization strategies, and ensemble learning.

Ready to Build Predictive Models with SPM?