Attribute Importance

Attribute Importance (AI) provides an automated solution for improving the speed and possibly the accuracy of classification models built on data tables with a large number of attributes.

AI assigns each attribute an importance value; if an attribute's importance value is negative, that attribute is not correlated with the target.

The time required to build ODM classification models increases with the number of attributes. Attribute Importance identifies a proper subset of the attributes that are most relevant to predicting the target. Model building can proceed using the selected attributes only.
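
As a rough illustration of this workflow (in Python with scikit-learn rather than ODM's own SQL or Java interfaces), the following sketch ranks attributes by a mutual-information score, keeps only the attributes that carry some signal, and then builds a classification model on the reduced set. The file name, column names, and choice of scorer are assumptions made for the example, not part of ODM.

    import pandas as pd
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical training table with a binary (0/1) target column.
    data = pd.read_csv("customers.csv").select_dtypes("number")
    target = data.pop("AFFINITY_CARD")

    # Rank each attribute by mutual information with the target
    # (an analogue of an importance value, not ODM's measure).
    scores = pd.Series(mutual_info_classif(data, target), index=data.columns)

    # Keep only attributes that carry some signal; attributes scoring zero
    # contribute nothing to predicting the target and are dropped.
    selected = scores[scores > 0].sort_values(ascending=False).index

    # Build the classification model on the reduced attribute set only.
    model = DecisionTreeClassifier().fit(data[selected], target)

In ODM itself, the same idea is expressed by building an Attribute Importance model and then restricting the build data for the classification model to the highest-ranked attributes.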

Using fewer attributes does not necessarily result in lost predictive accuracy. In fact, using too many attributes, especially noisy ones, can degrade a model's performance and accuracy. Mining with the smallest set of relevant attributes can save significant computing time and may produce better models.

The Decision Tree and Adaptive Bayes Network algorithms perform internal feature reduction, so for models built with these algorithms it is not necessary to create an AI model first. Even for these algorithms, however, reducing the number of attributes beforehand may result in better performance.

ODM Attribute Importance models use the Predictor Variance Algorithm.
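
The Predictor Variance algorithm itself is not described in this section. Purely to illustrate the general idea of scoring a predictor by how much the target rate varies across its values, here is a small self-contained sketch; the data and the scoring function are invented for the example, and this is not Oracle's implementation.

    import pandas as pd

    def predictor_variance_score(attribute: pd.Series, target: pd.Series) -> float:
        """Frequency-weighted variance of the per-value target rate (toy measure)."""
        overall_rate = target.mean()
        grouped = target.groupby(attribute)
        # For each attribute value, how far does its target rate deviate
        # from the overall target rate?
        deviations = (grouped.mean() - overall_rate) ** 2
        weights = grouped.size() / len(target)
        return float((weights * deviations).sum())

    # Hypothetical data: a binary target and two candidate predictors.
    df = pd.DataFrame({
        "region":    ["N", "N", "S", "S", "E", "E", "W", "W"],
        "weekday":   ["M", "T", "M", "T", "M", "T", "M", "T"],
        "responded": [1,    1,   0,   0,   1,   0,   1,   0],
    })
    scores = {col: predictor_variance_score(df[col], df["responded"])
              for col in ["region", "weekday"]}
    print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))

In this toy example "region" ranks above "weekday" because the response rate varies more strongly across regions than across weekdays.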