
<img src="logo.png" alt="ARFS logo" width="200"/>


All relevant feature selection

All relevant feature selection means trying to find all features carrying information usable for prediction, rather than finding a possibly compact subset of features on which some particular model has a minimal error. This might include redundant predictors. All relevant feature selection is model agnostic in the sense that it doesn't optimize a scoring function for a specific model but rather tries to select all the predictors which are related to the response.

This package implements three methods: Leshy, an evolution of Boruta; BoostAGroota, an evolution of BoostARoota; and GrootCV, a new method. All three are sklearn compatible; see below for the details of each method. You can use any sklearn-compatible estimator with Leshy and BoostAGroota, but I recommend lightGBM: it is fast, accurate and has built-in SHAP values. The package also provides a module for preprocessing and basic feature selection (auto-binning, removing columns with too many missing values, zero variance, high cardinality, high correlation, etc.). Examples and detailed descriptions of the methods follow below.
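
As a quick illustration, here is a minimal usage sketch; it assumes `Leshy` is importable from `arfs.feature_selection` and accepts an `importance` keyword (verify against the API reference):

```python
# Minimal usage sketch (assumed API: Leshy importable from
# arfs.feature_selection and accepting an `importance` keyword;
# verify against the API reference).
import pandas as pd
import lightgbm as lgb
from sklearn.datasets import make_regression
from arfs.feature_selection import Leshy

X, y = make_regression(n_samples=500, n_features=20, n_informative=5, random_state=0)
X = pd.DataFrame(X, columns=[f"f{i}" for i in range(X.shape[1])])

# Any sklearn-compatible estimator works; lightGBM is recommended
model = lgb.LGBMRegressor(n_estimators=100, verbose=-1)
selector = Leshy(model, importance="shap")
selector.fit(X, y)

# sklearn-compatible selector: keep only the confirmed features
X_selected = selector.transform(X)
```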

Moreover, as an alternative to the all-relevant problem, the ARFS package provides an MRmr feature selection which, theoretically, returns a subset of the predictors selected by an ARFS method. ARFS also provides a Lasso feature selection which works especially well for (G)LMs and GAMs. You can combine the Lasso with the TreeDiscretizer to introduce non-linearities into linear models while performing feature selection.

Please note that one limitation of the Lasso is that it treats the levels of a categorical predictor individually. This issue can be addressed with the TreeDiscretizer, which automatically bins numerical variables and groups the levels of categorical variables.
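
To make the combination concrete, here is a sketch; it assumes `TreeDiscretizer` is importable from `arfs.preprocessing` with `bin_features` and `n_bins` keywords, and it uses scikit-learn's `LassoCV` purely for illustration rather than the ARFS Lasso selector:

```python
# Sketch: binning + lasso to get non-linear effects into a linear model.
# Assumptions: TreeDiscretizer importable from arfs.preprocessing with
# `bin_features`/`n_bins` keywords (verify against the API reference);
# scikit-learn's LassoCV stands in for the ARFS Lasso selector.
import pandas as pd
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LassoCV
from arfs.preprocessing import TreeDiscretizer

X, y = make_regression(n_samples=500, n_features=10, n_informative=4, random_state=0)
X = pd.DataFrame(X, columns=[f"f{i}" for i in range(X.shape[1])])

# Shallow trees bin each numeric column so a linear model can pick up
# non-linear effects through the binned levels
disc = TreeDiscretizer(bin_features="all", n_bins=8)
X_binned = disc.fit_transform(X, y)

# One-hot encode the binned levels, then let the L1 penalty zero out
# the uninformative ones
lasso = SelectFromModel(LassoCV(cv=5))
lasso.fit(pd.get_dummies(X_binned), y)
```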

Installation

$ pip install arfs

Note: if you want to use the fastshap option, you need to install fasttreeshap first. For a smooth installation, I suggest conda install -c conda-forge fasttreeshap, since the C++ source code requires compilation. Installing with pip may involve additional build dependencies, such as Visual Studio for compiling the C++ code.

Example

Working examples, including one for imbalanced classification, are available in the documentation.

Boruta

The Boruta algorithm tries to capture all the important features in your dataset with respect to an outcome variable. The procedure is the following:

- Duplicate every feature and shuffle each copy, creating uninformative "shadow" features.
- Train the model on the extended dataset and compute the feature importances (Z-scores).
- Compare each real feature to the best-performing shadow feature and record a "hit" when the real feature wins.
- Repeat over several iterations, then confirm or reject each feature with a binomial test on its hit count.

At every iteration, the algorithm compares the Z-scores of the shuffled copies and those of the original features to see whether the latter perform better than the former. If they do, the algorithm marks the feature as important. In essence, the algorithm validates the importance of a feature by comparing it with randomly shuffled copies, which increases robustness. The decision simply counts, against a binomial distribution, the number of times a feature did better than the shadow features. Since the whole process is run on the same train-test split, the variance of the variable importance comes only from the different re-fits of the model over the iterations.
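
To make the confirmation step concrete, here is a self-contained sketch of the binomial hit test (an illustration of the idea, not the package's implementation; the function name is hypothetical):

```python
# Sketch of Boruta's confirmation step (illustration only, not the
# package's code). A feature scores a "hit" each iteration its importance
# beats the best shadow (shuffled) feature; the hit count is then tested
# against a Binomial(n_iter, 0.5) null.
from scipy.stats import binom

def boruta_decision(hits: int, n_iter: int, alpha: float = 0.05) -> str:
    """Confirm or reject a feature from its hit count over n_iter iterations."""
    # Probability of at least `hits` successes under the 50/50 null
    p_accept = binom.sf(hits - 1, n_iter, 0.5)
    # Probability of at most `hits` successes under the null
    p_reject = binom.cdf(hits, n_iter, 0.5)
    if p_accept < alpha:
        return "confirmed"  # beats the shadows more often than chance
    if p_reject < alpha:
        return "rejected"   # loses to the shadows more often than chance
    return "tentative"

print(boruta_decision(hits=18, n_iter=20))  # confirmed
print(boruta_decision(hits=2, n_iter=20))   # rejected
print(boruta_decision(hits=11, n_iter=20))  # tentative
```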

BoostARoota

BoostARoota follows the Boruta method closely but modifies a few things:

- In spirit, the same heuristic as Boruta, but using boosting (originally, Boruta supported only random forests).
- The validation of the importance is done by comparing to the maximum of the median variable importance of the shadow predictors (see the toy sketch below; in Boruta, a statistical test on the Z-score is performed instead).
- Since the whole process is run on the same train-test split, the variance of the variable importance comes only from the different re-fits of the model over the iterations.
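
The shadow-based cut can be made concrete with a toy sketch (illustration only, with fake importance draws; none of this is the library's code):

```python
# Toy sketch of the shadow-based cut (illustration, not the library's code).
# Importances are collected over several refits; a real feature survives
# only if its median importance beats the best median among shadow features.
import numpy as np

rng = np.random.default_rng(0)
n_iter, real, shadow = 10, ["f0", "f1", "f2"], ["sh0", "sh1", "sh2"]

# Fake importance draws per refit: f0/f1 informative, f2 noise-like
imp = {"f0": rng.normal(5, 1, n_iter), "f1": rng.normal(3, 1, n_iter),
       "f2": rng.normal(1, 1, n_iter),
       "sh0": rng.normal(1, 1, n_iter), "sh1": rng.normal(1, 1, n_iter),
       "sh2": rng.normal(1, 1, n_iter)}

threshold = max(np.median(imp[s]) for s in shadow)  # max of shadow medians
kept = [f for f in real if np.median(imp[f]) > threshold]
print(kept)  # likely ['f0', 'f1']
```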

Modifications to Boruta and BoostARoota

I forked both Boruta and BoostARoota and made the following changes (under PR):

Boruta --> Leshy:

BoostARoota --> BoostAGroota:

GrootCV, a new method

New: GrootCV:

- Cross-validated feature importance, based on lightGBM only, to smooth out the noise.
- The feature importance is derived from SHAP values.
- A predictor is kept when it beats the maximum of the median shadow variable importances over the folds and repeats.
- No statistical test is involved, and no percentage of columns to delete has to be specified.
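
A minimal usage sketch, assuming `GrootCV` is importable from `arfs.feature_selection` and that `objective`, `n_folds` and `n_iter` are valid keywords (verify against the API reference):

```python
# Minimal usage sketch (assumed API: GrootCV importable from
# arfs.feature_selection; keyword names to be verified against the docs).
import pandas as pd
from sklearn.datasets import make_regression
from arfs.feature_selection import GrootCV

X, y = make_regression(n_samples=500, n_features=20, n_informative=5, random_state=0)
X = pd.DataFrame(X, columns=[f"f{i}" for i in range(X.shape[1])])

selector = GrootCV(objective="rmse", n_folds=5, n_iter=5)
selector.fit(X, y)

# sklearn-compatible selector: list the confirmed features
print(selector.get_feature_names_out())
```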
