<div align="center"> <img src="./docs/images/logo/mamba_tabular.jpg" width="400"/>


📘Documentation | 🛠️Installation | Models | 🤔Report Issues

</div> <div style="text-align: center;"> <h1>Mambular: Tabular Deep Learning (with Mamba)</h1> </div>

Mambular is a Python library for tabular deep learning. It includes models that leverage the Mamba (State Space Model) architecture, as well as other popular models like TabTransformer, FTTransformer, and tabular ResNets. Check out our paper, Mambular: A Sequential Model for Tabular Deep Learning, available at https://arxiv.org/abs/2408.06291.

<h3> Table of Contents </h3>

- 🏃 Quickstart
- 📖 Introduction
- 🤖 Models
- 🏆 Results
- 📚 Documentation
- 🛠️ Installation
- 🚀 Usage
- 💻 Implement Your Own Model
- 🏷️ Citation
- License

🏃 Quickstart

Similar to any sklearn model, Mambular models can be fit as easily as this:

from mambular.models import MambularClassifier
# Initialize and fit your model
model = MambularClassifier()

# X can be a pd.DataFrame or anything easily converted to one, e.g. a np.array
model.fit(X, y, max_epochs=150, lr=1e-04)

📖 Introduction

Mambular is a Python package that brings the power of advanced deep learning architectures to tabular data, offering a suite of models for regression, classification, and distributional regression tasks. Designed with ease of use in mind, Mambular models adhere to scikit-learn's BaseEstimator interface, making them highly compatible with the familiar scikit-learn ecosystem. This means you can fit, predict, and evaluate using Mambular models just as you would with any traditional scikit-learn model, but with the added performance and flexibility of deep learning.
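
For instance, the familiar scikit-learn workflow carries over one-to-one. A minimal sketch, assuming a synthetic binary classification task (make_classification and the metric are plain sklearn; the fit arguments follow the Quickstart above):

from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

from mambular.models import MambularClassifier

# Synthetic stand-in data; any pd.DataFrame or np.array works the same way
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MambularClassifier()
model.fit(X_train, y_train, max_epochs=10, lr=1e-04)

# Evaluate exactly like any other sklearn estimator
print(accuracy_score(y_test, model.predict(X_test)))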

🤖 Models

| Model          | Description                                                                                                     |
|----------------|-----------------------------------------------------------------------------------------------------------------|
| Mambular       | A sequential model using Mamba blocks (Gu and Dao), specifically designed for various tabular data tasks.       |
| FTTransformer  | A model leveraging transformer encoders, as introduced by Gorishniy et al., for tabular data.                   |
| MLP            | A classical Multi-Layer Perceptron (MLP) model for handling tabular data tasks.                                 |
| ResNet         | An adaptation of the ResNet architecture for tabular data applications.                                         |
| TabTransformer | A transformer-based model for tabular data introduced by Huang et al., enhancing feature learning capabilities. |
| MambaTab       | A tabular model using a Mamba block on a joint input representation, as described in the MambaTab paper. Not a sequential model. |
| TabulaRNN      | A recurrent neural network for tabular data. Not yet included in the benchmarks.                                |

All models are available for regression, classification, and distributional regression (denoted by LSS). Hence, each model comes as, e.g., MambularRegressor, MambularClassifier, or MambularLSS.
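
The class name is simply the architecture name plus the task suffix. For example (assuming the suffix convention applies uniformly across architectures; see the API documentation for the exact class names):

from mambular.models import (
    MambularRegressor,       # regression
    MambularClassifier,      # classification
    MambularLSS,             # distributional regression
    FTTransformerRegressor,  # assumed: same suffix convention for FTTransformer
)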

🏆 Results

Detailed results for the available methods can be found here. Note that these results were achieved with default hyperparameters and on our splits; performing hyperparameter optimization could improve the performance of all models.

The average rank table over all models and all datasets is given below:

<div align="center"> <table> <tr> <th style="text-align:center;">Model</th> <th style="text-align:center;">Avg. Rank</th> </tr> <tr> <td style="text-align:center;"><strong>Mambular</strong></td> <td style="text-align:center;"><strong>2.083</strong> <sub>±1.037</sub></td> </tr> <tr> <td style="text-align:center;">FT-Transformer</td> <td style="text-align:center;">2.417 <sub>±1.256</sub></td> </tr> <tr> <td style="text-align:center;">XGBoost</td> <td style="text-align:center;">3.167 <sub>±2.577</sub></td> </tr> <tr> <td style="text-align:center;">MambaTab*</td> <td style="text-align:center;">4.333 <sub>±1.374</sub></td> </tr> <tr> <td style="text-align:center;">ResNet</td> <td style="text-align:center;">4.750 <sub>±1.639</sub></td> </tr> <tr> <td style="text-align:center;">TabTransformer</td> <td style="text-align:center;">6.222 <sub>±1.618</sub></td> </tr> <tr> <td style="text-align:center;">MLP</td> <td style="text-align:center;">6.500 <sub>±1.500</sub></td> </tr> <tr> <td style="text-align:center;">MambaTab</td> <td style="text-align:center;">6.583 <sub>±1.801</sub></td> </tr> <tr> <td style="text-align:center;">MambaTab<sup>T</sup></td> <td style="text-align:center;">7.917 <sub>±1.187</sub></td> </tr> </table> </div>

📚 Documentation

You can find the Mamba-Tabular API documentation here.

🛠️ Installation

Install Mambular using pip:

pip install mambular

🚀 Usage

<h2> Preprocessing </h2>

Mambular simplifies data preprocessing with a range of tools designed for easy transformation of tabular data.

<h3> Data Type Detection and Transformation </h3>

Mambular detects feature data types and applies suitable transformations; preprocessing options such as numerical_preprocessing="ple" are set directly in the model constructor (see the example below).

<h2> Fit a Model </h2>

Fitting a model in Mambular is as simple as it gets. All models in Mambular are sklearn BaseEstimators, so the `.fit` method is implemented for all of them. This also lets you use sklearn's other built-in methods, such as its hyperparameter optimization tools (a sketch follows the prediction examples below).

from mambular.models import MambularClassifier
# Initialize and fit your model
model = MambularClassifier(
    d_model=64,                     # model (hidden) dimension
    n_layers=8,                     # number of layers
    numerical_preprocessing="ple",  # piecewise linear encoding
    n_bins=50                       # bins used by the encoding
)

# X can be a pd.DataFrame or anything easily converted to one, e.g. a np.array
model.fit(X, y, max_epochs=150, lr=1e-04)

Predictions are also easily obtained:

# simple predictions
preds = model.predict(X)

# Predict class probabilities
probs = model.predict_proba(X)
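
Since every model is a sklearn BaseEstimator, sklearn's own tooling can wrap it directly. A minimal sketch of the hyperparameter optimization mentioned above, assuming constructor parameters such as d_model are exposed through the standard get_params/set_params interface:

from sklearn.model_selection import GridSearchCV

from mambular.models import MambularClassifier

param_grid = {"d_model": [32, 64]}  # hypothetical search space

search = GridSearchCV(MambularClassifier(), param_grid, cv=2)
search.fit(X, y)
print(search.best_params_, search.best_score_)
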
<h2> ⚖️ Distributional Regression with MambularLSS </h2>

MambularLSS allows you to model the full distribution of a response variable, not just its mean. This is crucial when understanding variability, skewness, or kurtosis is important. All Mambular models are available as distributional models.

<h3> Key Features of MambularLSS: </h3>

MambularLSS predicts the parameters of a full conditional distribution rather than a single point estimate, making variability, skewness, and kurtosis directly accessible.

<h3> Available Distribution Classes: </h3>

The response distribution is selected via the family argument at fit time (see the example below); the documentation lists all supported families. These distribution classes make MambularLSS versatile in modeling various data types and distributions.

<h3> Getting Started with MambularLSS: </h3>

To integrate distributional regression into your workflow with MambularLSS, start by initializing the model with your desired configuration, similar to other Mambular models:

from mambular.models import MambularLSS

# Initialize the MambularLSS model
model = MambularLSS(
    dropout=0.2,
    d_model=64,
    n_layers=8,
)

# Fit the model to your data
model.fit(
    X,
    y,
    max_epochs=150,
    lr=1e-04,
    patience=10,
    family="normal",  # define your distribution
)
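
Only the family changes for other response types. A hedged sketch for count data, assuming a "poisson" family is among the available distribution classes (consult the documentation for the supported list); y_counts stands for a hypothetical count-valued target:

# Same interface, different response distribution
count_model = MambularLSS(d_model=64, n_layers=8)
count_model.fit(X, y_counts, max_epochs=150, lr=1e-04, family="poisson")  # assumed family name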

💻 Implement Your Own Model

Mambular allows users to easily integrate their custom models into the existing logic. This process is designed to be straightforward, making it simple to create a PyTorch model and define its forward pass. Instead of inheriting from nn.Module, you inherit from Mambular's BaseModel. Each Mambular model takes three main arguments: the number of classes (e.g., 1 for regression or 2 for binary classification), cat_feature_info, and num_feature_info for categorical and numerical feature information, respectively. Additionally, you can provide a config argument, which can either be a custom configuration or one of the provided default configs.

One of the key advantages of using Mambular is that the inputs to the forward passes are lists of tensors. While this might be unconventional, it is highly beneficial for models that treat different data types differently. For example, the TabTransformer model leverages this feature to handle categorical and numerical data separately, applying different transformations and processing steps to each type of data.

Here's how you can implement a custom model with Mambular:

  1. First, define your config:
    The configuration class allows you to specify hyperparameters and other settings for your model. This can be done using a simple dataclass.

    from dataclasses import dataclass
    
    @dataclass
    class MyConfig:
        lr: float = 1e-04
        lr_patience: int = 10
        weight_decay: float = 1e-06
        lr_factor: float = 0.1
    
  2. Second, define your model:
    Define your custom model just as you would for an nn.Module. The main difference is that you will inherit from BaseModel and use the provided feature information to construct your layers. To integrate your model into the existing API, you only need to define the architecture and the forward pass.

    from mambular.base_models import BaseModel
    import torch
    import torch.nn as nn
    
    class MyCustomModel(BaseModel):
        def __init__(
            self,
            cat_feature_info,
            num_feature_info,
            num_classes: int = 1,
            config=None,
            **kwargs,
        ):
            super().__init__(**kwargs)
            self.save_hyperparameters(ignore=["cat_feature_info", "num_feature_info"])
    
            # Total input width: numerical features contribute their full width,
            # while each categorical feature enters as a single encoded column.
            input_dim = 0
            for feature_name, input_shape in num_feature_info.items():
                input_dim += input_shape
            for feature_name, input_shape in cat_feature_info.items():
                input_dim += 1
    
            self.linear = nn.Linear(input_dim, num_classes)
    
        def forward(self, num_features, cat_features):
            # num_features and cat_features are lists of tensors;
            # join the lists and concatenate along the feature dimension
            x = num_features + cat_features
            x = torch.cat(x, dim=1)
            
            # Pass through linear layer
            output = self.linear(x)
            return output
    
  3. Leverage the Mambular API:
    You can build a regression, classification, or distributional regression model that can leverage all of Mambular's built-in methods by using the following:

    from mambular.models import SklearnBaseRegressor
    
    class MyRegressor(SklearnBaseRegressor):
        def __init__(self, **kwargs):
            super().__init__(model=MyCustomModel, config=MyConfig, **kwargs)
    
  4. Train and evaluate your model:
    You can now fit, evaluate, and predict with your custom model just like with any other Mambular model. For classification or distributional regression, inherit from SklearnBaseClassifier or SklearnBaseLSS respectively.

    regressor = MyRegressor(numerical_preprocessing="ple")
    regressor.fit(X_train, y_train, max_epochs=50)
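    
    Evaluation then works like for any built-in model. A short sketch, assuming a held-out split X_test, y_test from your own data:
    
    from sklearn.metrics import mean_squared_error
    
    preds = regressor.predict(X_test)
    print(mean_squared_error(y_test, preds))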
    

🏷️ Citation

If you find this project useful in your research, please consider citing it:

@article{thielmann2024mambular,
  title={Mambular: A Sequential Model for Tabular Deep Learning},
  author={Thielmann, Anton Frederik and Kumar, Manish and Weisser, Christoph and Reuter, Arik and S{\"a}fken, Benjamin and Samiee, Soheila},
  journal={arXiv preprint arXiv:2408.06291},
  year={2024}
}

License

The entire codebase is released under the MIT license.