[AAAI 2025] SparX: A Sparse Cross-Layer Connection Mechanism for Hierarchical Vision Mamba and Transformer Networks

This is an official PyTorch implementation of "SparX: A Sparse Cross-Layer Connection Mechanism for Hierarchical Vision Mamba and Transformer Networks".

Introduction

SparX is a new sparse cross-layer connection mechanism that effectively improves cross-layer feature interaction and reuse in vision backbone networks.

<center> <img src="images/arch.jpg" width="70%" height="auto"> </center>
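
To illustrate the general idea, here is a minimal PyTorch sketch of cross-layer feature aggregation: the current feature map is concatenated with features stored from a sparse subset of earlier layers, then projected back to the working width. All names below are hypothetical, and this is a simplified illustration, not the exact SparX module from the paper.

import torch
import torch.nn as nn

class SparseCrossLayerAggregator(nn.Module):
    """Illustrative sketch (hypothetical, not the exact SparX design):
    fuse the current feature map with a sparse subset of earlier ones."""

    def __init__(self, dim: int, num_sources: int):
        super().__init__()
        # Project the channel-concatenated features back to the working width.
        self.proj = nn.Conv2d(dim * (num_sources + 1), dim, kernel_size=1)
        self.norm = nn.BatchNorm2d(dim)

    def forward(self, x: torch.Tensor, earlier: list[torch.Tensor]) -> torch.Tensor:
        # `earlier` holds feature maps stored from a sparse set of
        # preceding layers, so only a few layers are kept and reused.
        fused = torch.cat([x] + earlier, dim=1)
        return x + self.norm(self.proj(fused))

# Usage: aggregate the current feature map with two stored feature maps.
x = torch.randn(2, 96, 56, 56)
stored = [torch.randn(2, 96, 56, 56), torch.randn(2, 96, 56, 56)]
agg = SparseCrossLayerAggregator(dim=96, num_sources=2)
print(agg(x, stored).shape)  # torch.Size([2, 96, 56, 56])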

Image Classification

1. Requirements

We strongly recommend using the dependency versions below to ensure reproducibility:

# Environments:
cuda==12.1
python==3.10
# Packages:
torch==2.3.1
timm==0.6.12
# Other Dependencies:
cd kernels/selective_scan && pip install .
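
After installation, a quick sanity check confirms that the expected versions and the CUDA runtime are visible (a minimal sketch; the import name of the compiled selective-scan kernel may differ across setups, so it is not checked here):

import torch
import timm

# Verify that the versions match the ones listed above.
print(torch.__version__)          # expected: 2.3.1
print(timm.__version__)           # expected: 0.6.12
print(torch.version.cuda)         # expected: 12.1
print(torch.cuda.is_available())  # should be True for training/validation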

2. Data Preparation

Prepare ImageNet with the following folder structure; you can extract ImageNet using this script.

│imagenet/
├──train/
│  ├── n01440764
│  │   ├── n01440764_10026.JPEG
│  │   ├── n01440764_10027.JPEG
│  │   ├── ......
│  ├── ......
├──val/
│  ├── n01440764
│  │   ├── ILSVRC2012_val_00000293.JPEG
│  │   ├── ILSVRC2012_val_00002138.JPEG
│  │   ├── ......
│  ├── ......
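
As an optional sanity check, you can confirm that the folder layout is parsed correctly (a small sketch using torchvision, which is not listed in the requirements above; the dataset path is a placeholder):

from torchvision.datasets import ImageFolder

# ImageFolder expects exactly the class-subfolder layout shown above.
train_set = ImageFolder("/path/to/imagenet/train")
val_set = ImageFolder("/path/to/imagenet/val")
print(len(train_set.classes), len(train_set))  # 1000 classes, ~1.28M images
print(len(val_set.classes), len(val_set))      # 1000 classes, 50,000 images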

3. Main Results on ImageNet-1K with Pretrained Models

Models          Input Size   FLOPs (G)   Params (M)   Top-1 (%)   Download
SparX-Mamba-T   224x224      5.2         25           83.5        model
SparX-Mamba-S   224x224      9.3         47           84.2        model
SparX-Mamba-B   224x224      15.9        84           84.5        model
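
The validation command below selects models by name (e.g., sparx_mamba_t), which suggests the models are registered with timm. Under that assumption, a short sketch to instantiate a model and check its parameter count against the table:

import timm
import torch

# Assumes the SparX models are registered with timm by this repo
# (import the repo's model definitions first if needed).
model = timm.create_model("sparx_mamba_t", num_classes=1000)
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # table lists 25M for SparX-Mamba-T

# Load a downloaded checkpoint (path is a placeholder); checkpoints
# sometimes nest the weights under a "model" key.
state = torch.load("/path/to/checkpoint", map_location="cpu")
model.load_state_dict(state.get("model", state))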

4. Train

To train SparX-Mamba models on ImageNet-1K with 8 GPUs (single node), run:

bash scripts/train_sparx_mamba_t.sh # train SparX-Mamba-T
bash scripts/train_sparx_mamba_s.sh # train SparX-Mamba-S
bash scripts/train_sparx_mamba_b.sh # train SparX-Mamba-B

5. Validation

To evaluate SparX-Mamba on ImageNet-1K, run:

MODEL=sparx_mamba_t # sparx_mamba_{t, s, b}
python3 validate.py \
/path/to/imagenet \
--model $MODEL -b 128 \
--pretrained # or --checkpoint /path/to/checkpoint 
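
For programmatic use, here is a minimal single-image inference sketch with timm's preprocessing utilities, again assuming the model name is registered with timm; if pretrained weights cannot be resolved automatically, load a downloaded checkpoint as shown above. The image path is a placeholder.

import timm
import torch
from PIL import Image
from timm.data import resolve_data_config, create_transform

model = timm.create_model("sparx_mamba_t", pretrained=True)
model.eval()

# Build the preprocessing pipeline that matches the model's data config.
config = resolve_data_config({}, model=model)
transform = create_transform(**config)

img = Image.open("/path/to/image.jpg").convert("RGB")
with torch.no_grad():
    logits = model(transform(img).unsqueeze(0))
print(logits.argmax(dim=1))  # predicted ImageNet-1K class index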

Citation

If you find this project useful for your research, please consider citing:

@article{lou2024sparx,
  title={SparX: A Sparse Cross-Layer Connection Mechanism for Hierarchical Vision Mamba and Transformer Networks},
  author={Lou, Meng and Fu, Yunxiang and Yu, Yizhou},
  journal={arXiv preprint arXiv:2409.09649},
  year={2024}
}

Acknowledgment

Our implementation is mainly based on the following codebases. We sincerely thank the authors for their wonderful work.

timm
mmdet
mmseg
VMamba

Contact

If you have any questions, please feel free to open an issue or contact me.