<div align="center">
<img src="assets/msmamba.png">
<h3>Microscopic-Mamba: Revealing the Secrets of Microscopic Images with Only 4M Parameters</h3>

[Paper](https://arxiv.org/abs/2409.07896) | [Project Page]

</div>
## Abstract
In the field of medical microscopic image classification (MIC), CNN-based and Transformer-based models have been extensively studied. However, CNNs struggle with modeling long-range dependencies, limiting their ability to fully utilize semantic information in images. Conversely, Transformers are hampered by the complexity of quadratic computations. To address these challenges, we propose a model based on the Mamba architecture: Microscopic-Mamba. Specifically, we designed the Partially Selected Feed-Forward Network (PSFFN) to replace the last linear layer of the Visual State Space Module (VSSM), enhancing Mamba's local feature extraction capabilities. Additionally, we introduced the Modulation Interaction Feature Aggregation (MIFA) module to effectively modulate and dynamically aggregate global and local features. We also incorporated a parallel VSSM mechanism to improve inter-channel information interaction while reducing the number of parameters. Extensive experiments have demonstrated that our method achieves state-of-the-art performance on five public datasets. Code is available at https://github.com/zs1314/Microscopic-Mamba
## Overview
<p align="center"> <img src="assets/overview.jpg" alt="accuracy" width="100%"> </p>

## 🔥 The classification performance of Microscopic-Mamba
<p align="center"> <img src="assets/result.png" alt="accuracy" width="100%"> </p>

## 🚀 Let's Get Started!
### A. Installation
Note that the code in this repo runs on Linux.
This repo is built on the VMamba repo, so you need to install it first. The following installation steps are taken from the VMamba repo.
**Step 1: Clone the repository**

Clone this repository and navigate to the project directory:

```shell
git clone https://github.com/zs1314/Microscopic-Mamba.git
cd Microscopic-Mamba
```
**Step 2: Environment Setup**

It is recommended to set up a conda environment and install dependencies via pip. Use the following commands to set up your environment.

Create and activate a new conda environment:

```shell
conda create -n msmamba
conda activate msmamba
```

Install dependencies:

```shell
pip install -r requirements.txt
cd kernels/selective_scan && pip install .
```
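After installation, a quick sanity check can confirm that the key packages are importable. The snippet below is a minimal, optional sketch (not part of this repo); `torch` is required by the VMamba backbone and `selective_scan` is the CUDA kernel package installed above:

```python
import importlib.util

def check_packages(pkgs=("torch", "selective_scan")):
    """Return a dict mapping each package name to whether it is importable."""
    return {p: importlib.util.find_spec(p) is not None for p in pkgs}

# Report which dependencies are visible in the current environment.
for name, ok in check_packages().items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```

If either package reports `MISSING`, revisit the corresponding install step above.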
### B. Data Preparation
The five datasets RPE, MHIST, SARS, TissueMnist, and MedMf_Colon are used for the MIC experiments. Please download them and arrange them in the following folder/file structure:
```
${DATASET_ROOT}   # Dataset root directory, for example: /home/username/data
├── RPE
│   ├── train
│   │   ├── class 1
│   │   │   ├── 00001.png
│   │   │   ├── 00002.png
│   │   │   ├── 00003.png
│   │   │   └── ...
│   │   ├── class 2
│   │   │   ├── 00001.png
│   │   │   └── ...
│   │   └── class n
│   │       ├── 00001.png
│   │       └── ...
│   ├── val
│   │   └── ...
│   └── test
│       └── ...
├── MHIST
├── SARS
├── TissueMnist
└── MedMf_Colon
```
Alternatively, you can download them from Baidu Netdisk.
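A small stdlib-only helper can verify that a downloaded dataset matches the layout above before training. This is an illustrative sketch, independent of the repo's scripts; the path in the usage comment is an example:

```python
from pathlib import Path

def summarize_split(split_dir: Path) -> dict:
    """Map each class folder inside a split to its number of .png images."""
    return {
        cls.name: sum(1 for _ in cls.glob("*.png"))
        for cls in sorted(p for p in split_dir.iterdir() if p.is_dir())
    }

def check_dataset(root: Path, splits=("train", "val", "test")) -> dict:
    """Return {split: {class: image_count}}, with None for a missing split."""
    report = {}
    for split in splits:
        split_dir = root / split
        report[split] = summarize_split(split_dir) if split_dir.is_dir() else None
    return report

# Example usage:
# print(check_dataset(Path("/home/username/data/RPE")))
```

A `None` entry in the report means that split folder is absent from the dataset root.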
### C. Model Training

```shell
python train.py
```
### D. Model Testing

```shell
python test.py
```
🔥 Note: before training and testing, configure the relevant parameters in the scripts. It is also worth calculating the mean and std for each dataset with `get_means.py`, which helps to further improve model performance.
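The statistic in question is the per-channel mean and standard deviation over all training pixels, which is then used to normalize inputs. The sketch below is not the repo's `get_means.py`; it just illustrates the two-pass computation on images represented as nested `[H][W][3]` lists with values already scaled to `[0, 1]`:

```python
import math

def channel_mean_std(images):
    """Two-pass per-channel mean/std over a list of [H][W][3] images."""
    n = [0, 0, 0]
    total = [0.0, 0.0, 0.0]
    # First pass: accumulate per-channel sums and pixel counts.
    for img in images:
        for row in img:
            for px in row:
                for c in range(3):
                    total[c] += px[c]
                    n[c] += 1
    mean = [total[c] / n[c] for c in range(3)]
    # Second pass: accumulate squared deviations from the mean.
    sq = [0.0, 0.0, 0.0]
    for img in images:
        for row in img:
            for px in row:
                for c in range(3):
                    sq[c] += (px[c] - mean[c]) ** 2
    std = [math.sqrt(sq[c] / n[c]) for c in range(3)]
    return mean, std
```

In practice the same two passes would run over the decoded training images of each dataset, and the resulting mean/std would be plugged into the normalization transform.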
## Acknowledgments

This project is based on VMamba (paper, code). Thanks for their excellent work!
## Q & A
For any questions, please feel free to contact us.
## Reference
If this code or paper contributes to your research, please kindly consider citing our paper and giving this repo a ⭐️
```bibtex
@article{zou2024microscopic,
  title={Microscopic-Mamba: Revealing the Secrets of Microscopic Images with Just 4M Parameters},
  author={Zou, Shun and Zhang, Zhuo and Zou, Yi and Gao, Guangwei},
  journal={arXiv preprint arXiv:2409.07896},
  year={2024}
}
```