MAMCA

A method for Automatic Modulation Classification using the Mamba structure

A project employing the Selective State Space Model (Mamba) method for Automatic Modulation Classification (AMC) in scenarios with extended signal lengths.

The increased sequence length complicates the learning process and diminishes accuracy, while simultaneously increasing memory consumption and reducing timeliness. (Figure: length effects.)
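To see why extended signal lengths are costly, it helps to compare per-layer operation counts: self-attention scales quadratically with sequence length, while a selective state-space recurrence scales linearly. The sketch below is illustrative arithmetic only (assumed model width and state size), not a benchmark of the MAMCA code.

```python
# Illustrative only: asymptotic per-layer operation counts that make long
# signals expensive for attention-based AMC models, motivating a
# linear-time selective state space (Mamba-style) backbone.

def attention_ops(seq_len: int, d_model: int) -> int:
    # Self-attention cost grows quadratically with length L: O(L^2 * d).
    return seq_len ** 2 * d_model

def ssm_ops(seq_len: int, d_model: int, d_state: int = 16) -> int:
    # A selective SSM recurrence grows linearly with L: O(L * d * N).
    return seq_len * d_model * d_state

for L in (128, 1024, 8192):
    ratio = attention_ops(L, 64) / ssm_ops(L, 64)
    print(f"L={L:5d}  attention/ssm op ratio = {ratio:.0f}x")
```

With a state size of 16, the ratio is simply L/16, so the advantage of the linear-time backbone widens as the signal length grows.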

If our code helped your research, please consider citing the corresponding submission:

    @article{zhang2024mamca,
      title={MAMCA -- Optimal on Accuracy and Efficiency for Automatic Modulation Classification with Extended Signal Length},
      author={Yezhuo Zhang and Zinan Zhou and Yichao Cao and Guangyu Li and Xuanpeng Li},
      year={2024},
      journal={arXiv preprint arXiv:2405.11263},
    }

We utilize a denoising unit for better accuracy under noise interference, while using Mamba as the backbone for low GPU memory occupancy and short training/inference time.
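The denoise-then-classify pipeline can be sketched end to end. Everything below is a hypothetical, untrained stand-in: the moving-average filter, random weights, diagonal state matrix, and pooling head are assumptions for illustration, not the project's actual learned denoising unit or Mamba blocks.

```python
import numpy as np

rng = np.random.default_rng(0)

def denoise(iq: np.ndarray, k: int = 5) -> np.ndarray:
    """Toy denoising unit: moving-average filter over each I/Q channel."""
    kernel = np.ones(k) / k
    return np.stack([np.convolve(ch, kernel, mode="same") for ch in iq])

def ssm_scan(x: np.ndarray, A: np.ndarray, B: np.ndarray, C: np.ndarray) -> np.ndarray:
    """Linear state-space recurrence (stand-in for a Mamba block).

    h_t = A h_{t-1} + B x_t,  y_t = C h_t  -- O(L) in sequence length.
    x: (L, d_in) -> returns (L, d_out).
    """
    h = np.zeros(A.shape[0])
    out = []
    for x_t in x:
        h = A @ h + B @ x_t
        out.append(C @ h)
    return np.stack(out)

L, N, n_classes = 4096, 16, 11          # 11 classes, as in RML2016.10a
iq = rng.standard_normal((2, L))        # one noisy I/Q signal, shape (2, L)
x = denoise(iq).T                       # -> (L, 2) sequence for the backbone

A = np.diag(rng.uniform(0.5, 0.9, N))   # stable diagonal state matrix
B = rng.standard_normal((N, 2)) * 0.1
C = rng.standard_normal((8, N)) * 0.1
W = rng.standard_normal((n_classes, 8)) * 0.1

features = ssm_scan(x, A, B, C).mean(axis=0)   # pool over time
pred = int(np.argmax(W @ features))            # predicted modulation class
print("predicted class index:", pred)
```

The single left-to-right scan is what keeps memory and compute linear in the signal length, which is the property the project relies on for extended signals.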

For related AMC works, as well as source code:

For the denoising method employed in our work, as well as source code:

For the Mamba method employed in our work, as well as source code:

Requirements

pip install -r requirements.txt

Training

cd into code/script and run

bash RML2016.10a.sh

Contact

If you have any problems with our code or any suggestions, including discussion on SEI, please feel free to contact us.