
MAMC

A method for Automatic Modulation Classification using Mamba structure

A project employing the Selective State Space Model (Mamba) method for Automatic Modulation Classification (AMC) in a scenario of extended signal length.

The increased sequence length complicates the learning process and diminishes accuracy, while also increasing memory consumption and reducing timeliness (see figure: length effects).

If our code helped your research, please consider citing the corresponding submission:

arXiv

@article{zhang2024MAMC,
  title={MAMCA -- Optimal on Accuracy and Efficiency for Automatic Modulation Classification with Extended Signal Length},
  author={Yezhuo Zhang and Zinan Zhou and Yichao Cao and Guangyu Li and Xuanpeng Li},
  year={2024},
  journal={arXiv preprint arXiv:2405.11263},
}

IEEE Communications Letters (Early Access)


@article{10705364,
  author={Zhang, Yezhuo and Zhou, Zinan and Cao, Yichao and Li, Guangyu and Li, Xuanpeng},
  journal={IEEE Communications Letters},
  title={MAMC - Optimal on Accuracy and Efficiency for Automatic Modulation Classification with Extended Signal Length},
  year={2024},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/LCOMM.2024.3474519}
}

We employ a denoising unit for better accuracy under noise interference, and use Mamba as the backbone for low GPU memory occupancy and short training/inference time.
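To make the backbone's role concrete, below is a toy, dependency-free sketch of the selective state-space (S6-style) recurrence that Mamba is built on. This is purely illustrative and is NOT the project's implementation (the actual model uses the official Mamba CUDA kernels); the scalar parameters `A`, `B`, `C` and the input-dependent step `delta` follow the Mamba paper's notation, and the softplus used for `delta` is an assumption for the sketch.

```python
import math

def selective_scan(x, A=-1.0, B=1.0, C=1.0):
    """Toy 1-D selective scan: h_t = exp(delta_t*A) * h_{t-1} + delta_t*B * x_t,
    y_t = C * h_t. The step size delta_t depends on the input x_t (the
    'selective' part); here it is softplus(x_t), an illustrative choice."""
    h, ys = 0.0, []
    for xt in x:
        delta = math.log1p(math.exp(xt))   # softplus keeps the step positive
        a_bar = math.exp(delta * A)        # zero-order-hold discretization of A
        b_bar = delta * B
        h = a_bar * h + b_bar * xt         # linear recurrence over the sequence
        ys.append(C * h)
    return ys

signal = [0.5, -0.2, 0.8, 0.1]
out = selective_scan(signal)
print(len(out) == len(signal))  # length preserved, cost linear in sequence length
```

The linear-time recurrence (versus the quadratic cost of attention) is what keeps memory and latency low as the signal length grows.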

For related AMC works, as well as their source code:

For the denoising method employed in our work, as well as its source code:

For the Mamba method employed in our work, as well as its source code:

Requirements

pip install -r requirements.txt

Training

cd into code/script and run:

bash RML2016.10a.sh

Contact

If you have any problems with our code or any suggestions, including discussion on SEI, please feel free to contact us.