Deep Learning Based Automatic Modulation Recognition: Models, Datasets, and Challenges

Source code for the paper "Deep Learning Based Automatic Modulation Recognition: Models, Datasets, and Challenges", which is published in Digital Signal Processing.

Representative and up-to-date models in the AMR field are implemented on four different datasets (RML2016.10a, RML2016.10b, RML2018.01a, HisarMod2019.1), providing a unified reference for interested researchers.

The article is available here: Deep Learning Based Automatic Modulation Recognition: Models, Datasets, and Challenges

If you have any questions, please contact us by e-mail: zhangxx8023@gmail.com

Abstract

Automatic modulation recognition (AMR) detects the modulation scheme of received signals for further signal processing without needing prior information, and provides an essential function when such information is missing. Recent breakthroughs in deep learning (DL) have laid the foundation for developing high-performance DL-AMR approaches for communications systems. Compared with traditional modulation detection methods, DL-AMR approaches have achieved promising performance, including high recognition accuracy and low false alarms, thanks to the strong feature extraction and classification abilities of deep neural networks. Despite this promising potential, DL-AMR approaches also raise concerns about complexity and explainability, which affect their practical deployment in wireless communications systems. This paper presents a review of current DL-AMR research, with a focus on appropriate DL models and benchmark datasets. We further provide comprehensive experiments that compare state-of-the-art models for single-input-single-output (SISO) systems from both accuracy and complexity perspectives, and propose to apply DL-AMR in the new multiple-input-multiple-output (MIMO) scenario with precoding. Finally, existing challenges and possible future research directions are discussed.

Content

Experimental comparison for SISO systems

Accuracy

Fig. 1 Recognition accuracy comparison of the state-of-the-art models on (a) RML2016.10a, (b) RML2016.10b, (c) RML2018.01a, (d) HisarMod2019.1.

Parameter Comparison

Table 1 Model size and complexity comparison on the four datasets (A: RML2016.10a, B: RML2016.10b, C: RML2018.01a, D: HisarMod2019.1).
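
For reference, parameter counts like those in Table 1 can be read directly from a Keras model. The snippet below is only an illustrative sketch using a placeholder model; the layer sizes and input shape are assumptions and do not correspond to any of the benchmarked architectures.

```python
# Illustrative sketch: reporting the size of a Keras model.
# The placeholder model below stands in for any of the AMR models in this repository;
# input shape (2, 128) matches the I/Q frame format of RML2016.10a/10b (assumption).
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(2, 128))
x = layers.Flatten()(inputs)
x = layers.Dense(128, activation="relu")(x)
outputs = layers.Dense(11, activation="softmax")(x)  # 11 modulation classes in RML2016.10a
model = keras.Model(inputs, outputs)

model.summary()                                   # per-layer parameter breakdown
print("Total parameters:", model.count_params())  # total model size
```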

Confusion matrix

Fig. 2 Confusion matrices. A, B and C represent the confusion matrices obtained on the RML2016.10a, RML2016.10b, and RML2018.01a, respectively. The numerical indices 1–14 denote CNN1, CNN2, MCNET, IC-AMCNET, ResNet, DenseNet, GRU, LSTM, DAE, MCLDNN, CLDNN, CLDNN2, CGDNet, PET-CGDNN.

Dataset

Table 2 Main AMR open datasets for SISO systems.

| Dataset | Link | Notes |
| --- | --- | --- |
| RML2016.10a, RML2016.10b, RML2018.01a | RML | If the RML2018 dataset is too large, you can use SubsampleRML2018.py to sample it and obtain a partial dataset for experimentation (an illustrative subsampling sketch is given below the table). |
| HisarMod2019.1 | HisarMod | In our experiments, the dataset was converted from a .CSV file to a .MAT file, which can be found in Link. |
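
SubsampleRML2018.py in this repository is the authoritative script for the note above. As a rough illustration only, the sketch below shows one way such subsampling could be done, assuming the common RML2018.01a HDF5 release layout (file name GOLD_XYZ_OSC.0001_1024.hdf5 with datasets 'X', 'Y', 'Z'); the file name, keys and keep ratio are assumptions, not the repository's actual script.

```python
# Illustrative sketch (not the repository's SubsampleRML2018.py):
# keep a uniformly random fraction of RML2018.01a frames to obtain a smaller dataset.
import h5py
import numpy as np

KEEP_RATIO = 0.1  # fraction of frames to keep (assumption)

with h5py.File("GOLD_XYZ_OSC.0001_1024.hdf5", "r") as f:
    n = f["X"].shape[0]
    # h5py fancy indexing requires sorted, unique indices
    idx = np.sort(np.random.choice(n, size=int(n * KEEP_RATIO), replace=False))
    X, Y, Z = f["X"][idx], f["Y"][idx], f["Z"][idx]  # loads the kept frames into memory

with h5py.File("RML2018_subsampled.hdf5", "w") as f:
    f.create_dataset("X", data=X)  # I/Q frames
    f.create_dataset("Y", data=Y)  # one-hot modulation labels
    f.create_dataset("Z", data=Z)  # SNR values
```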

Related Papers

| Model | Paper name | Publication year |
| --- | --- | --- |
| CNN1 | Convolutional Radio Modulation Recognition Networks | 2016 |
| CNN2 | Robust and Fast Automatic Modulation Classification with CNN under Multipath Fading Channels | 2020 |
| MCNET | MCNet: An Efficient CNN Architecture for Robust Automatic Modulation Classification | 2020 |
| IC-AMCNET | CNN-Based Automatic Modulation Classification for Beyond 5G Communications | 2020 |
| ResNet | Deep neural network architectures for modulation classification | 2017 |
| DenseNet | Deep neural network architectures for modulation classification | 2017 |
| GRU | Automatic Modulation Classification using Recurrent Neural Networks | 2017 |
| LSTM | Deep Learning Models for Wireless Signal Classification With Distributed Low-Cost Spectrum Sensors | 2018 |
| DAE | Real-Time Radio Technology and Modulation Classification via an LSTM Auto-Encoder | 2022 |
| MCLDNN | A Spatiotemporal Multi-Channel Learning Framework for Automatic Modulation Recognition | 2020 |
| CLDNN | Deep Architectures for Modulation Recognition | 2017 |
| CLDNN2 | Deep neural network architectures for modulation classification | 2017 |
| CGDNet | CGDNet: Efficient Hybrid Deep Learning Model for Robust Automatic Modulation Recognition | 2021 |
| PET-CGDNN | An Efficient Deep Learning Model for Automatic Modulation Recognition Based on Parameter Estimation and Transformation | 2021 |
| 1DCNN-PF | Automatic Modulation Classification Using Parallel Fusion of Convolutional Neural Networks | 2019 |

Environment

These models are implemented in Keras, and the environment setting is:

Remarks

You will need to download the appropriate dataset and change the file path in your code to point to it. There is no guarantee that the code will run successfully under other environment configurations, and even when it does, performance may differ due to hardware conditions.
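
As a hedged illustration of the file-path change mentioned above, the sketch below loads the RML2016.10a pickle from a local path; the path and variable names are placeholders, and this is not the repository's actual data loader.

```python
# Illustrative sketch: point the loader at your local copy of RML2016.10a.
# The dataset is a pickle mapping (modulation, SNR) -> array of shape (1000, 2, 128).
import pickle
import numpy as np

DATA_PATH = "/path/to/RML2016.10a_dict.pkl"  # change to your local file path

with open(DATA_PATH, "rb") as f:
    data = pickle.load(f, encoding="latin1")  # latin1 is needed for the Python-2 pickle

mods = sorted({mod for mod, snr in data.keys()})
snrs = sorted({snr for mod, snr in data.keys()})
X = np.vstack([data[(mod, snr)] for mod in mods for snr in snrs])
print(X.shape, len(mods), "modulations,", len(snrs), "SNR levels")
```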

About DAE: in the original authors' open-source code, the decoder uses a TimeDistributed layer. In our initial implementation, the decoder flattened the data and used a fully connected layer to reconstruct the input, so the difference is noted here. (Source code for DAE) We have since updated the DAE source code and experimental results on our website to use the TimeDistributed layer as the decoder.
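
To make the difference concrete, the following is a hedged Keras sketch contrasting the two decoder variants discussed above: a TimeDistributed dense decoder versus a flatten-and-dense decoder. The input shape (128 time steps, 2 I/Q channels) and layer sizes are illustrative assumptions, not the exact DAE configuration used in the paper.

```python
# Illustrative sketch of the two DAE decoder variants (sizes are assumptions).
from tensorflow import keras
from tensorflow.keras import layers

TIME_STEPS, CHANNELS, LATENT = 128, 2, 32

inputs = keras.Input(shape=(TIME_STEPS, CHANNELS))
encoded = layers.LSTM(LATENT, return_sequences=True)(inputs)

# Variant 1: TimeDistributed decoder, reconstructs each time step independently.
decoded_td = layers.TimeDistributed(layers.Dense(CHANNELS))(encoded)

# Variant 2: flatten-and-dense decoder, reconstructs the whole frame at once.
flat = layers.Flatten()(encoded)
decoded_fc = layers.Dense(TIME_STEPS * CHANNELS)(flat)
decoded_fc = layers.Reshape((TIME_STEPS, CHANNELS))(decoded_fc)

dae_td = keras.Model(inputs, decoded_td, name="dae_timedistributed")
dae_fc = keras.Model(inputs, decoded_fc, name="dae_flatten_dense")
```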

Acknowledgement

Our code is partly based on the work of leena201818. Thanks to leena201818 and wzjialang for their great work!

Citation

Please cite the referenced works if they are helpful to your work. If our work is helpful to your research, please cite:

@article{ZHANG2022103650,
    title={Deep Learning Based Automatic Modulation Recognition: Models, Datasets, and Challenges},
    author={Fuxin Zhang and Chunbo Luo and Jialang Xu and Yang Luo and FuChun Zheng},
    journal={Digital Signal Processing},
    year={2022},
    doi={10.1016/j.dsp.2022.103650}
}