Automatic Composite-Modulation Classification Using Ultra Lightweight Deep-Learning Network Based on Cyclic-Paw-Print

Document Type

Article

Publication Date

6-1-2024

Abstract

Automatic composite-modulation classification (ACMC) is considered an essential function in next-generation intelligent telemetry, tracking and command (TT&C), cognitive space communications, and space surveillance. This paper introduces a novel ACMC scheme using the cyclic-paw-print extracted from composite-modulation (CM) signals. In this new framework, cyclic-spectrum analysis is first applied to obtain the polyspectra of the received CM signals corrupted by different fading channels. A new feature, termed the cyclic-paw-print (CPP), is then constructed from the image representation of the cyclic spectrum and is robust against channel noise. Next, a highly efficient ultra-lightweight deep-learning network (ULWNet), which takes the CPPs as input features, is designed to identify the composite-modulation type. The proposed scheme greatly reduces the computational cost incurred by existing deep-learning networks and captures more reliable features latent in CM signals, yielding excellent classification accuracy. Monte Carlo simulation results demonstrate the effectiveness of the proposed ACMC scheme and its superiority over existing deep-learning networks.
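
The abstract outlines a two-stage pipeline: a cyclic-spectrum image (the CPP) is extracted from the received signal and then fed to a lightweight convolutional classifier. The sketch below is a minimal illustration of that idea, not the paper's implementation: it uses a simplified frame-averaged cyclic periodogram with integer FFT-bin cyclic-frequency shifts as a stand-in for the CPP feature, and a generic depthwise-separable CNN as a stand-in for ULWNet. The names cyclic_spectrum_image and TinyULWNet, the toy QPSK-like test signal, the eight-class output, and all dimensions are assumptions chosen for illustration only.

    # Illustrative sketch (not the authors' code): cyclic-spectrum image -> small CNN.
    import numpy as np
    import torch
    import torch.nn as nn

    def cyclic_spectrum_image(x, n_fft=128, n_alpha=64):
        """Averaged cyclic-periodogram magnitude |S_x^alpha(f)| as a 2-D image.

        Simplified estimator: S^alpha(f) ~ E[X_m(f) X_m*(f - alpha)], with alpha
        taken as an integer FFT-bin shift (a convention assumed for this sketch).
        """
        n_frames = len(x) // n_fft
        frames = x[: n_frames * n_fft].reshape(n_frames, n_fft)
        X = np.fft.fft(frames * np.hanning(n_fft), axis=1)   # per-frame spectra
        img = np.empty((n_alpha, n_fft))
        for a in range(n_alpha):                              # cyclic-frequency axis
            img[a] = np.abs(np.mean(X * np.conj(np.roll(X, a, axis=1)), axis=0))
        img /= img.max() + 1e-12                              # normalise to [0, 1]
        return img.astype(np.float32)

    class TinyULWNet(nn.Module):
        """Generic depthwise-separable CNN standing in for the lightweight classifier."""
        def __init__(self, n_classes=8):                      # class count is an assumption
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 16, 3, padding=1, groups=16),   # depthwise
                nn.Conv2d(16, 32, 1), nn.ReLU(),              # pointwise
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(32, n_classes)

        def forward(self, x):
            return self.head(self.features(x).flatten(1))

    # Usage: noisy QPSK-like toy signal -> cyclic-spectrum image -> class logits.
    rng = np.random.default_rng(0)
    symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=1024)
    signal = np.repeat(symbols, 8) + 0.3 * (rng.standard_normal(8192)
                                            + 1j * rng.standard_normal(8192))
    image = cyclic_spectrum_image(signal)
    logits = TinyULWNet()(torch.from_numpy(image)[None, None])
    print(image.shape, logits.shape)

The depthwise-then-pointwise convolution pair is a standard device in lightweight networks: it keeps the parameter count low while still mixing spatial and channel information, which is consistent with the abstract's emphasis on computational efficiency.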

Publication Source (Journal or Book title)

IEEE Transactions on Cognitive Communications and Networking

First Page

866

Last Page

879
