Land–Sea Clutter Classification Method Based on Multi-Channel Graph Convolutional Neural Networks

LI Can, WANG Zengfu, ZHANG Xiaoxuan, PAN Quan

Citation: LI Can, WANG Zengfu, ZHANG Xiaoxuan, et al. Land–Sea clutter classification method based on multi-channel graph convolutional neural networks[J]. Journal of Radars, in press. doi: 10.12000/JR24165


DOI: 10.12000/JR24165
Funds: The National Natural Science Foundation of China (62473317, U21B2008)

    Biographies:

    LI Can: Ph.D. candidate. Research interests: sky-wave radar data processing, deep learning, and graph deep learning.

    WANG Zengfu: Ph.D., associate professor. Research interests: sky-wave radar data processing, information fusion, and sensor management.

    ZHANG Xiaoxuan: Ph.D. candidate. Research interests: remote sensing image generation and classification.

    PAN Quan: Ph.D., professor. Research interests: information fusion theory and applications, target tracking and recognition, and spectral imaging and image processing.

    Corresponding author:

    WANG Zengfu, wangzengfu@nwpu.edu.cn

  • Corresponding Editor: XU Shuwen
  • CLC number: TN958.93
  • Abstract: Land–sea clutter classification is a key technique for improving the target-localization accuracy of sky-wave over-the-horizon radar; its core task is to determine, for each azimuth-range cell in a range-Doppler (RD) map, whether the background originates from land or sea. Conventional deep-learning-based land–sea clutter classification methods require massive, high-quality, class-balanced labeled samples and incur long training times at a high cost relative to the benefit. Moreover, their input is the clutter of a single azimuth-range cell, ignoring intra-class and inter-class information among samples, which degrades model performance. To address these problems, this paper analyzes the correlation between adjacent azimuth-range cells, converts land–sea clutter data from Euclidean space into graph data in non-Euclidean space so as to introduce relationships between samples, and proposes a land–sea clutter classification method based on a multi-channel graph convolutional neural network (MC-GCN). MC-GCN decomposes the graph from a single channel into multiple channels, each containing only one type of edge and one weight matrix; by constraining the node-information aggregation process, it effectively alleviates the node-attribute misjudgment caused by heterophily. Using RD maps from different seasons, times of day, and detection areas, and according to radar parameters, data characteristics, and sample proportions, we construct an original land–sea clutter dataset covering 12 scenarios and a scarce land–sea clutter dataset with 36 configurations, on which the effectiveness of MC-GCN is verified. Compared with state-of-the-art land–sea clutter classification methods, the proposed MC-GCN performs best on all of the above datasets, with a classification accuracy of no less than 92%.
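As a rough illustration of the layer structure the abstract describes, the sketch below implements one multi-channel graph convolution in NumPy: each channel carries a single edge type with its own normalized adjacency matrix and weight matrix, and the per-channel aggregations are summed. The channel count, feature sizes, summation, and ReLU here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric GCN normalization D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def mc_gcn_layer(A_channels, H, W_channels):
    """One multi-channel GCN layer: each channel has its own edge set
    (adjacency) and weight matrix; channel outputs are summed, then ReLU."""
    out = sum(normalize_adj(A) @ H @ W for A, W in zip(A_channels, W_channels))
    return np.maximum(out, 0.0)

# Toy example: 4 nodes, 2 edge types (channels), 3-dim features -> 2-dim
rng = np.random.default_rng(0)
A1 = np.array([[0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], float)
A2 = np.array([[0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, 0]], float)
H = rng.normal(size=(4, 3))
W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(3, 2))
H_next = mc_gcn_layer([A1, A2], H, [W1, W2])
print(H_next.shape)  # (4, 2)
```

Because each channel aggregates over only one edge type, a node never mixes messages from differently-typed neighborhoods inside a single weight matrix, which is the constraint on aggregation that the abstract credits with reducing heterophily-induced misjudgment.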

     

  • Figure 1. Range-Doppler map

    Figure 2. Land/sea clutter map

    Figure 3. Absolute distance between adjacent azimuth-range cells

    Figure 4. Cosine similarity between adjacent azimuth-range cells

    Figure 5. Pearson correlation coefficient between adjacent azimuth-range cells
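The three similarity measures compared in Figures 3-5 can be sketched as follows, assuming each azimuth-range cell is represented by its Doppler-spectrum vector (the exact feature vector used in the paper is not specified here; the toy vectors are illustrative).

```python
import numpy as np

def absolute_distance(x, y):
    """Mean absolute difference between two cells' feature vectors."""
    return float(np.mean(np.abs(x - y)))

def cosine_similarity(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

def pearson_corr(x, y):
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / (np.linalg.norm(xc) * np.linalg.norm(yc)))

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.5, 2.5, 3.5, 4.5])   # a shifted copy of x
print(absolute_distance(x, y))       # 0.5
print(round(cosine_similarity(x, y), 4))
print(round(pearson_corr(x, y), 4))  # 1.0 (y is a linear function of x)
```

Note the difference in what each measure rewards: the absolute distance is sensitive to amplitude offsets, while the Pearson coefficient is invariant to them, which is why adjacent cells of the same surface type can score high on PCC even when their amplitudes fluctuate.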

    Figure 6. Graph construction

    Figure 7. Incorrect node aggregation

    Figure 8. Diagram of multiple channels

    Figure 9. Diagram of a single-node update in MC-GCN

    Figure 10. Flowchart of land/sea clutter classification based on MC-GCN
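A minimal sketch of the kind of graph construction Figure 6 depicts: each azimuth-range cell becomes a node, and edges link cells that are adjacent on the azimuth-range grid. The 4-neighbor adjacency pattern below is an assumption for illustration; the paper's exact edge-construction rule (and any similarity thresholding) may differ.

```python
import numpy as np

def grid_graph(n_azimuth, n_range):
    """Adjacency matrix linking 4-adjacent azimuth-range cells.

    Node index for cell (a, r) is a * n_range + r.
    """
    n = n_azimuth * n_range
    A = np.zeros((n, n))
    idx = lambda a, r: a * n_range + r
    for a in range(n_azimuth):
        for r in range(n_range):
            if a + 1 < n_azimuth:  # neighbor in the azimuth direction
                A[idx(a, r), idx(a + 1, r)] = A[idx(a + 1, r), idx(a, r)] = 1
            if r + 1 < n_range:    # neighbor in the range direction
                A[idx(a, r), idx(a, r + 1)] = A[idx(a, r + 1), idx(a, r)] = 1
    return A

A = grid_graph(3, 4)
print(A.shape, int(A.sum()) // 2)  # prints: (12, 12) 17
```

A 3x4 grid yields 3x3 = 9 range-direction edges plus 2x4 = 8 azimuth-direction edges, i.e. 17 undirected edges; distinguishing the two edge directions (or edge types) is what would later be split into separate channels.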

    Figure 11. Land/sea clutter samples with different characteristics

    Figure 12. Proportion of clutter categories in the original dataset

    Figure 13. Confusion matrix of MC-GCN on the original dataset

    Figure 14. PE, RE, and F1 values of MC-GCN on the original dataset

    Figure 15. Accuracy and loss curves of MC-GCN on the original dataset

    Figure 16. Land/sea clutter classification results of MC-GCN

    Table 4. Correlation analysis on the original dataset (AD: absolute distance; CS: cosine similarity; PCC: Pearson correlation coefficient)

    | Group | Scenario | AD | CS | PCC |
    |---|---|---|---|---|
    | A | Standard | 2.21, 1.79 | 0.76, 0.84 | 0.73, 0.82 |
    | A | Doppler shift | 1.97, 1.87 | 0.95, 0.95 | 0.81, 0.82 |
    | A | Amplitude fluctuation | 1.97, 1.59 | 0.88, 0.92 | 0.84, 0.89 |
    | A | Narrowband RF interference | 2.04, 1.60 | 0.92, 0.95 | 0.81, 0.88 |
    | B | Standard | 1.96, 1.71 | 0.89, 0.92 | 0.83, 0.87 |
    | B | Doppler shift | 1.90, 1.47 | 0.93, 0.95 | 0.86, 0.90 |
    | B | Amplitude fluctuation | 1.88, 1.59 | 0.90, 0.93 | 0.86, 0.89 |
    | B | Narrowband RF interference | 2.10, 1.83 | 0.85, 0.89 | 0.72, 0.79 |
    | C | Standard | 1.95, 1.52 | 0.91, 0.94 | 0.84, 0.89 |
    | C | Doppler shift | 2.11, 1.71 | 0.90, 0.93 | 0.76, 0.82 |
    | C | Amplitude fluctuation | 2.13, 1.69 | 0.97, 0.98 | 0.71, 0.81 |
    | C | Narrowband RF interference | 1.99, 1.63 | 0.91, 0.94 | 0.83, 0.88 |

    Table 1. Land/sea clutter dataset settings (%)

    | Group | Sample characteristic | Original training set | Scarce training sets | Test set |
    |---|---|---|---|---|
    | A | Standard | 70 | 50 / 30 / 10 | 30 |
    | A | Doppler shift | 70 | 50 / 30 / 10 | 30 |
    | A | Amplitude fluctuation | 70 | 50 / 30 / 10 | 30 |
    | A | Narrowband RF interference | 70 | 50 / 30 / 10 | 30 |
    | B | Standard | 70 | 50 / 30 / 10 | 30 |
    | B | Doppler shift | 70 | 50 / 30 / 10 | 30 |
    | B | Amplitude fluctuation | 70 | 50 / 30 / 10 | 30 |
    | B | Narrowband RF interference | 70 | 50 / 30 / 10 | 30 |
    | C | Standard | 70 | 50 / 30 / 10 | 30 |
    | C | Doppler shift | 70 | 50 / 30 / 10 | 30 |
    | C | Amplitude fluctuation | 70 | 50 / 30 / 10 | 30 |
    | C | Narrowband RF interference | 70 | 50 / 30 / 10 | 30 |

    Table 2. Experimental environment

    | Environment | Version |
    |---|---|
    | System | Windows 10 (64-bit) |
    | GPU | NVIDIA GeForce RTX 3090 |
    | CUDA | 11.3.1 |
    | Python | 3.8.0 |
    | torch | 1.11.0 |
    | torchvision | 0.12.0 |
    | NumPy | 1.24.3 |
    | matplotlib | 3.5.1 |
    | dgl | 1.1.0 |

    Table 3. Experimental parameters

    | Parameter | Value |
    |---|---|
    | Epochs | 500 |
    | Learning rate | 0.001 |
    | Hidden units | 16 |
    | Layers | 2 |
    | Input size | $ \left[ {{N_R} \times {N_A},{N_D}} \right] $ |
    | Output size | $ \left[ {{N_R} \times {N_A},3} \right] $ |
    | Beta1 | 0.5 |
    | Beta2 | 0.999 |
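The Beta1/Beta2 entries in Table 3 are the moment-decay hyperparameters of the Adam optimizer. As a generic illustration (not the authors' training code), one bias-corrected Adam update with Table 3's settings (learning rate 0.001, beta1 = 0.5, beta2 = 0.999) looks like this:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.5, beta2=0.999, eps=1e-8):
    """One Adam update using the hyperparameters listed in Table 3."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (variance) estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

w = np.array([1.0, -2.0])
m = v = np.zeros_like(w)
grad = np.array([0.5, -0.5])
w, m, v = adam_step(w, grad, m, v, t=1)
print(w)  # w ~ [0.999, -1.999]: each weight moves by ~lr against its gradient sign
```

On the first step the bias-corrected update reduces to roughly lr times the sign of the gradient, regardless of the gradient's magnitude; the unusually low beta1 = 0.5 simply makes the first-moment average forget old gradients faster than the common default of 0.9.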

    Table 5. Classification accuracy (%) on the original and scarce datasets (each cell: 70% / 50% / 30% / 10% training samples)

    | Group | Method | Standard | Doppler shift | Amplitude fluctuation | Narrowband RF interference |
    |---|---|---|---|---|---|
    | A | MC-GCN | 97.62 / 96.78 / 95.93 / 95.09 | 96.52 / 96.33 / 96.05 / 96.19 | 96.90 / 96.73 / 96.76 / 96.40 | 96.63 / 96.28 / 96.24 / 93.32 |
    | A | GCN | 94.88 / 94.50 / 92.53 / 90.03 | 90.98 / 90.48 / 89.82 / 89.08 | 96.29 / 95.19 / 96.12 / 91.51 | 92.72 / 91.28 / 89.52 / 89.13 |
    | A | GAT | 94.91 / 92.74 / 93.15 / 92.89 | 90.89 / 89.63 / 89.18 / 86.01 | 95.05 / 95.98 / 94.81 / 92.86 | 91.67 / 91.21 / 89.82 / 88.87 |
    | A | TA-GAN | 94.21 / 92.59 / 90.61 / 90.37 | 92.36 / 91.16 / 88.69 / 86.27 | 94.12 / 93.94 / 92.95 / 91.99 | 92.37 / 91.69 / 90.11 / 88.52 |
    | A | ResNet18 | 95.40 / 90.20 / 84.84 / 78.63 | 94.48 / 90.75 / 83.65 / 75.44 | 96.57 / 89.83 / 84.43 / 75.02 | 95.49 / 91.56 / 85.45 / 77.48 |
    | A | DCNN | 94.29 / 90.75 / 82.81 / 74.53 | 93.46 / 89.75 / 81.83 / 72.94 | 95.97 / 90.68 / 82.74 / 69.52 | 95.05 / 91.12 / 83.27 / 70.64 |
    | B | MC-GCN | 96.69 / 96.28 / 95.04 / 95.88 | 97.04 / 97.08 / 95.22 / 95.75 | 97.14 / 95.52 / 95.04 / 93.47 | 96.72 / 96.53 / 96.34 / 95.30 |
    | B | GCN | 93.58 / 93.10 / 92.81 / 92.27 | 93.82 / 93.70 / 92.90 / 90.78 | 91.08 / 91.22 / 90.34 / 88.97 | 92.19 / 90.18 / 90.77 / 89.51 |
    | B | GAT | 94.08 / 93.72 / 93.62 / 92.96 | 92.40 / 92.04 / 91.57 / 90.64 | 92.86 / 91.46 / 89.87 / 88.83 | 92.72 / 91.77 / 91.34 / 89.90 |
    | B | TA-GAN | 94.29 / 92.74 / 91.79 / 90.37 | 94.37 / 92.72 / 91.49 / 91.66 | 92.28 / 91.38 / 89.05 / 87.96 | 93.43 / 91.98 / 90.80 / 88.96 |
    | B | ResNet18 | 96.11 / 90.39 / 83.42 / 75.18 | 95.36 / 89.72 / 81.19 / 74.74 | 94.14 / 90.66 / 81.79 / 74.38 | 93.33 / 89.90 / 83.28 / 75.84 |
    | B | DCNN | 95.68 / 89.57 / 81.35 / 74.96 | 94.92 / 88.43 / 79.76 / 71.48 | 93.15 / 88.51 / 78.24 / 71.39 | 92.74 / 85.80 / 78.56 / 72.94 |
    | C | MC-GCN | 96.74 / 96.62 / 95.57 / 94.71 | 96.53 / 96.51 / 95.97 / 95.94 | 95.78 / 95.95 / 94.80 / 92.88 | 96.81 / 96.49 / 96.37 / 95.92 |
    | C | GCN | 92.43 / 91.03 / 88.96 / 87.29 | 90.10 / 91.38 / 89.06 / 88.85 | 91.09 / 91.50 / 90.41 / 87.39 | 92.61 / 91.53 / 90.09 / 89.45 |
    | C | GAT | 92.44 / 90.84 / 89.94 / 89.36 | 91.26 / 90.54 / 90.06 / 89.42 | 92.07 / 91.45 / 91.12 / 90.50 | 91.60 / 91.78 / 90.45 / 88.50 |
    | C | TA-GAN | 92.36 / 91.61 / 89.47 / 86.64 | 91.79 / 90.96 / 89.52 / 88.33 | 92.47 / 91.99 / 90.89 / 89.87 | 92.41 / 91.46 / 90.51 / 89.73 |
    | C | ResNet18 | 94.90 / 90.41 / 85.19 / 77.49 | 95.19 / 90.00 / 80.01 / 74.62 | 95.44 / 91.30 / 81.21 / 73.71 | 93.67 / 89.47 / 86.42 / 77.47 |
    | C | DCNN | 94.66 / 89.36 / 81.59 / 76.97 | 94.75 / 88.97 / 79.58 / 72.38 | 93.45 / 87.16 / 79.44 / 71.21 | 91.95 / 87.27 / 82.86 / 75.64 |

    Table 6. Classification accuracy (%) of MC-GCN on the original dataset under different combinations of channels 1-6 (each cell: group A / B / C)

    | Channels (1-6) | Standard | Doppler shift | Amplitude fluctuation | Narrowband RF interference |
    |---|---|---|---|---|
    | – | 97.62 / 96.69 / 97.74 | 96.52 / 97.04 / 96.53 | 96.90 / 97.14 / 95.78 | 96.63 / 96.72 / 96.81 |
    | – | 90.41 / 91.93 / 87.62 | 85.29 / 89.14 / 91.28 | 79.27 / 84.53 / 93.49 | 92.05 / 85.98 / 91.68 |
    | – | 96.09 / 95.33 / 94.19 | 95.21 / 94.29 / 95.18 | 95.85 / 95.54 / 94.02 | 96.05 / 95.92 / 95.48 |
    | – | 95.24 / 94.93 / 91.72 | 93.38 / 92.46 / 89.41 | 95.40 / 94.41 / 90.94 | 89.00 / 95.40 / 95.83 |
    | – | 94.43 / 93.59 / 94.24 | 91.14 / 93.30 / 88.31 | 93.12 / 91.74 / 91.06 | 93.52 / 92.06 / 93.01 |
    | – | 92.26 / 93.50 / 93.45 | 92.81 / 93.38 / 91.72 | 91.63 / 93.97 / 92.01 | 92.78 / 91.88 / 93.85 |
    | – | 91.98 / 91.31 / 84.08 | 90.41 / 88.19 / 89.69 | 94.07 / 87.74 / 88.61 | 93.47 / 92.84 / 86.98 |

    Table 7. Cross-domain classification accuracy (%) of different methods in the standard scenario

    | Training set | Method | A→B | B→A | A→C | C→A | B→C | C→B |
    |---|---|---|---|---|---|---|---|
    | 70% | MC-GCN | 90.91 | 86.15 | 80.51 | 89.41 | 79.21 | 86.51 |
    | 70% | ResNet18 | 81.28 | 84.57 | 79.25 | 85.37 | 83.52 | 84.94 |
    | 50% | MC-GCN | 87.75 | 86.96 | 89.27 | 89.56 | 66.60 | 85.84 |
    | 50% | ResNet18 | 74.39 | 76.82 | 68.74 | 74.58 | 75.73 | 77.48 |
    | 30% | MC-GCN | 87.10 | 87.91 | 79.32 | 80.75 | 55.29 | 79.56 |
    | 30% | ResNet18 | 62.54 | 67.49 | 63.46 | 66.57 | 69.49 | 68.37 |
    | 10% | MC-GCN | 88.94 | 78.23 | 74.99 | 80.56 | 85.27 | 84.58 |
    | 10% | ResNet18 | 55.97 | 58.36 | 52.18 | 57.43 | 61.72 | 60.15 |

    Table 8. Computational complexity

    | Model | Space complexity (M) | Time complexity (s) |
    |---|---|---|
    | MC-GCN | 0.157 | 23 |
    | GCN | 0.016 | 10 |
    | DCNN | 10.535 | 805 |
Publication history
  • Received: 2024-08-15
  • Revised: 2024-10-03
