
Citation: LI Yi, DU Lan, ZHOU Ke’er, et al. Deep network for SAR target recognition based on attribute scattering center convolutional kernel modulation[J]. Journal of Radars, 2024, 13(2): 443–456. doi: 10.12000/JR24001
Because the high-power signals transmitted by pulse radar readily interfere with radio stations and other communication equipment in urban environments, communication signals can instead be used to detect targets. Multi-Carrier Modulation (MCM) [1] has attracted attention in recent years, and the Orthogonal Frequency Division Multiplexing (OFDM) [2] technique that developed from it modulates the signal onto multiple orthogonal subcarriers. OFDM offers the potential for both frequency diversity and waveform diversity, and is of significant research value in the context of integrated communication and radar systems.
Phase-coded OFDM signals provide high Doppler resolution, so when used for radar moving-target detection they allow more accurate estimation of target velocity. Different phase-coding sequences exhibit different autocorrelation properties. Huffman coding can lower the overall sidelobe level of the autocorrelation function, but its Peak-to-Mean Envelope Power Ratio (PMEPR) increases markedly [3]. Studies have shown that Barker sequences offer good overall performance [4]; this paper therefore adopts the 13-bit Barker code as the phase-coding sequence.
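As a quick check of this property, the sketch below computes the aperiodic autocorrelation of the 13-bit Barker code with NumPy: the main peak equals the code length, while every sidelobe has magnitude at most 1.

```python
import numpy as np

# 13-bit Barker code used as the phase-coding sequence (chips in {+1, -1}).
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

# Aperiodic autocorrelation: the peak equals the code length (13) and every
# sidelobe has magnitude <= 1, the property that motivates the Barker choice.
acf = np.correlate(barker13, barker13, mode="full")
peak = acf[len(barker13) - 1]
sidelobes = np.delete(acf, len(barker13) - 1)
print(peak, np.max(np.abs(sidelobes)))  # 13.0 1.0
```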
Over the past decade, substantial progress has been made both in China and abroad on OFDM signal characteristics and waveform design [5–7], and the signal-processing problems of OFDM radar have accordingly drawn researchers' attention. Zhang et al. [8] applied the Keystone transform jointly in the subcarrier, fast-time, and slow-time domains to decouple the signal, solving the problem of targets migrating across range-Doppler cells and thereby estimating the parameters of multiple targets moving at constant velocity, though at considerable computational cost. Lellouch et al. [9] used the carrier-phase information of the echo relative to the transmitted signal to estimate the range and radial velocity of a point target, but required the target to remain within a single range gate, i.e., no range-cell migration may occur. Beyond these, several issues specific to OFDM radar echo processing, such as velocity compensation and Doppler ambiguity resolution, still require further study.
In signal processing, the maximum likelihood estimator is asymptotically efficient, but for nonlinear models with multiple measurements its computational burden hinders practical use. To improve computational efficiency, this paper draws on the MIMO radar processing approach of Ref. [10] and exploits the orthogonal multicarrier structure of the OFDM signal to separate the signal into multiple channels. Correlation processing yields a range profile on each subcarrier; the Keystone transform then performs velocity compensation and resolves the Doppler ambiguity; pulse-Doppler processing of the same range cell on each subcarrier produces a per-subcarrier Doppler spectrum; and coherent integration across the subcarrier domain finally yields a two-dimensional range-Doppler spectrum. Peak search combined with the CLEAN technique [11] extracts the delay and Doppler parameters corresponding to the peak locations. Using these as initial values, Newton iteration on the likelihood function of the observed data produces more accurate parameter estimates, which we term the approximate maximum likelihood estimate. This estimator can serve as a key component of a composite hypothesis test, improving the target-detection performance of the detector. The paper is organized as follows. Section 2 presents the echo model of the phase-coded OFDM signal; Section 3 formulates the maximum likelihood estimation of the target range and velocity parameters; Section 4 proposes a channel-separation-based approximate maximum likelihood algorithm to reduce the computational cost of estimating the target motion parameters; Section 5 verifies the performance of the algorithm by simulation; Section 6 concludes the paper.
Let the transmitted phase-coded OFDM signal be

$$s_{\rm T}(t)=\mathrm{e}^{\mathrm{j}2\pi f_{\rm c}t}\sum_{n=0}^{N-1}u(t-nT_{\rm r})\,\mathrm{rect}\left(\frac{t-nT_{\rm r}}{T_{\rm r}}\right) \tag{1}$$

where

$$u(t)=\sum_{k=0}^{K-1}\sum_{m=0}^{M-1}a_{k,m}\,\mathrm{e}^{\mathrm{j}2\pi k\Delta f t}\,\mathrm{rect}\left(\frac{t-mt_{\rm c}}{t_{\rm c}}\right) \tag{2}$$

where $K$ is the number of subcarriers, and the phase-coding sequence on the $k$-th subcarrier
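A minimal numerical sketch of the waveform in Eq. (2) follows. The concrete values ($K=4$ subcarriers, chip length $t_{\rm c}$, 32 samples per chip) and the choice $\Delta f = 1/t_{\rm c}$ for subcarrier orthogonality are illustrative assumptions, not parameters taken from the text.

```python
import numpy as np

# Sketch of the baseband waveform u(t) in Eq. (2): K subcarriers, each
# phase-coded by the 13-bit Barker sequence (M = 13 chips of length t_c), with
# subcarrier spacing delta_f = 1/t_c chosen for orthogonality (an assumption).
K, M = 4, 13
t_c = 1e-6                         # chip duration (s), illustrative
delta_f = 1.0 / t_c                # subcarrier spacing (Hz)
sp_chip = 32                       # samples per chip
fs = sp_chip / t_c                 # sampling rate (Hz)
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
a = np.tile(barker13, (K, 1))      # a_{k,m}: same Barker phases on every subcarrier

t = np.arange(M * sp_chip) / fs
u = np.zeros_like(t, dtype=complex)
for k in range(K):
    for m in range(M):
        chip = (t >= m * t_c) & (t < (m + 1) * t_c)   # rect((t - m t_c)/t_c)
        u += a[k, m] * np.exp(2j * np.pi * k * delta_f * t) * chip
```

At $t=0$ all $K$ subcarriers add in phase, so $|u(0)|=K$; the code is only meant to make the double sum in Eq. (2) concrete.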
Down-converting the echo scattered by the target yields the corresponding OFDM baseband signal:

$$s_{\rm r}(t)=A_0\,\mathrm{e}^{-\mathrm{j}2\pi f_{\rm c}t}\,s_{\rm T}(t-\tau) \tag{3}$$

where

$$\begin{aligned} s_{\rm r}(t)\approx{}&\sum_{n=0}^{N-1}\sum_{k=0}^{K-1}\sum_{m=0}^{M-1}A_0 a_{k,m}\,\mathrm{rect}\left(\frac{t-nT_{\rm r}-\tau_0}{T_{\rm r}}\right)\\ &\cdot\mathrm{rect}\left(\frac{t-nT_{\rm r}-\tau_0-mt_{\rm c}}{t_{\rm c}}\right)\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\\ &\cdot\mathrm{e}^{\mathrm{j}2\pi k\Delta f t}\,\mathrm{e}^{\mathrm{j}2\pi(f_{\rm d}+f_{{\rm d}k})t}\,\mathrm{e}^{-\mathrm{j}2\pi k\Delta f nT_{\rm r}} \end{aligned} \tag{4}$$

where the Doppler frequency difference between subcarriers

$$\begin{aligned} s_{\rm r}(\tilde t,n)={}&\sum_{k=0}^{K-1}\sum_{m=0}^{M-1}A_0 a_{k,m}\,\mathrm{rect}\left(\frac{\tilde t-\tau_0}{T_{\rm r}}\right)\\ &\cdot\mathrm{rect}\left(\frac{\tilde t-\tau_0-mt_{\rm c}}{t_{\rm c}}\right)\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\\ &\cdot\mathrm{e}^{\mathrm{j}2\pi k\Delta f\tilde t}\,\mathrm{e}^{\mathrm{j}2\pi(f_{\rm d}+f_{{\rm d}k})(\tilde t+nT_{\rm r})} \end{aligned} \tag{5}$$

Here the second exponential term indicates that the echoes on different subcarriers carry different frequency offsets, and the third exponential term shows that the subcarrier index is coupled with both the slow time and the fast time.
Against an additive white Gaussian noise background, the observed signal can be expressed as

$$y(\tilde t,n)=s_{\rm r}(\tilde t,n)+w(\tilde t,n) \tag{6}$$

where

Sampling Eq. (6) in the fast-time domain, with sampling instants

$$\begin{aligned} s_{\rm r}(p,m,n)={}&\sum_{k=0}^{K-1}A_0 a_{k,m}\,\mathrm{rect}\left(\frac{mt_{\rm c}+pT_{\rm s}-\tau_0}{T_{\rm r}}\right)\\ &\cdot\mathrm{rect}\left(\frac{pT_{\rm s}-\tau_0}{t_{\rm c}}\right)\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\\ &\cdot\mathrm{e}^{\mathrm{j}2\pi(f_{\rm d}+k\Delta f)(mt_{\rm c}+pT_{\rm s})}\,\mathrm{e}^{\mathrm{j}2\pi(f_{\rm d}+f_{{\rm d}k})nT_{\rm r}} \end{aligned} \tag{7}$$
Let

$$Y=AS+W \tag{8}$$

where

Denote the vector of parameters to be estimated by

$$p(Y|u)=\prod_{n=0}^{N-1}\frac{1}{\pi^{MK}|R|}\,\mathrm{e}^{-(y_n-As_n)^{\rm H}R^{-1}(y_n-As_n)} \tag{9}$$

Eq. (8) is a model that is conditionally linear in $A$, whose maximum likelihood estimate is

$$\hat A=\frac{\displaystyle\sum_{n=0}^{N-1}s_n^{\rm H}R^{-1}y_n}{\displaystyle\sum_{n=0}^{N-1}s_n^{\rm H}R^{-1}s_n} \tag{10}$$
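Eq. (10) can be verified numerically: for noiseless data generated as $y_n = A s_n$ (taking $R=\sigma^2 I$, so the whitening cancels), the closed-form ratio recovers the complex amplitude. The sketch below uses synthetic vectors and only illustrates the conditionally linear structure.

```python
import numpy as np

# Numerical check of the conditionally linear estimate in Eq. (10). With
# R = sigma^2 * I the whitening cancels, and for noiseless y_n = A * s_n the
# ratio recovers the complex amplitude A. All data here are synthetic.
rng = np.random.default_rng(0)
N, L = 8, 16
A_true = 2.0 - 1.0j
s = rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))
y = A_true * s                                      # noiseless observations
num = sum(np.vdot(s[n], y[n]) for n in range(N))    # sum_n s_n^H y_n
den = sum(np.vdot(s[n], s[n]) for n in range(N))    # sum_n s_n^H s_n
A_hat = num / den
print(A_hat)  # close to (2-1j)
```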
Substituting this back into Eq. (9), the maximum likelihood estimates of the range and velocity parameters become

$$\begin{aligned} [\hat R,\hat v]&=\arg\min_{[R,v]}\sum_{n=0}^{N-1}(y_n-\hat A s_n)^{\rm H}R^{-1}(y_n-\hat A s_n)\\ &=\arg\max_{[R,v]}\left|\sum_{n=0}^{N-1}s_n^{\rm H}R^{-1}y_n\right|^2\Bigg/\sum_{n=0}^{N-1}s_n^{\rm H}R^{-1}s_n\\ &=(\sigma^2KN)^{-1}\arg\max_{[R,v]}\Bigg|\sum_{n=0}^{N-1}\sum_{p=0}^{K-1}\sum_{m=0}^{M-1}\sum_{k=0}^{K-1}y(p,m,n)\\ &\quad\cdot a_{k,m}^{*}\,\mathrm{e}^{\mathrm{j}2\pi k\Delta f\tau_0}\,\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm d}+k\Delta f)(mt_{\rm c}+pT_{\rm s})}\,\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm d}+f_{{\rm d}k})nT_{\rm r}}\Bigg|^2 \end{aligned} \tag{11}$$

For this model, the parameter estimates can be obtained through a coarse-to-fine grid search. Denote the velocity search range by
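The coarse-to-fine search can be sketched generically for a single parameter: each stage evaluates the objective on a coarse grid and then zooms into the best cell, so far fewer evaluations are needed than with one fine grid at the final resolution. The quadratic objective below is only a stand-in for the modulus-squared sum of Eq. (11).

```python
import numpy as np

# Coarse-to-fine grid search sketch for one parameter (velocity). Each level
# evaluates the cost on a grid, keeps the best point, and shrinks the search
# window around it before refining.
def coarse_to_fine(cost, lo, hi, levels=3, pts=21):
    best = lo
    for _ in range(levels):
        grid = np.linspace(lo, hi, pts)
        best = grid[np.argmax(cost(grid))]
        step = (hi - lo) / (pts - 1)
        lo, hi = best - step, best + step   # zoom into the winning cell
    return best

v_true = 123.4                              # hypothetical true velocity
cost = lambda v: -(v - v_true) ** 2         # toy unimodal objective
v_hat = coarse_to_fine(cost, 0.0, 500.0)
print(v_hat)
```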
Let

$$\mathrm{CRLB}(v)=\frac{c^2\sigma^2 N}{32(\pi f_{\rm c}T_{\rm r}A_0)^2\left[N^2(N-1)(2N-1)/6-N^2(N-1)^2/4\right]} \tag{12}$$

$$\mathrm{CRLB}(R)=\frac{3c^2\sigma^2}{8(\pi\Delta f A_0)^2 N(K^2-1)} \tag{13}$$
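Plugging illustrative numbers into the range bound of Eq. (13) gives a feel for its scale. The subcarrier spacing $\Delta f = B/K = 1\,$MHz follows from the simulation parameters of Table 1; $\sigma^2$ and $A_0$ are set to 1 for scale only.

```python
import numpy as np

# Scale check of the range CRLB in Eq. (13):
# CRLB(R) = 3 c^2 sigma^2 / (8 (pi * delta_f * A0)^2 * N * (K^2 - 1)).
# delta_f = B/K = 1 MHz is taken from Table 1; sigma^2 = A0 = 1 are assumptions.
c = 3e8                                  # propagation speed (m/s)
delta_f, N, K = 1e6, 250, 64
sigma2, A0 = 1.0, 1.0
crlb_R = 3 * c**2 * sigma2 / (8 * (np.pi * delta_f * A0)**2 * N * (K**2 - 1))
print(np.sqrt(crlb_R))  # lower bound on the range RMSE, a few centimetres
```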
Direct evaluation of the maximum likelihood estimate via Eq. (11) is computationally expensive, so a more efficient parameter estimation method is considered here. The phase term in Eq. (11)

$$\begin{aligned} x(\tilde t,n,k)={}&\sum_{m=0}^{M-1}A_0 a_{k,m}\,\mathrm{rect}\left(\frac{\tilde t-\tau_0}{T_{\rm r}}\right)\\ &\cdot\mathrm{rect}\left(\frac{\tilde t-\tau_0-mt_{\rm c}}{t_{\rm c}}\right)\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\\ &\cdot\mathrm{e}^{\mathrm{j}2\pi f_{\rm d}\tilde t}\,\mathrm{e}^{\mathrm{j}2\pi(f_{\rm d}+f_{{\rm d}k})nT_{\rm r}} \end{aligned} \tag{14}$$

where the fast-time sampling instants

Correlating Eq. (14) against the phase-coded reference signal gives the signal in the $i$-th range cell

$$\begin{aligned} z_k(\tilde t,n,i)={}&\sum_{m,l=0}^{M-1}A_0 a_{k,m}a_{k,l}^{*}\,\mathrm{rect}\left(\frac{\tilde t-\tau_0}{T_{\rm r}}-\left|\tau_0-it_{\rm c}\right|\right)\\ &\cdot\mathrm{rect}\left(\frac{\tilde t-\tau_0}{t_{\rm c}}-\left|\tau_0-(i+l-m)t_{\rm c}\right|\right)\\ &\cdot\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\,\mathrm{e}^{\mathrm{j}2\pi f_{\rm d}\tilde t}\,\mathrm{e}^{\mathrm{j}2\pi(f_{\rm d}+f_{{\rm d}k})nT_{\rm r}} \end{aligned} \tag{15}$$

At this point, the range resolution cell of the signal

$$\begin{aligned} g_k(\tilde t,h,i)&=z_k\!\left(\tilde t,\frac{f_{\rm c}}{f_{\rm c}+k\Delta f}h,i\right)\\ &\approx\sum_{m,l=0}^{M-1}A_0 a_{k,m}a_{k,l}^{*}\,\mathrm{rect}\left(\frac{\tilde t-\tau_0}{T_{\rm r}}-\left|\tau_0-it_{\rm c}\right|\right)\\ &\quad\cdot\mathrm{rect}\left(\frac{\tilde t-\tau_0}{t_{\rm c}}-\left|\tau_0-(i+l-m)t_{\rm c}\right|\right)\\ &\quad\cdot\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\,\mathrm{e}^{\mathrm{j}2\pi f_{\rm d}\tilde t}\,\mathrm{e}^{\mathrm{j}2\pi f_{\rm d}hT_{\rm r}} \end{aligned} \tag{16}$$
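The Keystone rescaling of Eq. (16) can be sketched with 1-D interpolation along the slow-time axis, one subcarrier at a time. The demonstration builds a tone whose Doppler scales with $(f_{\rm c}+k\Delta f)$, mimicking the coupled term, and checks that after rescaling every subcarrier carries the same Doppler; all numerical values are illustrative.

```python
import numpy as np

# Keystone transform sketch (Eq. (16)): for each subcarrier k the slow-time
# axis is resampled at n = f_c/(f_c + k*delta_f) * h via 1-D linear
# interpolation (an implementation choice; other interpolators also work).
def keystone(z, f_c, delta_f):
    """z: (K, N) array over subcarrier x slow time; returns the rescaled array."""
    K, N = z.shape
    h = np.arange(N)
    g = np.empty_like(z)
    for k in range(K):
        n_query = f_c / (f_c + k * delta_f) * h     # rescaled slow-time grid
        g[k] = np.interp(n_query, h, z[k].real) + 1j * np.interp(n_query, h, z[k].imag)
    return g

# A tone whose Doppler scales with (f_c + k*delta_f), as in the coupled term
# e^{j2pi(f_d + f_dk) n T_r}; after the transform every subcarrier carries
# (approximately) the same normalized Doppler fd.
f_c, delta_f, N, K, fd = 5e9, 1e6, 64, 8, 0.1   # fd in cycles per pulse
n = np.arange(N)
z = np.array([np.exp(2j * np.pi * fd * (f_c + k * delta_f) / f_c * n) for k in range(K)])
g = keystone(z, f_c, delta_f)
```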
After the Keystone transform, the signal may exhibit Doppler velocity ambiguity in the slow-time domain (
$$\begin{aligned} g_k(\tilde t,h,i)\approx{}&\sum_{m,l=0}^{M-1}A_0 a_{k,m}a_{k,l}^{*}\,\mathrm{rect}\left(\frac{\tilde t-\tau_0}{T_{\rm r}}-\left|\tau_0-it_{\rm c}\right|\right)\\ &\cdot\mathrm{rect}\left(\frac{\tilde t-\tau_0}{t_{\rm c}}-\left|\tau_0-(i+l-m)t_{\rm c}\right|\right)\\ &\cdot\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\,\mathrm{e}^{\mathrm{j}2\pi f_{\rm d}\tilde t}\,\mathrm{e}^{\mathrm{j}2\pi\tilde f_{\rm d}hT_{\rm r}}\,\mathrm{e}^{\mathrm{j}2\pi r\frac{f_{\rm c}}{f_{\rm c}+k\Delta f}h} \end{aligned} \tag{17}$$

The signal after folding-factor compensation

$$f_k(\tilde t,h,i)=g_k(\tilde t,h,i)\,\mathrm{e}^{-\mathrm{j}2\pi r\frac{f_{\rm c}}{f_{\rm c}+k\Delta f}h} \tag{18}$$

Applying pulse-Doppler processing to Eq. (18), i.e., coherent integration over the slow time, yields the Doppler spectrum on the $k$-th subcarrier

$$\begin{aligned} F_k(\tilde t,u,i)={}&\sum_{m,l=0}^{M-1}A_0 a_{k,m}a_{k,l}^{*}\,\mathrm{rect}\left(\frac{\tilde t-\tau_0}{T_{\rm r}}-\left|\tau_0-it_{\rm c}\right|\right)\\ &\cdot\mathrm{rect}\left(\frac{\tilde t-\tau_0}{t_{\rm c}}-\left|\tau_0-(i+l-m)t_{\rm c}\right|\right)\\ &\cdot\frac{\sin(\pi s)}{\sin(\pi s/N)}\,\mathrm{e}^{\mathrm{j}\pi\frac{N-1}{N}s}\,\mathrm{e}^{-\mathrm{j}2\pi(f_{\rm c}+k\Delta f)\tau_0}\,\mathrm{e}^{\mathrm{j}2\pi f_{\rm d}\tilde t} \end{aligned} \tag{19}$$

where

Further coherent integration over $k$ yields the range-Doppler spectrum:

$$\begin{aligned} F(\tilde t,u,i)={}&\sum_{m,l=0}^{M-1}A_0 a_{k,m}a_{k,l}^{*}\,\mathrm{rect}\left(\frac{\tilde t-\tau_0}{T_{\rm r}}-\left|\tau_0-it_{\rm c}\right|\right)\\ &\cdot\mathrm{rect}\left(\frac{\tilde t-\tau_0}{t_{\rm c}}-\left|\tau_0-(i+l-m)t_{\rm c}\right|\right)\\ &\cdot\frac{\sin(\pi s)}{\sin(\pi s/N)}\,\frac{\sin(\pi p)}{\sin(\pi p/K)}\,\mathrm{e}^{\mathrm{j}\pi\frac{N-1}{N}s}\,\mathrm{e}^{\mathrm{j}\pi\frac{K-1}{K}p}\,\mathrm{e}^{-\mathrm{j}2\pi f_{\rm c}\tau_0}\,\mathrm{e}^{\mathrm{j}2\pi f_{\rm d}\tilde t} \end{aligned} \tag{20}$$

where
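The two coherent-integration steps, an FFT over slow time per subcarrier as in Eq. (19) followed by a sum over subcarriers as in Eq. (20), can be sketched on a synthetic subcarrier-by-pulse data cube containing a single, already-decoupled Doppler tone:

```python
import numpy as np

# Two-stage coherent integration on a synthetic (K subcarriers x N pulses)
# data cube holding one decoupled Doppler tone at bin d_true: an FFT over slow
# time per subcarrier, then a coherent sum over subcarriers. The energy piles
# up into a single Doppler bin of height K*N.
K, N, d_true = 16, 64, 10
n = np.arange(N)
cube = np.array([np.exp(2j * np.pi * d_true * n / N) for _ in range(K)])

doppler = np.fft.fft(cube, axis=1)   # slow-time (pulse-Doppler) integration
rd = doppler.sum(axis=0)             # coherent integration across subcarriers
print(int(np.argmax(np.abs(rd))))    # 10
```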
The method above, combining channel separation with the Keystone transform, is a simplified treatment of the model in Eq. (11). To further improve estimation accuracy, the parameter values so obtained are taken as initial values and Newton iteration is applied according to Eq. (11), yielding the approximate maximum likelihood estimate. The computational cost of the algorithm is approximately
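A minimal sketch of the Newton refinement step: the coarse estimate serves as the starting point, and a few Newton updates on a smooth one-dimensional surrogate objective converge to the peak. Central-difference derivatives are an implementation choice assumed here, not specified in the text.

```python
import numpy as np

# Newton refinement sketch: start from the coarse (grid/Keystone) estimate and
# apply a few Newton updates to climb a smooth 1-D objective.
def newton_refine(f, x0, steps=5, eps=1e-4):
    x = x0
    for _ in range(steps):
        g = (f(x + eps) - f(x - eps)) / (2 * eps)           # f'(x)
        h = (f(x + eps) - 2 * f(x) + f(x - eps)) / eps**2   # f''(x)
        x = x - g / h                                       # Newton update
    return x

peak = 3.7                                   # hypothetical true parameter
f = lambda x: np.exp(-(x - peak) ** 2)       # toy likelihood surface
x_hat = newton_refine(f, x0=3.5)             # coarse estimate as start
print(x_hat)  # close to 3.7
```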
The models above are discussed for a single target. In the multi-target case, a CFAR detector can be used to search for peaks in the range-Doppler spectrum, and the sub-signal corresponding to the parameter estimates of the largest peak is reconstructed

$$b(\tilde t,n)=\sum_{k=0}^{K-1}\mathrm{e}^{\mathrm{j}2\pi k\Delta f\hat\tau_0}\,\mathrm{e}^{-\mathrm{j}2\pi(\hat f_{\rm d}+k\Delta f)\tilde t}\,\mathrm{e}^{-\mathrm{j}2\pi(\hat f_{\rm d}+\hat f_{{\rm d}k})nT_{\rm r}} \tag{21}$$

The CLEAN technique then subtracts the reconstructed sub-signal from the original signal, and the above procedure is repeated on the residual until the CFAR detector finds no further peaks. The processing flow of the algorithm is shown in Fig. 1.
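The CLEAN loop can be sketched as: find the strongest spectral peak, reconstruct that component, subtract it, and repeat until the residual peak drops below a threshold (a simple stand-in for the CFAR decision in the text):

```python
import numpy as np

# CLEAN-style extraction sketch: take the strongest spectral peak, reconstruct
# that component, subtract it from the signal, and repeat until the residual
# peak falls below the threshold.
def clean_peaks(x_in, threshold):
    x = x_in.copy()
    N = len(x)
    found = []
    while True:
        X = np.fft.fft(x)
        b = int(np.argmax(np.abs(X)))
        if np.abs(X[b]) < threshold:
            break                                    # nothing left to detect
        found.append(b)
        amp = X[b] / N
        x = x - amp * np.exp(2j * np.pi * b * np.arange(N) / N)  # subtract it
    return found

N = 64
n = np.arange(N)
sig = 3 * np.exp(2j * np.pi * 5 * n / N) + np.exp(2j * np.pi * 20 * n / N)
print(sorted(clean_peaks(sig, threshold=0.5 * N)))  # [5, 20]
```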
The simulation parameters are listed in Table 1. From the table and the preceding analysis, the threshold at which the target velocity becomes ambiguous is
| Parameter | Value |
|---|---|
| Carrier frequency ${f_{\rm c}}$ (GHz) | 5 |
| Bandwidth $B$ (MHz) | 64 |
| Pulse repetition interval ${T_{\rm r}}$ (ms) | 2 |
| Number of pulses $N$ | 250 |
| Number of subcarriers $K$ | 64 |
After channel separation and correlation processing, the projection of the signal onto the subcarrier-Doppler plane is shown in Fig. 2. The target's Doppler shift varies with the subcarrier, so coherent integration in the subcarrier domain cannot be performed directly.

After the Keystone transform and CLEAN processing, the corresponding subcarrier-Doppler projection is shown in Fig. 3: the coupling between the Doppler shift and the subcarrier has been corrected.

Coherent integration of the Keystone-transformed data across the subcarrier domain yields the two-dimensional range-Doppler spectrum shown in Fig. 4, where Fig. 4(a) focuses on the first target and Fig. 4(b) focuses on the third target after the first two have been removed. The results show that the proposed compensation method accumulates the target echo energy across the fast-time, slow-time, and subcarrier domains, benefiting target detection and subsequent parameter estimation.
We now examine the performance of the approximate maximum likelihood estimator. When

Fig. 5(a) shows that the velocity-estimation RMSE curves of the method of Ref. [8] (dash-dotted line) and of the combined channel-separation/Keystone method (solid line) follow similar trends. For the same estimation accuracy, compared with these two methods, the input SNR required by the proposed channel-separation-based approximate maximum likelihood method (dotted line) is
This paper applies OFDM communication signals to radar moving-target detection, a direction with promising applications in the context of integrated communication and radar. Drawing on the channel-separation approach by which multi-carrier-frequency MIMO radar obtains high-resolution range information, we combine the orthogonal multicarrier structure of the OFDM signal with pulse-Doppler processing and use the Keystone transform to resolve the Doppler migration problem. To obtain better estimation accuracy, Newton iteration is used to optimize the likelihood function, yielding a channel-separation-based approximate maximum likelihood estimation method. Simulation results verify the overall performance of the algorithm. Future work will address coherent processing and parameter estimation for maneuvering targets, extending the range of applications of OFDM radar.
[1] KRIZHEVSKY A, SUTSKEVER I, and HINTON G E. ImageNet classification with deep convolutional neural networks[C]. The 25th International Conference on Neural Information Processing Systems, Lake Tahoe, USA, 2012: 1006–1114.
[2] SIMONYAN K and ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[C]. 3rd International Conference on Learning Representations, San Diego, USA, 2015: 1–14. doi: 10.48550/arXiv.1409.1556.
[3] HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C]. 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, 2016: 770–778. doi: 10.1109/CVPR.2016.90.
[4] TAN Mingxing and LE Q. EfficientNet: Rethinking model scaling for convolutional neural networks[C]. The 36th International Conference on Machine Learning, Long Beach, USA, 2019: 6105–6114.
[5] LIU Ze, LIN Yutong, CAO Yue, et al. Swin transformer: Hierarchical vision transformer using shifted windows[C]. 2021 IEEE/CVF International Conference on Computer Vision, Montreal, Canada, 2021: 9992–10002. doi: 10.1109/ICCV48922.2021.00986.
[6] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: Transformers for image recognition at scale[C]. 9th International Conference on Learning Representations, 2021: 1–22. https://iclr.cc/virtual/2021/index.html.
[7] CHEN Sizhe, WANG Haipeng, XU Feng, et al. Target classification using the deep convolutional networks for SAR images[J]. IEEE Transactions on Geoscience and Remote Sensing, 2016, 54(8): 4806–4817. doi: 10.1109/TGRS.2016.2551720.
[8] YU Lingjuan, WANG Yadong, XIE Xiaochun, et al. SAR ATR based on FCNN and ICAE[J]. Journal of Radars, 2018, 7(5): 622–631. doi: 10.12000/JR18066.
[9] ZHAO Pengfei and HUANG Lijia. Target recognition method for multi-aspect synthetic aperture radar images based on EfficientNet and BiGRU[J]. Journal of Radars, 2021, 10(6): 895–904. doi: 10.12000/JR20133.
[10] HUANG Xiayuan, YANG Qiao, and QIAO Hong. Lightweight two-stream convolutional neural network for SAR target recognition[J]. IEEE Geoscience and Remote Sensing Letters, 2021, 18(4): 667–671. doi: 10.1109/LGRS.2020.2983718.
[11] LIU Jiaming, XING Mengdao, YU Hanwen, et al. EFTL: Complex convolutional networks with electromagnetic feature transfer learning for SAR target recognition[J]. IEEE Transactions on Geoscience and Remote Sensing, 2022, 60: 5209811. doi: 10.1109/TGRS.2021.3083261.
[12] ZHANG Tianwen, ZHANG Xiaoling, KE Xiao, et al. HOG-ShipCLSNet: A novel deep learning network with HOG feature fusion for SAR ship classification[J]. IEEE Transactions on Geoscience and Remote Sensing, 2022, 60: 5210322. doi: 10.1109/TGRS.2021.3082759.
[13] QOSJA D, WAGNER S, and BRÜGGENWIRTH S. Benchmarking convolutional neural network backbones for target classification in SAR[C]. 2023 IEEE Radar Conference, San Antonio, USA, 2023: 1–6. doi: 10.1109/RadarConf2351548.2023.10149802.
[14] LIU Zhuang, MAO Hanzi, WU Chaoyuan, et al. A ConvNet for the 2020s[C]. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, USA, 2022: 11966–11976. doi: 10.1109/CVPR52688.2022.01167.
[15] ZHANG Yipeng, LU Dongdong, QIU Xiaolan, et al. Few-shot ship classification of SAR images via scattering point topology and dual-branch convolutional neural network[J]. Journal of Radars, 2024, 13(2): 411–427. doi: 10.12000/JR23172.
[16] LUAN Shangzhen, CHEN Chen, ZHANG Baochang, et al. Gabor convolutional networks[J]. IEEE Transactions on Image Processing, 2018, 27(9): 4357–4366. doi: 10.1109/TIP.2018.2835143.
[17] XU Feng and JIN Yaqiu. Microwave vision and intelligent perception of radar imagery[J]. Journal of Radars, 2024, 13(2): 285–306. doi: 10.12000/JR23225.
[18] GERRY M J, POTTER L C, GUPTA I J, et al. A parametric model for synthetic aperture radar measurements[J]. IEEE Transactions on Antennas and Propagation, 1999, 47(7): 1179–1188. doi: 10.1109/8.785750.
[19] POTTER L C and MOSES R L. Attributed scattering centers for SAR ATR[J]. IEEE Transactions on Image Processing, 1997, 6(1): 79–91. doi: 10.1109/83.552098.
[20] LI Fei. Study on target feature extraction based on radar image[D]. [Ph.D. dissertation], Xidian University, 2014.
[21] ROSS T D, WORRELL S W, VELTEN V J, et al. Standard SAR ATR evaluation experiments using the MSTAR public release data set[C]. SPIE 3370, Algorithms for Synthetic Aperture Radar Imagery V, Orlando, USA, 1998: 566–573. doi: 10.1117/12.321859.
[22] HUANG Lanqing, LIU Bin, LI Boying, et al. OpenSARShip: A dataset dedicated to Sentinel-1 ship interpretation[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2018, 11(1): 195–208. doi: 10.1109/JSTARS.2017.2755672.
[23] SUN Yongguang, DU Lan, WANG Yan, et al. SAR automatic target recognition based on dictionary learning and joint dynamic sparse representation[J]. IEEE Geoscience and Remote Sensing Letters, 2016, 13(12): 1777–1781. doi: 10.1109/LGRS.2016.2608578.
[24] DENG Sheng, DU Lan, LI Chen, et al. SAR automatic target recognition based on Euclidean distance restricted autoencoder[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2017, 10(7): 3323–3333. doi: 10.1109/JSTARS.2017.2670083.
[25] NI Jiacheng and XU Yuelei. SAR automatic target recognition based on a visual cortical system[C]. 2013 6th International Congress on Image and Signal Processing, Hangzhou, China, 2013: 778–782. doi: 10.1109/CISP.2013.6745270.
[26] LI Yi, DU Lan, and WEI Di. Multiscale CNN based on component analysis for SAR ATR[J]. IEEE Transactions on Geoscience and Remote Sensing, 2022, 60: 5211212. doi: 10.1109/TGRS.2021.3100137.