A Novel Tree Model-based DNN to Achieve a~High-Resolution DOA Estimation via Massive MIMO Receive Array

Date
2024-12
Authors
Li, Y.
Shu, F.
Song, Y.
Wang, J.
ORCID
Advisor
Referee
Mark
Journal Title
Journal ISSN
Volume Title
Publisher
Radioengineering Society
Abstract
To satisfy the high-resolution requirements of direction-of-arrival (DOA) estimation, conventional deep neural network (DNN)-based methods built on the grid idea must significantly increase the number of output classes, which leads to very high model complexity. To address this problem, a multi-level tree-based DNN model (TDNN) is proposed as an alternative, where each level uses small-scale multi-layer neural networks (MLNNs) as nodes to divide the target angular interval into multiple sub-intervals, and each output class is associated with an MLNN at the next level. The number of MLNNs thus grows gradually from the first level to the last, so increasing the depth of the tree dramatically raises the number of output classes and improves the estimation accuracy. More importantly, the network is extended to multi-emitter DOA estimation. Simulation results show that the proposed TDNN performs much better than the conventional DNN and the root multiple signal classification (root-MUSIC) algorithm at extremely low signal-to-noise ratio (SNR) with a massive multiple-input multiple-output (MIMO) receive array, and can achieve the Cramer-Rao lower bound (CRLB). Additionally, in the multi-emitter scenario, the proposed Q-TDNN also yields a substantial performance gain over DNN and root-MUSIC, and this gain grows as the number of emitters increases.
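The multi-level refinement described in the abstract can be sketched as a toy in Python. This is an illustrative sketch only: the oracle node below stands in for each trained MLNN (which in the paper would infer the sub-interval from receive-array features), and all names and parameter values (k, depth, the angular interval) are assumptions, not the authors' code.

```python
def split(interval, k):
    """Divide the half-open interval [lo, hi) into k equal sub-intervals."""
    lo, hi = interval
    step = (hi - lo) / k
    return [(lo + i * step, lo + (i + 1) * step) for i in range(k)]

def oracle_node(sub_intervals, true_angle):
    """Stand-in for a trained MLNN node: returns the class index of the
    sub-interval containing the true angle. A real node would predict
    this class from the array's received-signal features."""
    for i, (lo, hi) in enumerate(sub_intervals):
        if lo <= true_angle < hi:
            return i
    return len(sub_intervals) - 1  # clamp to the last sub-interval

def tdnn_estimate(true_angle, interval=(-90.0, 90.0), k=4, depth=5):
    """Descend the tree: depth levels of k-way splits yield k**depth
    effective output classes while each level only runs one small node."""
    for _ in range(depth):
        subs = split(interval, k)
        interval = subs[oracle_node(subs, true_angle)]
    lo, hi = interval
    return 0.5 * (lo + hi)  # report the center of the final sub-interval

est = tdnn_estimate(37.3)
# final angular resolution is (180 degrees) / 4**5, i.e. about 0.176 degrees
```

The point of the structure is visible in the parameters: with k = 4 and depth = 5 the tree distinguishes 4**5 = 1024 angular classes, but no single network ever has more than 4 outputs.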
Description
Citation
Radioengineering. 2024 vol. 33, iss. 4, s. 563-570. ISSN 1210-2512
https://www.radioeng.cz/fulltexts/2024/24_04_0563_0570.pdf
Document type
Peer-reviewed
Document version
Published version
Date of access to the full text
Language of document
en
Study field
Committee
Date of acceptance
Defence
Result of defence
Document licence
Creative Commons Attribution 4.0 International license
http://creativecommons.org/licenses/by/4.0/
Collections