A Low-Complexity Transformer-CNN Hybrid Model Combining Dynamic Attention for Remote Sensing Image Compression

dc.contributor.author: Zhang, L. L.
dc.contributor.author: Wang, X. J.
dc.contributor.author: Liu, J. H.
dc.contributor.author: Fang, Q. Z.
dc.coverage.issue: 4 [cs]
dc.coverage.volume: 33 [cs]
dc.date.accessioned: 2025-04-04T12:26:47Z
dc.date.available: 2025-04-04T12:26:47Z
dc.date.issued: 2024-12 [cs]
dc.description.abstract: Deep learning-based methods have recently made enormous progress in remote sensing image compression. However, it is difficult for a conventional CNN to adaptively capture important information from different image regions. In addition, previous transformer-based compression methods have introduced high computational complexity into the models. Remote sensing images contain rich spatial and channel information, and effectively extracting both kinds of information for image compression remains challenging. To address these issues, we propose a new low-complexity end-to-end image compression framework that combines a CNN and a transformer. The framework includes two critical modules: the Dynamic Attention Model (DAM) and the Hyper-Prior Hybrid Attention Model (HPHAM). By employing dynamic convolution as its core component, the DAM can dynamically adjust attention weights according to the image content. The HPHAM effectively integrates non-local and channel information of the latent representation by running Gated Channel Attention (GCA) and multi-head self-attention in parallel. Experiments demonstrate that the proposed approach outperforms existing mainstream deep learning-based and conventional image compression methods, achieving the best rate-distortion performance on three datasets. Code is available at https://github.com/jiahuiLiu11/LTCHM. [en]
dc.format: text [cs]
dc.format.extent: 642-659 [cs]
dc.format.mimetype: application/pdf [en]
dc.identifier.citation: Radioengineering. 2024 vol. 33, iss. 4, s. 642-659. ISSN 1210-2512 [cs]
dc.identifier.doi: 10.13164/re.2024.0642 [en]
dc.identifier.issn: 1210-2512
dc.identifier.uri: https://hdl.handle.net/11012/250813
dc.language.iso: en [cs]
dc.publisher: Radioengineering society [cs]
dc.relation.ispartof: Radioengineering [cs]
dc.relation.uri: https://www.radioeng.cz/fulltexts/2024/24_04_0642_0659.pdf [cs]
dc.rights: Creative Commons Attribution 4.0 International license [en]
dc.rights.access: openAccess [en]
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/ [en]
dc.subject: Remote sensing image compression [en]
dc.subject: dynamic convolution [en]
dc.subject: attention mechanism [en]
dc.subject: gating mechanism [en]
dc.title: A Low-Complexity Transformer-CNN Hybrid Model Combining Dynamic Attention for Remote Sensing Image Compression [en]
dc.type.driver: article [en]
dc.type.status: Peer-reviewed [en]
dc.type.version: publishedVersion [en]
eprints.affiliatedInstitution.faculty: Fakulta elektrotechniky a komunikačních technologií [cs]
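
The abstract above describes two architectural modules: the DAM, which uses dynamic convolution to compute content-adaptive attention weights, and the HPHAM, which runs Gated Channel Attention (GCA) and multi-head self-attention in parallel over the latent representation. Below is a minimal PyTorch-style sketch of how such modules could be wired. The class names, number of candidate kernels, head count, and fusion choices are illustrative assumptions, not the authors' implementation; the official code is at the GitHub link in the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicConv2d(nn.Module):
    """Dynamic convolution sketch: K candidate kernels mixed by content-dependent weights."""

    def __init__(self, channels, kernel_size=3, num_kernels=4):
        super().__init__()
        self.channels = channels
        self.kernel_size = kernel_size
        # K candidate kernels, shape (K, C_out, C_in, k, k)
        self.weight = nn.Parameter(
            0.02 * torch.randn(num_kernels, channels, channels, kernel_size, kernel_size))
        # Gating network: global average pooling -> linear -> softmax over the K kernels
        self.gate = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(channels, num_kernels))

    def forward(self, x):
        b, c, h, w = x.shape
        attn = F.softmax(self.gate(x), dim=1)                        # (B, K) per-image mixing weights
        # Aggregate the K candidate kernels into one kernel per sample
        weight = torch.einsum("bk,koihw->boihw", attn, self.weight)  # (B, C, C, k, k)
        weight = weight.reshape(b * c, c, self.kernel_size, self.kernel_size)
        # Grouped convolution trick: one group per sample applies its own aggregated kernel
        out = F.conv2d(x.reshape(1, b * c, h, w), weight,
                       padding=self.kernel_size // 2, groups=b)
        return out.reshape(b, c, h, w)


class DynamicAttentionModel(nn.Module):
    """DAM sketch: a dynamic convolution produces a content-adaptive attention map."""

    def __init__(self, channels):
        super().__init__()
        self.dyn_conv = DynamicConv2d(channels)

    def forward(self, x):
        attn = torch.sigmoid(self.dyn_conv(x))  # attention weights depend on image content
        return x + x * attn                     # re-weight features, keep a residual path


class GatedChannelAttention(nn.Module):
    """GCA sketch: squeeze global channel statistics, then gate each channel."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
                                nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        gate = self.fc(x.mean(dim=(2, 3)))  # (B, C) channel-wise gate
        return x * gate[:, :, None, None]


class HyperPriorHybridAttention(nn.Module):
    """HPHAM sketch: GCA (channel information) and multi-head self-attention
    (non-local information) run in parallel on the latent, then are fused."""

    def __init__(self, channels, num_heads=4):
        super().__init__()
        self.gca = GatedChannelAttention(channels)
        self.mhsa = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, y):
        b, c, h, w = y.shape
        tokens = y.flatten(2).transpose(1, 2)                 # (B, H*W, C) spatial tokens
        nonlocal_feat, _ = self.mhsa(tokens, tokens, tokens)  # global (non-local) mixing
        nonlocal_feat = nonlocal_feat.transpose(1, 2).reshape(b, c, h, w)
        channel_feat = self.gca(y)                            # channel-wise re-weighting
        return y + self.fuse(torch.cat([channel_feat, nonlocal_feat], dim=1))


if __name__ == "__main__":
    latent = torch.randn(2, 64, 16, 16)                 # toy latent representation
    print(DynamicAttentionModel(64)(latent).shape)      # torch.Size([2, 64, 16, 16])
    print(HyperPriorHybridAttention(64)(latent).shape)  # torch.Size([2, 64, 16, 16])

In this sketch, the dynamic convolution mixes K candidate kernels with per-image softmax weights, which is what lets the attention map adapt to the image content; the parallel GCA and self-attention branches of HyperPriorHybridAttention capture channel-wise and non-local dependencies, respectively, before a 1x1 convolution fuses them.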

Files

Original bundle

Name: 24_04_0642_0659.pdf
Size: 7.3 MB
Format: Adobe Portable Document Format
