
Trip-GhostNet for Hyperspectral Image Classification


Published under licence by IOP Publishing Ltd
Citation: Zitong Zhang et al 2021 J. Phys.: Conf. Ser. 2024 012006. DOI: 10.1088/1742-6596/2024/1/012006


Abstract

The classification of hyperspectral remote sensing images (HSI) is an important task in hyperspectral image processing and application. Convolutional neural networks (CNNs) have significant advantages in extracting and fusing the spectral and spatial features of hyperspectral images, and have therefore become a common approach to HSI classification. However, the spectral features of hyperspectral images are highly redundant and their spatial structure is complex, which makes CNN-based feature extraction time-consuming and memory-intensive. To address this problem, lightweight networks with fewer parameters have gradually become the main approach in HSI classification. To reduce the computational complexity of deep feature extraction while still exploiting shallow features, a high-order nonlinear Ghost module is proposed on the basis of the original linear Ghost transformation module. Furthermore, because each dimension of an HSI cube carries largely independent information, a Trip-GhostNet is proposed that extracts and fuses features from the three dimensions simultaneously in a lightweight manner. According to the distribution characteristics of HSI, the influence of different attention embedding positions within the high-order Ghost module of each branch on feature extraction is compared and analyzed. The results show that the proposed model reduces computation while improving classification accuracy, making it well suited to HSI classification problems.
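The high-order nonlinear Ghost module extends the GhostNet idea, in which most feature maps are produced by cheap linear transformations rather than full convolutions. Since the abstract does not reproduce the module itself, the following is only a minimal PyTorch-style sketch of the original (linear) Ghost module that the proposed high-order variant builds on; the class and parameter names are illustrative, and the Trip-GhostNet three-branch structure and attention embedding are not shown.

    import torch
    import torch.nn as nn

    class GhostModule(nn.Module):
        # A standard convolution produces a small set of "intrinsic" feature maps;
        # cheap depthwise convolutions (the linear transformations of GhostNet)
        # generate the remaining "ghost" maps, which are concatenated with the
        # intrinsic ones. ratio=2 means half of the output channels are ghosts.
        def __init__(self, in_channels, out_channels, ratio=2, kernel_size=1, dw_size=3):
            super().__init__()
            init_channels = out_channels // ratio          # intrinsic maps
            new_channels = out_channels - init_channels    # ghost maps
            self.primary_conv = nn.Sequential(
                nn.Conv2d(in_channels, init_channels, kernel_size,
                          padding=kernel_size // 2, bias=False),
                nn.BatchNorm2d(init_channels),
                nn.ReLU(inplace=True),
            )
            self.cheap_operation = nn.Sequential(
                nn.Conv2d(init_channels, new_channels, dw_size,
                          padding=dw_size // 2, groups=init_channels, bias=False),
                nn.BatchNorm2d(new_channels),
                nn.ReLU(inplace=True),
            )

        def forward(self, x):
            intrinsic = self.primary_conv(x)
            ghost = self.cheap_operation(intrinsic)
            return torch.cat([intrinsic, ghost], dim=1)

    # Example: a 2D branch operating on a spatial slice of an HSI patch.
    # x = torch.randn(8, 32, 11, 11); GhostModule(32, 64)(x).shape -> (8, 64, 11, 11)

Because the depthwise "cheap" convolutions cost far fewer multiply-adds than generating all output maps with full convolutions, this kind of module is what allows the parameter and computation savings the abstract refers to.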


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
