DenseHillNet: a lightweight CNN for accurate classification of natural images

PeerJ Comput Sci. 2024 Apr 22:10:e1995. doi: 10.7717/peerj-cs.1995. eCollection 2024.

Abstract

The detection of natural images, such as glaciers and mountains, has practical applications in transportation automation and outdoor activities. Convolutional neural networks (CNNs) have been widely employed for image recognition and classification tasks. While previous studies have focused on fruits, landslides, and medical images, further research is needed on the detection of natural images, particularly glaciers and mountains. To address the limitations of traditional CNNs, such as vanishing gradients and the need for many layers, this work introduces a novel model called DenseHillNet, a CNN with densely connected layers that accurately classifies images as glaciers or mountains. The model contributes to the development of automation technologies in transportation and outdoor activities. The dataset used in this study comprises 3,096 images in each of the "glacier" and "mountain" categories. A rigorous methodology was employed for dataset preparation and model training, ensuring the validity of the results. A comparison with previous work revealed that the proposed DenseHillNet model, trained on both glacier and mountain images, achieved higher accuracy (86%) than a CNN model trained only on glacier images (72%). The intended audience of this article is researchers and graduate students.
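The article does not reproduce the architecture here, but a minimal sketch can illustrate the densely connected design the abstract describes: each convolutional layer receives the concatenated feature maps of all preceding layers, which mitigates vanishing gradients without requiring very deep stacks. The sketch below assumes PyTorch; all layer counts, channel sizes, the input resolution, and the class names DenseHillNetSketch/DenseBlock are illustrative assumptions, not the authors' exact model.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Each conv layer takes the concatenation of all earlier outputs
    (DenseNet-style connectivity, as suggested by the abstract)."""
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
            ))
            channels += growth_rate  # inputs grow with each added layer

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)

class DenseHillNetSketch(nn.Module):
    """Hypothetical binary classifier: glacier vs. mountain."""
    def __init__(self, growth_rate=12, num_layers=4):
        super().__init__()
        self.stem = nn.Conv2d(3, 32, kernel_size=3, padding=1)
        self.block = DenseBlock(32, growth_rate, num_layers)
        out_channels = 32 + growth_rate * num_layers
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(out_channels, 2),  # two classes
        )

    def forward(self, x):
        return self.head(self.block(self.stem(x)))

# Usage: logits for a batch of eight RGB images (resolution assumed)
model = DenseHillNetSketch()
logits = model(torch.randn(8, 3, 150, 150))  # shape: (8, 2)
```

The design choice worth noting is the concatenation in the forward pass: because every layer has a short path to the input and to the loss, gradients flow more directly than in a plain sequential CNN, which is the limitation of traditional CNNs the abstract says DenseHillNet addresses.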

Keywords: AI; Algorithm and analysis; CNN; Classification; Computer aided design; Computer network & communication; DL; DenseHillNet; ML.

Grants and funding

This work was supported by the Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2024R235), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.