Deep learning for typhoon intensity classification using satellite cloud images


Abstract

The tropical cyclone, also known as a typhoon, is one of the most destructive weather phenomena. Its intense cyclonic eddy circulations often cause serious damage to coastal areas. Accurate classification or prediction of typhoon intensity is therefore crucial for disaster warning and mitigation management. However, extracting intensity-related features is challenging: it requires significant pre-processing and human intervention, and recognition rates are poor because of physical factors such as tropical disturbances. In this study, we built Typhoon-CNNs, an automatic typhoon intensity classifier based on a convolutional neural network (CNN). The Typhoon-CNNs framework uses a cyclical convolution strategy supplemented with dropout zero-setting, which extracts sensitive features of the spiral cloud band (SCB) more effectively and reduces over-fitting. To further optimize performance, we also propose an improved activation function (T-ReLU) and an improved loss function (CE-FMCE). The improved Typhoon-CNNs was trained and validated on more than 10,000 multi-sensor satellite cloud images from the National Institute of Informatics, reaching a classification accuracy of 88.74%, which is 7.43% higher than ResNet50, 10.27% higher than InceptionV3, and 14.71% higher than VGG16. Finally, by visualizing the hierarchical feature maps derived from Typhoon-CNNs, we can readily identify sensitive characteristics such as typhoon eyes, dense shadowing cloud areas, and SCBs, which facilitates classifying and forecasting typhoon intensity.
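For readers who want a concrete starting point, the sketch below shows a minimal CNN-based typhoon intensity classifier in PyTorch. It is only an illustration of the general approach described in the abstract, not the authors' Typhoon-CNNs: the paper's cyclical convolution strategy, T-ReLU activation, and CE-FMCE loss are not defined here, so standard ReLU, dropout, and cross-entropy are used as stand-ins, and the number of intensity classes and the image size are assumptions.

# Minimal sketch only: a generic CNN typhoon-intensity classifier in PyTorch.
# NOT the authors' Typhoon-CNNs; T-ReLU and CE-FMCE from the paper are not
# reproduced here, and NUM_CLASSES / image size are hypothetical choices.
import torch
import torch.nn as nn

NUM_CLASSES = 6  # assumption: number of intensity categories (not stated in the abstract)

class SimpleTyphoonCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # single-channel IR cloud image
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),                       # global pooling to a 128-d descriptor
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=0.5),                             # dropout against over-fitting, as in the abstract
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = SimpleTyphoonCNN()
    dummy = torch.randn(4, 1, 128, 128)                    # batch of 4 hypothetical 128x128 cloud images
    logits = model(dummy)
    loss = nn.CrossEntropyLoss()(logits, torch.randint(0, NUM_CLASSES, (4,)))
    print(logits.shape, loss.item())

In practice one would replace the random tensors with labeled satellite cloud images and substitute the paper's specialized activation and loss functions once their definitions are available from the full text.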

Publisher: CrossRef
ISSN: 0739-0572
DOI: 10.1175/jtech-d-19-0207.1

Journal: Journal of Atmospheric and Oceanic Technology

Published: Jun 9, 2021

There are no references for this article.