Underwater Image Enhancement using GAN


Authors : Sabnam Jebin C.; Rahamathulla K

Volume/Issue : Volume 9 - 2024, Issue 6 - June


Google Scholar : https://tinyurl.com/mwjnuj8a

Scribd : https://tinyurl.com/mrbfjfsa

DOI : https://doi.org/10.38124/ijisrt/IJISRT24JUN403

Note : A published paper may take 4-5 working days from the publication date to appear in PlumX Metrics, Semantic Scholar, and ResearchGate.


Abstract : Underwater image enhancement is the process of restoring distorted underwater images to clear images. The distorted images are raw underwater images captured in the deep portions of oceans, rivers, and similar water bodies using different cameras. Underwater images are mainly used in underwater robotics, ocean pasture and environmental monitoring, ocean exploration, and related applications. The enhancement process uses an underwater image dataset containing the distorted (raw underwater) images together with their corresponding enhanced images. Existing image enhancement methods do not give satisfactory results on underwater images, so a new method based on a Generative Adversarial Network (GAN) is proposed, which learns from this paired dataset to generate enhanced versions of the raw images.

Keywords : Generative Adversarial Network (GAN), Underwater Image Enhancement.
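The abstract describes training a GAN on a paired dataset of raw and enhanced underwater images. As a rough illustration of that kind of setup, the sketch below assumes a pix2pix-style conditional GAN in PyTorch; the network sizes, the loss weights, and the Generator/Discriminator/train_step names are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch of a paired (pix2pix-style) GAN for underwater image
# enhancement, assuming a dataset of (raw, enhanced) image pairs.
# All layer sizes and loss weights below are placeholders, not the
# configuration used in the paper.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Small encoder-decoder that maps a raw underwater image to an enhanced one."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """PatchGAN-style critic that scores (raw, candidate-enhanced) image pairs."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
            nn.Conv2d(128, 1, 4, stride=1, padding=1),
        )

    def forward(self, raw, enhanced):
        # Condition the critic on the raw input by channel-wise concatenation.
        return self.net(torch.cat([raw, enhanced], dim=1))

def train_step(gen, disc, g_opt, d_opt, raw, target, l1_weight=100.0):
    """One adversarial update on a batch of paired raw/enhanced images."""
    bce = nn.BCEWithLogitsLoss()
    l1 = nn.L1Loss()

    # Discriminator: real pairs -> 1, generated pairs -> 0.
    fake = gen(raw).detach()
    d_real = disc(raw, target)
    d_fake = disc(raw, fake)
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator: fool the discriminator while staying close to the reference image.
    fake = gen(raw)
    d_fake = disc(raw, fake)
    g_loss = bce(d_fake, torch.ones_like(d_fake)) + l1_weight * l1(fake, target)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    gen, disc = Generator(), Discriminator()
    g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4, betas=(0.5, 0.999))
    d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4, betas=(0.5, 0.999))
    # Dummy batch standing in for a real paired underwater dataset loader.
    raw = torch.rand(2, 3, 64, 64) * 2 - 1
    target = torch.rand(2, 3, 64, 64) * 2 - 1
    print(train_step(gen, disc, g_opt, d_opt, raw, target))
```

In this paired setting, the L1 term keeps the generated image close to the reference enhancement while the adversarial term pushes it toward realistic colour and contrast; unpaired variants instead rely on cycle-consistency constraints rather than a pixel-wise loss.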

References :

  1. K. Panetta, L. Kezebou, V. Oludare, and S. Agaian, “Comprehensive underwater object tracking benchmark dataset and underwater image enhancement with GAN,” IEEE Journal of Oceanic Engineering, vol. 47, no. 1, pp. 59–75, 2021.
  2. M. J. Islam, Y. Xia, and J. Sattar, “Fast underwater image enhancement for improved visual perception,” IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 3227–3234, 2020.
  3. Y. Guo, H. Li, and P. Zhuang, “Underwater image enhancement using a multiscale dense generative adversarial network,” IEEE Journal of Oceanic Engineering, vol. 45, no. 3, pp. 862–870, 2019.
  4. P. Hambarde, S. Murala, and A. Dhall, “UW-GAN: Single-image depth estimation and image enhancement for underwater images,” IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–12, 2021.
  5. A. Dudhane, P. W. Patil, and S. Murala, “An end-to-end network for image de-hazing and beyond,” IEEE Transactions on Emerging Topics in Computational Intelligence, 2020.
  6. S. Wu, T. Luo, G. Jiang, M. Yu, H. Xu, Z. Zhu, and Y. Song, “A two-stage underwater enhancement network based on structure decomposition and characteristics of underwater imaging,” IEEE Journal of Oceanic Engineering, vol. 46, no. 4, pp. 1213–1227, 2021.
  7. S. Liu, H. Fan, S. Lin, Q. Wang, N. Ding, and Y. Tang, “Adaptive learning attention network for underwater image enhancement,” IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 5326–5333, 2022.
  8. C.-Y. Li, J.-C. Guo, R.-M. Cong, Y.-W. Pang, and B. Wang, “Underwater image enhancement by dehazing with minimum information loss and histogram distribution prior,” IEEE Transactions on Image Processing, vol. 25, no. 12, pp. 5664–5677, 2016.
  9. C. Li, C. Guo, W. Ren, R. Cong, J. Hou, S. Kwong, and D. Tao, “An underwater image enhancement benchmark dataset and beyond,” IEEE Transactions on Image Processing, vol. 29, pp. 4376–4389, 2019.
  10. M. Han, Z. Lyu, T. Qiu, and M. Xu, “A review on intelligence dehazing and color restoration for underwater images,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 50, no. 5, pp. 1820–1832, 2018.
  11. I. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio, “Generative adversarial networks,” Communications of the ACM, vol. 63, no. 11, pp. 139–144, 2020.
  12. C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, “Going deeper with convolutions,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–9, 2015.
  13. O. Ronneberger, P. Fischer, and T. Brox, “U-net: Convolutional networks for biomedical image segmentation,” in International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 234–241, Springer, 2015.
  14. C. Fabbri, M. J. Islam, and J. Sattar, “Enhancing underwater imagery using generative adversarial networks,” in 2018 IEEE International Conference on Robotics and Automation (ICRA), pp. 7159–7165, IEEE, 2018.
  15. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600–612, 2004.
  16. A. Ignatov, N. Kobyshev, R. Timofte, K. Vanhoey, and L. Van Gool, “DSLR-quality photos on mobile devices with deep convolutional networks,” in Proceedings of the IEEE International Conference on Computer Vision, pp. 3277–3285, 2017.
  17. Y.-S. Chen, Y.-C. Wang, M.-H. Kao, and Y.-Y. Chuang, “Deep photo enhancer: Unpaired learning for image enhancement from photographs with GANs,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 6306–6314, 2018.
