Employing Neural Style Transfer for Generating Deep Dream Images
Abstract
In recent years, deep dream and neural style transfer have emerged as prominent topics in deep learning. Combining these two techniques supports digital art and enhances images that simulate the hallucinations experienced by psychiatric patients and drug users. In this study, our model combines deep dream and neural style transfer (NST) to produce a new image that blends the two techniques. The VGG-19 and Inception v3 pre-trained networks are used for NST and deep dream, respectively. The Gram matrix is a vital component of style transfer. The loss is minimized in style transfer and maximized in deep dream, using gradient descent in the first case and gradient ascent in the second. We found that different images produce different loss values depending on their degree of clarity: distorted images yield higher loss values in NST and lower loss values in deep dream, whereas the opposite holds for clear images that do not contain mixed lines, circles, colors, or other shapes.
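The two loss formulations described above can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the function names, the normalization of the Gram matrix, and the `dream_step` update rule are assumptions, and in practice the feature maps and gradients would come from the VGG-19 and Inception v3 networks via backpropagation.

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) feature maps from a
    # convolutional layer (e.g., of VGG-19, as used for NST).
    # Returns a (channels, channels) matrix of channel correlations,
    # normalized here by the number of spatial positions (an assumption).
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (h * w)

def style_loss(gen_feats, style_feats):
    # Style loss: mean squared difference between the Gram matrices of
    # the generated and style images; minimized by gradient descent.
    return np.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2)

def dream_step(image, grad, lr=0.01):
    # Deep dream instead *maximizes* an activation norm by gradient
    # ascent; `grad` stands in for the backpropagated gradient of the
    # chosen layer's activations, normalized for a stable step size.
    return image + lr * grad / (np.abs(grad).mean() + 1e-8)
```

Note the only structural difference between the two objectives: the style loss is driven toward zero (descent), while the deep-dream update adds the gradient to the image (ascent), which is why clear and distorted inputs rank oppositely under the two losses.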
Copyright (c) 2022 Lafta R. Al-Khazraji, Ayad R. Abbas, Abeer S. Jamil
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Authors who choose to publish their work with Aro agree to the following terms:
- Authors retain the copyright to their work and grant the journal the right of first publication. The work is simultaneously licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0). This license allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors have the freedom to enter into separate agreements for the non-exclusive distribution of the journal's published version of the work. This includes options such as posting it to an institutional repository or publishing it in a book, as long as proper acknowledgement is given to its initial publication in this journal.
- Authors are encouraged to share and post their work online, including in institutional repositories or on their personal websites, both prior to and during the submission process. This practice can lead to productive exchanges and increase the visibility and citation of the published work.
By agreeing to these terms, authors acknowledge the importance of open access and the benefits it brings to the scholarly community.