International Journal of Art Innovation and Development, 2025, 6(1); doi: 10.38007/IJAID.2025.060102.
Linlin Li
Nanjing Meishi Space Culture Communication Co., Ltd., Nanjing 210000, Jiangsu, China
To address the challenges of unstable generated image quality and insufficient detail reproduction in oil painting style transfer methods, this paper introduces an algorithm that combines a generative adversarial network (GAN) with a convolutional neural network (CNN). The algorithm optimizes the extraction of oil painting style features and the control of image detail to realize high-quality oil painting style transfer. Image content features are extracted with a CNN, image style features are captured with the Gram matrix, and the loss function of the generated image is optimized to incorporate deep texture-level features, enabling style transfer. The experiments use Python 3.7+ and the PyTorch and TensorFlow deep learning frameworks for image processing. The results show that the texture detail score for both Impressionist-style and landscape-style images is 4.6, with minimal differences between the generated and target-style images, demonstrating excellent style transfer performance. Detail control metrics such as texture reconstruction quality, color saturation, and gradient smoothness are significantly improved during transfer of the target oil painting style.
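The style representation described above is the Gram matrix of CNN feature maps, combined with a content term into a single transfer loss. Below is a minimal PyTorch sketch of such a loss, not the paper's implementation: it assumes a pretrained VGG-19 backbone and illustrative layer choices (conv4_2 for content, conv1_1 through conv5_1 for style), since the abstract does not specify the network or layers, and it omits the GAN component.

# Minimal sketch of a Gram-matrix-based style transfer loss.
# Assumptions (not stated in the abstract): VGG-19 feature extractor,
# ImageNet-normalized 4-D input tensors, illustrative layer indices.
import torch
import torch.nn.functional as F
from torchvision import models

vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)
for layer in vgg:
    if isinstance(layer, torch.nn.ReLU):
        layer.inplace = False  # keep stored feature maps from being overwritten

CONTENT_LAYERS = {21}               # conv4_2 (assumed choice)
STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv1_1 ... conv5_1 (assumed choice)

def extract_features(x):
    """Collect feature maps at the chosen VGG layers."""
    content, style = {}, {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in CONTENT_LAYERS:
            content[i] = x
        if i in STYLE_LAYERS:
            style[i] = x
    return content, style

def gram_matrix(feat):
    """Channel-by-channel inner products of a feature map, normalized by size."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def transfer_loss(generated, content_img, style_img,
                  content_weight=1.0, style_weight=1e4):
    """Weighted sum of content loss (feature MSE) and style loss (Gram MSE)."""
    gen_c, gen_s = extract_features(generated)
    ref_c, _ = extract_features(content_img)
    _, ref_s = extract_features(style_img)
    c_loss = sum(F.mse_loss(gen_c[i], ref_c[i]) for i in CONTENT_LAYERS)
    s_loss = sum(F.mse_loss(gram_matrix(gen_s[i]), gram_matrix(ref_s[i]))
                 for i in STYLE_LAYERS)
    return content_weight * c_loss + style_weight * s_loss

In practice the generated image would typically be initialized from the content image and optimized against this loss with Adam or L-BFGS; the GAN described in the paper would contribute an additional adversarial term not shown here.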
Generative Artificial Intelligence; Oil Painting Style Transfer; Detail Control; GAN; CNN
Linlin Li. Oil Painting Style Transfer Algorithm and Practice Based on Generative Artificial Intelligence. International Journal of Art Innovation and Development (2025), Vol. 6, Issue 1: 19-29. https://doi.org/10.38007/IJAID.2025.060102.