Reports of the Technical Conference of the Institute of Image Electronics Engineers of Japan
Online ISSN : 2758-9218
Print ISSN : 0285-3957
Reports of the 308th Technical Conference of the Institute of Image Electronics Engineers of Japan
Session ID : 23-04-077
Generating AI Textures: Using the ControlNet Model of Stable Diffusion to Create Stylized Materials in Blender
*Yudong ZHANG, Takanori NAGAE
Abstract
This work is a media art piece centered on stylized materials. With the advance of AI painting technology, texture creation has gained greatly in both expressive range and efficiency, and projecting UVs from the camera in 3D software opens up a variety of stylization methods. The process begins with modeling in Blender, where information such as depth maps and wireframes is extracted from the objects in the scene. The ControlNet model for AI painting is then used to generate the corresponding textures. Camera-projected UV mapping is carried out in Blender's Eevee renderer, followed by adjustments and hand-drawn details in Photoshop. The modified textures are connected as base colors to a cartoon shader, yielding a celluloid (cel) style material.
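As a concrete illustration of the texture-generation step, the sketch below conditions Stable Diffusion on a depth map rendered out of Blender, using the ControlNet support in the Hugging Face diffusers library. The model IDs, file names, and prompt are illustrative assumptions, not the authors' reported configuration.

```python
# Sketch: generate a texture from a Blender depth render with ControlNet.
# The depth map file name, model IDs, and prompt are placeholders.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

depth_map = load_image("scene_depth.png")  # depth pass exported from Blender
texture = pipe(
    prompt="cel-shaded anime background, flat colors, clean line art",
    image=depth_map,
    num_inference_steps=30,
).images[0]
texture.save("generated_texture.png")
```

On the Blender side, the camera-projected UV mapping and the cel-style material hookup described above can be scripted with bpy roughly as follows. This is a minimal sketch against the Blender 3.x Python API: the object name, image path, and node layout are hypothetical, and bpy.ops.uv.project_from_view() must run in a 3D Viewport context (for example from the UV menu, or via a context override).

```python
# Sketch: project UVs from the camera view, then wire the generated texture
# into a simple Eevee cel-shading node setup. "Building" and the image path
# are hypothetical placeholders.
import bpy

obj = bpy.data.objects["Building"]
bpy.context.view_layer.objects.active = obj
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
# Needs a 3D Viewport context aligned with the camera (or a context override).
bpy.ops.uv.project_from_view(camera_bounds=True, correct_aspect=True)
bpy.ops.object.mode_set(mode='OBJECT')

mat = bpy.data.materials.new("CelMaterial")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.remove(nodes["Principled BSDF"])       # replace the default shader

tex = nodes.new("ShaderNodeTexImage")        # generated texture as base color
tex.image = bpy.data.images.load("//generated_texture.png")
diffuse = nodes.new("ShaderNodeBsdfDiffuse")
to_rgb = nodes.new("ShaderNodeShaderToRGB")  # Eevee-only node
ramp = nodes.new("ShaderNodeValToRGB")       # hard steps give the cel bands
ramp.color_ramp.interpolation = 'CONSTANT'
mix = nodes.new("ShaderNodeMixRGB")          # texture color x toon lighting
mix.blend_type = 'MULTIPLY'
mix.inputs["Fac"].default_value = 1.0
emit = nodes.new("ShaderNodeEmission")
out = nodes["Material Output"]

links.new(diffuse.outputs["BSDF"], to_rgb.inputs["Shader"])
links.new(to_rgb.outputs["Color"], ramp.inputs["Fac"])
links.new(tex.outputs["Color"], mix.inputs["Color1"])
links.new(ramp.outputs["Color"], mix.inputs["Color2"])
links.new(mix.outputs["Color"], emit.inputs["Color"])
links.new(emit.outputs["Emission"], out.inputs["Surface"])
obj.data.materials.append(mat)
```

In practice the ColorRamp stops would be tuned per asset, and the Photoshop-adjusted texture would replace the raw ControlNet output before this hookup.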
© 2024 by The Institute of Image Electronics Engineers of Japan