2024 Volume 12 Issue 1 Pages 153-166
In photo-based line drawing rendering, existing algorithms often rely on edge information, so the rendered results neglect the line density found in real line drawings. This study proposes a Line Generative Adversarial Network (LineGAN) model that transforms photos into manga-style line drawings by emphasizing line density while minimizing the influence of edge information, yielding more realistic line drawings. LineGAN coordinates global structure and local detail features, accounting for both spatial positions and inter-channel contributions. This enables fine control over line generation, producing more artistic and expressive manga line effects. In addition, we collected an aligned dataset and developed two data-driven tools for interactive line drawing creation. Experiments demonstrate that LineGAN produces manga line drawings superior to those of existing methods. Our model thus offers artists and designers an effective way to craft high-quality manga-style line drawings.