GenAI watercolor
When I learned costume design rendering in the 1990s, my professors created designs predominantly in pencil and watercolor, so that is the method I was taught. I knew people at other schools worked in gouache or art markers or alcohol inks—it seems you learned whatever medium your professor preferred. We also did a project using collage to create a set of renderings from a pile of magazines, but the emphasis was on mastering watercolor as a means of conveying your creative ideas.
I admit, I never felt confident in that medium, although I did become passable at it. I spent much of my years of study struggling to master watercolor and figure drawing, with forays into construction techniques like pattern drafting, fabric dyeing, and millinery, and almost no experience conducting fittings with actors, tracking show budgets, generating documents like piece lists or costume plots, or communicating ideas to directors and other members of a creative team.
In the course of exploring possible applications for diffusion models and image generators, I find myself questioning the very nature of teaching costume design and why it's taught the way it is. Of course, what do I know? It may no longer be taught the way it was 30 years ago. And it seems a seismic shift is coming in how it will be taught going forward.
All that is a lot of nostalgic navel-gazing to preface why I decided to see how Copilot might generate watercolor sketches. For the exercise, I decided to go with something simple: a young Caucasian man in a red T-shirt and blue jeans. Here are the first couple of attempts:
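As an aside for anyone who would rather script this sort of experiment than type prompts into the Copilot chat box: below is a minimal sketch of the same exercise against the OpenAI Images API, assuming the openai Python package and an OPENAI_API_KEY set in the environment. The model name, image size, and prompt wording are my own stand-ins, not what Copilot actually uses under the hood.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical prompt approximating the exercise described above
prompt = (
    "Watercolor costume design rendering of a young Caucasian man "
    "in a red T-shirt and blue jeans, pencil underdrawing, loose "
    "washes, figure on plain white paper"
)

result = client.images.generate(
    model="dall-e-3",   # illustrative model choice, not Copilot's backend
    prompt=prompt,
    size="1024x1792",   # portrait orientation suits a figure sketch
    n=1,
)

print(result.data[0].url)  # URL of the generated image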