Not too long after, I discovered a site called Artbreeder, which gave access to a set of pre-trained models. Mostly, you could make your photos look like a Van Gogh, or like the work of any other artist the A.I. had been trained on. But I wanted to train my own model. I wanted the computer to make new work based on my entire body of digital paintings.
That’s when I found RunwayML.
RunwayML’s aim was to make training new A.I. models accessible to artists, which is exactly what I had been wishing for. And their site made the whole process super painless to jump into. Soon I was training models on my amoeba characters, some digital faces, and now, zombies.
My first experiment in the land of the A.I. undead was to grab all the photos of zombies I could find from The Walking Dead. I trained a model on them, and it started giving me some pretty creepy but awesome output. Soon I realized that I would have to fix some of its attempts by hand, then re-train the model to include the newly altered imagery.
This idea turned out to work extremely well.
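If you're curious what that feedback loop looks like in practice, here's a minimal sketch in Python. The folder names and helper function are hypothetical stand-ins of my own, not anything from RunwayML itself; the actual training ran through RunwayML's web interface, and this only captures the step of folding my hand-edited images back into the training set before the next round.

```python
import shutil
from pathlib import Path

# Hypothetical folder layout -- these names are assumptions for illustration.
DATASET_DIR = Path("zombie_dataset")   # images uploaded to RunwayML for training
EDITED_DIR = Path("edited_outputs")    # model outputs I touched up by hand

def fold_edits_into_dataset(round_number: int) -> int:
    """Copy hand-corrected outputs back into the training set so the
    next training round learns from them too. Returns the count added."""
    DATASET_DIR.mkdir(exist_ok=True)
    added = 0
    for image_path in sorted(EDITED_DIR.glob("*.png")):
        target = DATASET_DIR / f"round{round_number}_{image_path.name}"
        if not target.exists():
            shutil.copy2(image_path, target)
            added += 1
    return added

if __name__ == "__main__":
    count = fold_edits_into_dataset(round_number=2)
    print(f"Added {count} edited images; re-upload zombie_dataset to retrain.")
```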