SketchUp & AI - StableDiffusion, Automatic1111, Controlnet and segmentation colors

A week ago I stumbled on a YouTube video about ControlNet - a technique to feed drawings/shapes/colors/masks as a sort of guide for the AI when generating images. I'm still totally new to this, but the results are quite interesting for designers.

In short: generate a rough 3D mass of your design in SketchUp, assign specific colors to the model that reference a type of object (wall, floor, tree, sky, etc.), export it as an image and let the AI do its thing. I'm testing it to generate design options for inspiration. Sometimes you get stuck in the same design loop, and this might help open up your mind by showing some new combinations/colors/materials for your design.
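For anyone who wants to try this: ControlNet's segmentation model expects the ADE20K color palette, so each SketchUp material should use the exact RGB value for its object class. A minimal sketch below - the RGB values are quoted from memory and should be double-checked against the official ADE20K color list before you rely on them:

```python
# A few ADE20K class colors as used by ControlNet's segmentation model.
# NOTE: values quoted from memory - verify against the official ADE20K palette.
SEG_COLORS = {
    "wall":  (120, 120, 120),
    "floor": (80, 50, 50),
    "tree":  (4, 200, 3),
    "sky":   (6, 230, 230),
    "grass": (4, 250, 7),
}

def to_hex(rgb):
    """Convert an (r, g, b) tuple to a hex string for SketchUp's material color picker."""
    return "#{:02X}{:02X}{:02X}".format(*rgb)

for name, rgb in SEG_COLORS.items():
    print(f"{name:>6}: {to_hex(rgb)}")
```

Paint each surface in SketchUp with the matching hex color, export the view as a PNG, and that image becomes the segmentation input.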

You can iterate really fast. Just add or change a few keywords in the prompt, switch to another model/checkpoint or change some of the specific segmentation colors on the 3D model, and the results can be totally different. Also, the AI takes just 10 to 30 seconds (depending on your graphics card) for a new 1024x512 image.
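If you want to iterate even faster, Automatic1111 can be started with the --api flag and scripted. A rough sketch of building such a request is below - the exact field names and the ControlNet model name are assumptions based on common versions of the web API, so check the /docs page on your own instance first:

```python
import base64

# Sketch of a txt2img payload for the Automatic1111 web API with the
# ControlNet extension. Field names and the model name are assumptions -
# verify against /docs on your local instance before relying on them.
def build_payload(prompt, seg_image_path, model="control_sd15_seg"):
    with open(seg_image_path, "rb") as f:
        seg_b64 = base64.b64encode(f.read()).decode("ascii")
    return {
        "prompt": prompt,
        "negative_prompt": "blurry, low quality",
        "width": 1024,
        "height": 512,
        "steps": 20,
        "alwayson_scripts": {
            "controlnet": {
                "args": [{
                    "input_image": seg_b64,
                    "model": model,    # assumed ControlNet segmentation model name
                    "module": "none",  # the image is already a segmentation map
                }]
            }
        },
    }

# You would POST this as JSON to http://127.0.0.1:7860/sdapi/v1/txt2img
# and loop over different prompts/checkpoints to batch out variations.
```

This makes it easy to sweep a list of prompts or styles over the same exported segmentation image.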

Interesting times!

You can also just trace over an image by hand, add some things and feed that ‘scribble’ to the AI.


Reflection and horizon height in the background look kind of strange, as if things don't match. Unless it's a hilly environment. (third image)

You are totally right - in the first three images the house was exported from SketchUp, and the horizon and trees were added by hand in Gimp as segmentation colors. The AI took it from there. I should have added a ground plane and sky in SketchUp.


Oh, this is awesome! And a bit scary.