SketchUp Diffusion: Share your Feedback

Tell us what you think of SketchUp Diffusion!
This survey will take approximately 5 minutes of your time and will help ensure that our team remains focused on the things that matter to you.


Diffusion has the ability to catapult SketchUp to greater heights, but for that to happen it needs to stay true to the materials that have already been applied in the model. It veers a little too far and seems to have a mind of its own (probably an AI thing!). It just needs a nudge in the right direction. I am sure you guys can do that, because I feel Diffusion can be of great help, especially when running on tight schedules. Thanks for gifting us this magical tool, nonetheless.


Hey Devenshah,

Do you mean it should keep the exact same material, or keep the material type but allow a change of style, or keep the material type + colors…?



Like the slider there is for geometry, there could be a slider for that, going from a “do what you want” level to a “please respect the material I’ve set” level.

I agree that for the moment it’s interesting, but beyond a quick sketch I find it hard to see an actual commercial use for it: if I’m being paid to design a table made of oak and black steel, and I apply and adjust the proper materials, I don’t want Diffusion to show me what it would look like in pine and aluminium.

Having a slider like the geometry one would give similar control over the result, whether it’s a simple idea sketch or a detailed 3D model.
No doubt that with time Diffusion will get better and better at understanding and respecting geometry, and that could actually make it a viable tool for me.


Last week I tried to get the Image Creator AI to generate a picture of a Pernese dragon based on the cover artwork of Michael Whelan. (FYI, the dragons of the fictional planet Pern are not reptiles, do not have scales, and do not have spiky protrusions all over themselves. They have hide that feels like thick velvet.)

I specifically told it no scales and no spiky protrusions, but in every single image it created, the dragons had scales and spiky things all over them.

I gave up. It’s not A.I., … it’s A.S. (Artificial Stupidity)

REF: Bing Images - art michael whelan dragon pern


In my limited experimentation with Diffusion I too was distressed by what it did with materials. I told it the model was a kitchen with cherry wood cabinets and black soapstone countertops. The countertops were rendered white regardless of what I tried, and the cabinets got a bizarre black and grey patchwork. Not impressed!


Yes, it should keep the same materials, and maybe offer alternatives as an option.


I agree with you wholeheartedly. I’ve spent several hours so far with different prompts (even had ChatGPT give me better prompts) and AI still doesn’t come close to matching my materials. It’s frustrating, but, hopefully, the team can get this figured out so our renders can be true to our models.


A bulk image exporter would be very useful… and an option to export images to a chosen folder…

It’s a wonderful play night for me.



Been digging deep to see what ChatGPT came back with when asked for a recommendation on matching Diffusion colours to the model colours…

As an AI language model, I don’t have the capability to directly interact with or visualize models from SketchUp or any other 3D modeling software. However, I can provide guidance on how you might use a stable diffusion generative AI, like a plugin or extension, to recognize and render specific colors in a SketchUp model.

  1. Identify the Colors in SketchUp: First, ensure that the colors you mentioned (Color A01, Color C06, Color E04, Color H02, and Color I02) are defined in your SketchUp model. These colors need to be accurately applied to the respective surfaces or materials in your model.

  2. Stable Diffusion Generative AI Integration: If you’re using a generative AI plugin or extension that integrates with SketchUp, it should be capable of analyzing the model and identifying the colors applied. Make sure the AI tool you’re using is compatible with SketchUp and can process the model data effectively.

  3. Color Recognition Settings: In the settings of your AI tool, there might be options to specify how it recognizes and processes colors. Ensure that the tool is configured to accurately identify and maintain the colors you’ve applied in SketchUp.

  4. Rendering with AI: When you’re ready to render, the AI tool should process the model and generate images. It’s important that the tool is set to retain the original colors as specified in your model. Check if there are any specific settings or parameters in the AI tool that control color fidelity during rendering.

  5. Quality Checks: After rendering, compare the output with your original SketchUp model to ensure color accuracy. If there are discrepancies, you may need to adjust the settings in your AI tool or revisit how colors are applied in your SketchUp model.

  6. Consult Documentation or Support: If you’re using a specific stable diffusion generative AI tool, consult its documentation for detailed instructions on color processing. If issues persist, reaching out to the support team of the tool can provide more tailored assistance.
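The quality check in step 5 can be sketched in plain Python. The palette names below echo the colours listed in step 1, but the RGB values and the sampled pixel are invented for illustration; this is only a rough nearest-colour check, not anything SketchUp Diffusion actually exposes:

```python
import math

# Hypothetical model palette: the names mirror step 1 above, but the
# RGB values are made up for this example.
MODEL_COLORS = {
    "Color A01": (200, 30, 40),
    "Color C06": (30, 120, 200),
    "Color E04": (240, 220, 90),
}

def color_distance(a, b):
    """Euclidean distance in RGB space; a rough fidelity metric."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_model_color(sampled_rgb, palette=MODEL_COLORS):
    """Return the palette entry nearest to a sampled pixel, plus the distance."""
    name, ref = min(palette.items(),
                    key=lambda kv: color_distance(sampled_rgb, kv[1]))
    return name, color_distance(sampled_rgb, ref)

# A pixel sampled from a Diffusion render (values are also made up):
name, dist = closest_model_color((210, 45, 50))
print(name, round(dist, 1))  # → Color A01 20.6
```

A large distance for a surface that should match a known material is a quick, objective way to flag the colour drift people are describing in this thread, instead of eyeballing the render.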

I’ve only been playing with SketchUp Diffusion for a day so far, and it’s a little bit mind-boggling, like all of the other AI image generation tools. It’s astonishing, actually. And, like all of the other AI tools, it’s only the VERY beginning. We’re only a year (maybe two) into the widespread application of these tools, and growth from here will be exponential.

That said, we have a lot of questions at our office about if, when, and how we can use images generated from SU Diffusion (or other AI image generation tools, for that matter) in our paid work for clients:

  • Where are the sources for the image generations coming from?

  • Are the artists who created the sources for the image generations granting rights to SketchUp or Trimble for their work to be used by others?

  • Are those artists being compensated for any of their work that is used as a source for the image generations?

  • Does SketchUp or Trimble have an Image Rights Policy that identifies the fair use of these images?

  • Can we legally use these images for anything that we want, including work that we are being paid to do by a client? And, if so, who then owns the rights to the images? Our organization as the designers generating these images, or the client paying for them?

There are so many questions that very few people have figured out yet, but they are really important to answer.


Some answers are already online:

The underlying dataset for Stable Diffusion was the 2B English-language-label subset of LAION-5B (see “LAION-5B: A NEW ERA OF OPEN LARGE-SCALE MULTI-MODAL DATASETS | LAION”), a general crawl of the internet created by the German charity LAION.

Images created through Stable Diffusion Online are fully open source, explicitly falling under the CC0 1.0 Universal Public Domain Dedication.

edit: but yeah, valid points. Considering AIs have been trained on globally gathered / stolen IP, the concern stands.


It appears to only work in perspective mode. If a scene is set to parallel projection, it is automatically changed when Diffusion is launched.
It would potentially be a helpful tool for generating underlays for plans/elevations, but it currently won’t work for that; maybe this will be added in future releases?


It would be great if it had a prompt history, and if you could adjust and save the view in the Diffusion window.


Agree that Diffusion needs to better take cues from the surface textures already applied.

If I turn the respect-model-geometry setting all the way up and use no prompt, it still seems unable to understand that one surface is green, another is a wood texture, etc.

It would also be good if, like some AIs, it could build on one iteration instead of each render being a new attempt: for example, being able to enter prompts like ‘good, but a little more overhead lighting’ or ‘add one more window to the background’, and so on, iterating to get it right.


It would be great, and would replace my use of Enscape, if the materials were truer to what I had selected.

See workaround here:


Diffusion gets easily confused. A timber frame trestle pony (a heavy-duty sawhorse) was turned into a weird cabinet/desk thingy, and Diffusion decided to materialize some walls and flooring that were not part of the input view. I think this was with the photo-realistic interior setting.

I have tried running the settings sliders all the way to the right, but I get lots of bizarre output with complex timber frame models.


Perhaps there is some natural confusion about what Diffusion really is. It is not a rendering engine: it does not calculate physical light bouncing off objects in 3D space like a rendering engine does, nor does it create specific materials for any given surface.

This is more like a game you play with an alien artist who lives on the moon and watches the earth through a telescope. The AI has some vague ideas about what this planet is and how things work around here, but it doesn’t really know the “why” of what it has observed. You send off a letter (your prompt and input geometry) to the moon and describe a picture and then the artist paints it according to their own ideas and sends it back. The better you describe the painting you want the closer it can get but being an alien and not living in this world it’s also bound to get some fundamental things wrong from time to time. With these expectations, and properly used, diffusion is capable of some extraordinary things.


It’s possible.
