SketchUp Diffusion: Share your Feedback

You’re asking for something that is beyond the current state of SketchUp Diffusion.

Interiors can be a challenge for any SD program. In my experience, the AI works best if the objects that it has to ‘calculate’ are 100% in view. If you just have a fragment of a kitchen or a swimming pool, the AI most often has trouble ‘recognizing’ the objects and renders all kinds of ‘creative’ solutions.

You could have a look at other Stable Diffusion solutions like ComfyUI or Automatic1111 and use inpainting to have more control over what you’d like the AI to change/render.
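The core of inpainting is easy to picture even without a Stable Diffusion install. Here is a minimal sketch in plain Python (the function name and the tiny 2×2 “images” are made up for illustration): a binary mask marks which pixels the model may repaint, and everything outside the mask is copied unchanged from your original image — that copy step is what gives you the control described above.

```python
def inpaint_composite(original, generated, mask):
    """Final compositing step of an inpainting pipeline:
    keep the original pixel where mask is 0, and take the
    AI-generated pixel where mask is 1."""
    return [
        [g if m else o for o, g, m in zip(orow, grow, mrow)]
        for orow, grow, mrow in zip(original, generated, mask)
    ]

# Toy 2x2 grayscale images: only the masked top-left pixel may change.
original  = [[10, 20], [30, 40]]
generated = [[99, 99], [99, 99]]
mask      = [[1, 0], [0, 0]]
print(inpaint_composite(original, generated, mask))  # [[99, 20], [30, 40]]
```

Real pipelines generate the masked region with a diffusion model rather than a fixed image, but the unmasked pixels are preserved exactly like this.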

There should be a default setting that renders the scene with the materials, textures, and shapes we already made. A large percentage of SketchUp users work in architecture and interior design, and we already know what materials we need to use for our projects. We just want to use AI to make them look more realistic, since SketchUp itself doesn’t include a rendering program.

I also can’t use Diffusion, since it turns my cabinets into windows or skews something I modeled to look a specific way. My renderings need to match my construction documents. I literally bring my SketchUp models into Chief Architect for my construction plans and elevations, and the AI renderings don’t match the drawings when spikes are added and cabinet faces and textures change.

When the user already knows how to model and apply textures (that being the purpose of SketchUp), having all these grand AI alternate versions deviate from what the artist already created is counterproductive. AI can be useful outside of SketchUp for those who don’t know how to model and apply textures themselves, but that isn’t the case for most SketchUp users. We just want to render the scene we created with AI, with other creative options as alternatives. The default option should be our own creation, though.

Like all AI I have tried so far, it’s a bit like asking a small child genius to create something for you: done with good, sometimes even breathtaking, technique, but with the understanding, experience, and maturity of an infant. I can’t see a serious use for this until it is much more mature.

What you are talking about would be a rendering engine, and there are many to choose from on the market already. Diffusion is decidedly not a rendering engine: it’s not calculating light, surfaces, materials, UVs, or bump maps; it’s not even really looking at the 3D data. It’s just taking 2D shapes from your screen and substituting its best guess based on other images it has seen. It’s really an early-stage brainstorming or iteration tool, not a render engine. It’s actually so good at its guessing/substitution job that in some cases it gets pretty close to what looks like a rendered image, leading many to get excited at the idea of an “automatic” render engine. I get that, but it’s a far stretch really, as the technology is very different.
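The “substituting its best guess from your screen pixels” can be made concrete. In img2img-style diffusion, a strength setting controls how much noise is mixed into your screenshot before the model starts denoising. The toy sketch below (plain Python, purely illustrative — not the actual SketchUp Diffusion code, whose internals aren’t public) shows why low strength stays close to your model while high strength frees the AI to invent:

```python
import random

def img2img_start(pixels, strength, seed=0):
    """Blend an input image with random noise; a diffusion
    sampler would denoise from this starting point.
    strength=0.0 returns the input untouched; strength=1.0
    is pure noise, so the model's guess takes over completely."""
    rng = random.Random(seed)
    return [(1 - strength) * p + strength * rng.random() for p in pixels]

pixels = [0.2, 0.5, 0.8]  # a tiny 1-D "image"
print(img2img_start(pixels, 0.0))  # [0.2, 0.5, 0.8] -- fully faithful
```

Everything the model then reconstructs comes from patterns in its training images, not from your 3D geometry — which is why it can hallucinate spikes or turn cabinets into windows.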


That’s because its training data was rendered images, and it learned how to imitate and paste together the kinds of things it saw there. It really has no idea what it is looking at or why it looks that way beyond the words and phrases in the descriptions that went with the training images.


Perhaps the team’s description is misleading? viz:


Every time I render something, it shows the lines of the model. I can’t find a setting to turn this off, although I have found that I can Hide the model, then turn the visibility of the axes off. Is there a way for it to automatically hide the model?

Also add my vote for having it respect the materials, or at least the colors, of the model.

I would also like to know what was used to train the AI. I put something out on Facebook, and an artist friend who is concerned about how AI is causing problems for artists because it was “trained” on their work suggested I ask. It makes me super concerned about infringing on someone else’s designs…

If you did a simple search on the forum for “diffusion learning”, you would find this:
https://help.sketchup.com/en/under-hood

This link was even placed earlier in this thread…

I’ve posted a few times. I have all the same thoughts about materials as others. There is something more mild that I think could be addressed.

When I add a scene to my model from my Diffusion results (iPad), it is labeled “Render”, which it is not. Fixing this might alleviate a modicum of confusion.

And then some images for fun 🙂



We did a little piece about the current state of AI versus rendering.
It’s in Dutch, but y’all know how to use Google Translate 😉

Funny how Diffusion correctly interprets your green scribbles to be plants, but the shadows of the trees are so weird 😂


Nah. Something’s wrong with your cows, haha.

Maybe it is not AI, but there are home design programs that give perfectly OK lighting and shading to a scene, just based on creating a mood. I’d like to see that built into SketchUp for quick review images.

Sorry, I’ve been dealing with deaths in my immediate family (mother, father, brother) for the past two years, and I have not had the time to keep up with the latest…I’m just now getting back into SketchUp as I design a new workshop. I’m having to play catch up, and I stumbled across Diffusion a couple days ago.

That said, I usually search first, and I skimmed through the posts in this thread and didn’t see anything about it. I would not have known to search “Diffusion Learning.”

Replies like this are why I’ve dropped out of some of the Tesla owners’ groups on Facebook, where every other post is answered with “didn’t you read the manual?” Responses to technical questions in a group setting could be a little friendlier…like “Hey, just letting you know I saw this covered in another thread…here’s a link.” Instead, I feel scolded.

Been using SketchUp since 2007…but this is something completely different.

Just read the information at that link, and it references things I have no experience with. As an artist, I’m just watching friends who work for movie studios struggle with AI taking over their industry, and I’m trying to be mindful of intellectual property. I just wanted to know what artwork or sources are used to train the AI. This tells me nothing: “The underlying dataset for Stable Diffusion was the 2b English language label subset of LAION-5B, a general crawl of the internet created by the German charity LAION.”

Thanks for the info.

I might also add that my husband is an author, and when he ran one of his books through an AI-detection tool used by university professors to determine whether students plagiarized, the tool said that 24% of a book he wrote in 2014 (when ChatGPT didn’t exist) was written by AI. We’re in uncharted territory with AI, and I’m trying to respect the work of others.


I second this @Aristodimos. Full HD at a minimum, please. 2K is ideal (4K might honestly be too much for your servers, considering they’re probably the ones doing the rendering, but maybe one day…). I suspect customers would start to complain about response times if the resolution were too high.

I understand that Diffusion is not a rendering engine. My problem is that I have a client who wants the visualizations for the project I’m working on to be less specific and detailed than the model I have made, and a V-Ray render would make that even worse. It would have been wonderful if I could have used the Diffusion “watercolor” style to make visuals that look very much like the model, with all the same colors I put in to begin with, distorting the line work a little but not much, just looking like the scene was done in watercolors. This works moderately well with the pencil sketch style, but I can’t get good color visualizations. I think the ability to do this would be very useful for a lot of SketchUp users.

That’s not especially difficult to do without using Diffusion. Here’s a quick example.

What version of SketchUp are you using? Please update AND complete your forum profile.

You could just use a sketchy style from classic SketchUp, overlaid with the colors/materials you want.