As an example: are the simple textures used in a model sent to the AI as definitive data about what should be grass and what should be asphalt? I can't use 98% of what is generated, under any combination of settings or styles, if I want to keep the materials that are inside my model. Even the simple distinction between a driveway and grass seems impossible to rely on, or to prompt for with any real success.
It's a cool tool in concept, but honestly, other generative AI tools are far more accurate and far better at staying contained to the idea… you'd think diffusion would be even MORE accurate given the raw dataset of geometry and materials it's handed, but hey, I'm just an AI genius… right?
CHEERS… More development, please… this is becoming lower value for me every day. Thank you!
In the images: this is my model, and these are my outcomes. I am very familiar with proper prompting to help constrain the AI's understanding. Clearly, this isn't useful.