I have recently switched from 3ds Max/V-Ray to SketchUp/Enscape because I can’t afford to work “blind” while waiting for renderings. But now I wait for SketchUp to do everything else, so I’ve seen no efficiency improvement at all.
I have followed many tips to make SketchUp “fast”, but compared to 3ds Max’s ability to handle lots of geometry it performs… well, just terribly. I spent approximately half of today looking at the spinning wait cursor and feel utterly unprofessional.
I only stick to SketchUp because I can use Enscape from it.
Have you tried using SketchUp and V-ray together? Or are you also specifically trying to get realtime rendering (you mention Enscape) instead of offline rendering?
Often the performance challenge in SketchUp comes from the way you are handling vegetation (trees, grass, shrubs); if you are trying to model in SketchUp with full geometric representations of trees (polygons for every leaf) and such, you will have performance problems no matter what. V-Ray in SketchUp offers a proxy system for vegetation that is likely similar to what you were using previously in 3ds Max.
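For anyone unfamiliar with how proxies keep a viewport responsive, here is a minimal sketch of the idea in Python. Every class and method name here is made up for illustration; this is not the actual V-Ray or Enscape API, just the concept of a lightweight placeholder standing in for heavy geometry until render time.

```python
# Toy illustration of the proxy idea: the working scene holds only a
# lightweight placeholder (bounding box + file path), and the full
# high-poly mesh is substituted only when a final render is requested.
# All names are hypothetical, not any real renderer's API.
from dataclasses import dataclass

@dataclass
class ProxyRef:
    source_path: str       # where the heavy mesh lives on disk
    bbox: tuple            # (w, d, h) stand-in box drawn in the viewport
    face_count_hint: int   # what the full asset would cost at render time

class Scene:
    def __init__(self):
        self.objects = []  # either ProxyRef or a plain face count (int)

    def viewport_faces(self):
        # While modeling, each proxy costs only its 12-triangle box.
        return sum(12 if isinstance(o, ProxyRef) else o for o in self.objects)

    def render_faces(self):
        # At render time the engine swaps in the full geometry.
        return sum(o.face_count_hint if isinstance(o, ProxyRef) else o
                   for o in self.objects)

scene = Scene()
scene.objects.append(ProxyRef("tree_oak.vrmesh", (3, 3, 8), 2_000_000))
scene.objects.append(5_000)    # an ordinary low-poly object
print(scene.viewport_faces())  # 5012
print(scene.render_faces())    # 2005000
```

The point of the sketch: while you model, a two-million-face tree costs the viewport almost nothing, which is why proxies matter so much for vegetation.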
Perhaps if you could share some examples comparing performance, others here could offer you other tips and tricks to manage the system effectively.
I use proxies for polygon-heavy objects and vegetation. Enscape has a good system for that, and I also use Skatter’s “render only” feature when a lot of things need to be distributed around. It is the “simple” things like importing, modelling and moving things around that add up to a lot of waiting time. I don’t have any specific examples, it’s just the daily grind. It’s almost as if I have to shut down the Outliner and disable all extensions just to import something without having to take a coffee break while I wait.
Hey guys. I copied the following message from the other forum (my English is not that good, sorry). I am in exactly this situation, so it would be nice if you could tell us whether we will see any significant change in SketchUp’s high-poly handling.
"Hi all, this seems like beating a dead horse, but this will be my first and last post on this topic, so bear with me.
SketchUp users who think SketchUp doesn’t need to handle heavy models for 3D rendering have their reasons. Their reasons are:
You can export SketchUp models into a third-party rendering program like Thea, which can handle very high poly counts. BUT for my interior rendering work (I’m no pro at all), my workflow is quite “messy”. I like to try different models for a particular scene; if I don’t like a plant or vase or book in that scene, I do a preview render and then change it. This rationale doesn’t work for me because my scene is never finished or finalized. If I keep importing and exporting to a third-party rendering studio, everything takes longer. If I could just STAY in SketchUp to do the test renders, the process would be faster.
You can import proxy versions of heavy models. Same thing, it won’t work for me. What if I want to make changes to those heavy models? Proxy models can’t be edited in SketchUp; I would have to reimport them back into my current SKP file. Proxy models can also be buggy. If SketchUp could handle heavy models, I could just stay in SketchUp. Happy!
Simply put: if SketchUp could handle high-poly models, I wouldn’t need so many workarounds, because workarounds waste time. That is my point here. If I can stay in SketchUp, I can save a lot of time, have more time for tweaking and creating, and go through many iterations or minor changes directly in SketchUp.
I think I will move away from SketchUp if I have to import heavy models for a scene. I don’t like workarounds.
I hope I will find a program that can handle heavy models and also do nice, fast rendering.
I DO like SketchUp for its measuring and snapping tools; they help me quickly create very accurate models (I build custom furniture models in SketchUp).
If my post offends you because you think it comes from my lack of expert knowledge of SketchUp, you are probably right. But a lot of people who want to do 3D interior rendering in SketchUp have the same problems.
Like others, I don’t think SketchUp will ever be able to handle high-poly models."
You can say whatever you like and espouse all kinds of clean modeling and efficiency techniques. But at the end of the day LayOut and SketchUp need major improvements in stability and speed, especially LayOut. That should be the single most urgent work for the software developers!
I’m finding that the size (MB) and complexity (edges, faces, groups, etc.) of my projects has grown very rapidly over the last few years.
Current projects are 1 GB+, and today’s challenge has been to arrange six OBJ files (a photogrammetry mesh) generated by Trimble RealWorks, which total 2 GB+. I don’t have any hope of being able to manipulate this data in a reasonable manner.
Professionals import/export various data sets (including contours, meshes, point clouds, BIM building components, etc) and these are all getting ‘heavier’ as technology improves and demand for more detail in projects rises across various industries.
SketchUp has gotten a bit speedier in recent releases, but we are seeing small improvements without fundamental changes to the WAY SketchUp handles large files, imported data, etc. I’m sure this is being worked on, but at present it does appear the BIM/rendering world is leaving SketchUp behind a little.
PC hardware improvements don’t seem to be helping - the fastest CPUs and GPUs don’t run SU or LO appreciably faster than mid-range or older models. We can’t do what renderers do and stack CPUs/GPUs or use the cloud to boost performance… that would be nice.
Perhaps some core SU functions could be improved (and there are probably 3rd-party options), e.g.:
a graphical LOD adjustment (similar to Revit). I think someone in a previous topic suggested hiding the model, or details (e.g. edges), on all but the active group (plus maybe 1-3 levels of nesting). I think this would be fantastic.
turning off snapping/inferencing on all but the active group (plus 1-3 levels of nesting)
allowing different styles to be assigned to a group/component or layer, e.g. some groups viewed as hidden line while others are fully textured with profiles/dashes turned on, or X-ray, etc.
data import/export management, e.g. batch processing of imported data, processing of meshes, contours, etc. before loading into SU models, and processing/removal of unwanted items from imported data, e.g. certain layers, image files, text/annotations, contours, etc.
swapping between low-res/low-poly and high-res/high-poly versions within SU, or perhaps using the high-res versions only for export and printing (similar to how LO displays things)
convenient directories to manage proxies and other high-detail versions of materials and objects, linked to the Component/Material browsers so things can be organised more easily, with duplicate removal and hierarchy views to help us manage the model more efficiently.
more tasks happening in the background or on other CPU cores, especially updating LO, saving, etc.
incremental saving (save only the recent changes, not the entire file)
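To illustrate the incremental-saving item in the list above, here is a toy Python sketch of the idea: hash the file in fixed-size chunks and rewrite only the chunks whose contents changed since the last save. The chunk size and all names are arbitrary, and this is not how the .skp format actually saves; it is just the concept.

```python
# Incremental-save concept: split the model file into fixed-size chunks,
# hash each one, and write only the chunks whose hashes changed.
import hashlib

CHUNK = 4096  # bytes per chunk (arbitrary for this sketch)

def chunk_hashes(data: bytes):
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def changed_chunks(old_manifest, new_data: bytes):
    """Return (indices of chunks that must be rewritten, new manifest)."""
    new_manifest = chunk_hashes(new_data)
    dirty = [i for i, h in enumerate(new_manifest)
             if i >= len(old_manifest) or old_manifest[i] != h]
    return dirty, new_manifest

# First save: everything is new, so all chunks are written.
model_v1 = bytes(20_000)                 # pretend 20 KB model file
dirty, manifest = changed_chunks([], model_v1)
print(len(dirty))                        # 5 (all chunks written)

# Second save after a small edit: only the touched chunk is rewritten.
model_v2 = bytearray(model_v1)
model_v2[100] = 0xFF                     # edit one byte inside chunk 0
dirty, manifest = changed_chunks(manifest, bytes(model_v2))
print(dirty)                             # [0]
```

A small edit would then cost one 4 KB write instead of rewriting the whole multi-hundred-MB file, which is the appeal of the idea.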
I’m just curious, how large are these projects that get over 1 GB? A typical 9,000 sf house for me, modeled to a high level of detail with all trim, appliances and site (usually 1-2 acres of mountain terrain), never hits 100 MB for reference.
I’m doing an interior project (basically 3 rooms) that is over 500 MB with no textures included. The models are high-poly models for good-quality rendering, and I don’t want to reduce their polygon counts, because then their segments would show in close-up shots. Every job has different needs; my exterior scenes are lower-poly than my interiors. I want to learn software like ZBrush or Marvelous Designer and bring it into my interior workflow, but I know SketchUp can’t handle those kinds of high-poly models.
Performance is a complex subject. I’m sure you’ve heard this from other sources, but I’ll include it here again for others to read in the future. There is always going to be an upper bound to the performance of your system, and you generally won’t feel that threshold as a limitation until you have reached it.
We improve SketchUp’s core rendering performance with every release. This enables users to throw higher polygon counts at it with every release, until the (new) threshold for complexity is reached and things slow down for them again. To me, a 1 GB SketchUp model is spectacularly large, though to you it may feel like the most natural thing in the world to make. Until it isn’t. We will continue to improve SketchUp’s rendering performance with every release.
With offline rendering (in, for example, V-Ray), computational performance is not determined primarily by the polygon count of the scene. With realtime graphics (like SketchUp or Enscape), on the other hand, you pay a computational tax for every polygon you add to the model. This is why the creative teams on big game development projects are so fastidious about their polygon budgets.
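The polygon-budget discipline mentioned above can be sketched very simply: each asset category gets an allocation, and the team keeps the total under what the realtime renderer can comfortably draw per frame. The numbers below are purely illustrative, not actual SketchUp or Enscape limits.

```python
# Hypothetical per-frame polygon budget, as a game art team might track it.
# All figures are illustrative assumptions, not real engine limits.
FRAME_BUDGET = 5_000_000  # triangles the target GPU handles comfortably

budget = {
    "architecture": 1_500_000,
    "furniture":    1_000_000,
    "vegetation":     800_000,  # proxies keep the viewport copy tiny
    "props":          700_000,
}

spent = sum(budget.values())
headroom = FRAME_BUDGET - spent
print(spent, headroom)  # 4000000 1000000

# An offline raytracer scales differently: doubling the polygon count does
# not double render time, which is why it tolerates much heavier scenes.
```

When one category blows its allocation (say, fully modeled trees), the whole frame slows down, which is exactly the viewport lag being described in this thread.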
In general, you don’t want to carry a bunch of polygons around in the model all the time just to get a localized final rendering effect. Fabric objects (upholstery, curtains, bed linens, etc.) and plants need to be carefully optimized if you plan on rendering them up close in the scene. This is going to be true in every modeling environment you use, not just in SketchUp.
This is certainly going to be a problem for you, and I don’t recommend it as a strategy. I would recommend decimating the Realworks mesh before trying to load it into SketchUp. Without more information, I can’t offer a solution for the OBJ files, but I would be suspicious that they may have been generated with too much resolution as well. OBJ is usually used as an interchange format; where was the original geometry in those files generated?
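As a rough illustration of what decimation does before import, here is a toy vertex-clustering pass over a minimal OBJ in Python: snap vertices to a grid, merge the ones that land in the same cell, and drop faces that collapse. A real tool (e.g. quadric edge collapse in MeshLab, or RealWorks’ own export settings) does this far better; this only shows the idea.

```python
# Toy vertex-clustering decimator over minimal OBJ text (v/f records only).
# Illustrative only; use a real mesh-processing tool for production data.
def decimate_obj(obj_text: str, cell: float):
    verts, faces = [], []
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            verts.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            faces.append([int(i) - 1 for i in parts[1:4]])
    # Snap every vertex to a grid cell; vertices in the same cell merge.
    cell_of = [tuple(round(c / cell) for c in v) for v in verts]
    remap, new_verts = {}, []
    for i, key in enumerate(cell_of):
        if key not in remap:
            remap[key] = len(new_verts)
            new_verts.append(verts[i])
    new_index = [remap[k] for k in cell_of]
    # Drop faces that collapsed (two or more corners merged together).
    new_faces = []
    for f in faces:
        g = [new_index[i] for i in f]
        if len(set(g)) == 3:
            new_faces.append(g)
    return new_verts, new_faces

OBJ = """\
v 0 0 0
v 1 0 0
v 0 1 0
v 0.01 0 0
f 1 2 3
f 1 4 2
"""
nv, nf = decimate_obj(OBJ, cell=0.1)
print(len(nv), len(nf))  # prints "3 1": one vertex merged, one face dropped
```

Photogrammetry meshes are usually wildly over-sampled for flat areas, so a pass like this (done properly) can cut the file size by an order of magnitude before SketchUp ever sees it.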
Another thing to keep in mind is what you need the model for.
In the end, in the output, all polygons wind up as pixels (paper DPI, monitor pixels per inch).
A watch in a magazine advertisement needs far more polygons than a watch on a figure standing beside a building.
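That watch example can be made concrete with a little arithmetic: a circle looks smooth once its chord error (the gap between the true arc and the straight segment) drops below about one pixel at the output resolution. Here is a hypothetical Python helper with purely illustrative numbers:

```python
# How many segments does a circle need before the facets disappear?
# Enough that the sagitta (chord deviation) stays under ~1 output pixel.
import math

def segments_needed(radius_px: float, tol_px: float = 1.0) -> int:
    """Smallest segment count whose chord error is <= tol_px."""
    if tol_px >= radius_px:
        return 3  # the circle is tiny on screen; a triangle will do
    # sagitta = r * (1 - cos(pi/n))  =>  n >= pi / acos(1 - tol/r)
    return math.ceil(math.pi / math.acos(1.0 - tol_px / radius_px))

# Full-page magazine ad: watch face ~4 in across at 300 dpi -> r = 600 px.
print(segments_needed(600))  # 55
# Same watch on a wrist in a wide shot: ~10 px across -> r = 5 px.
print(segments_needed(5))    # 5
```

So the same object legitimately needs an order of magnitude more segments in one shot than in the other, which is the whole argument for matching detail to output.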
Is it accurate to compare SketchUp with Enscape? I mean, SketchUp is a modeling tool, and it should be compared with other 3D modeling software like 3ds Max, Blender, Cinema 4D, etc. But SketchUp’s high-poly handling is nowhere near industry standards.
I’m pretty sure that 3ds Max’s viewport renders in realtime too. There is light, shadow, texture, some sort of reflection, transparency, lots of edges and faces, and many other things going on in 3ds Max’s viewport. Of course, there are limitations in Max’s viewport when handling high-poly models too, but we can see it handles all those things far better than SketchUp. (Max is an example; the same goes for Blender and Cinema 4D.)
Actually, I do optimize my scenes. I use V-Ray Proxy for plants and vegetation, as everyone in the visualization industry does right now. But I have never seen anyone in other software turn a bed or a chair or a curtain into a proxy for a simple scene with 3 rooms, which is what I have to do in SketchUp.
You suggest polygon decimation, but that is a workaround, not a proper solution to the actual problem. Most of the time, loading a high-poly model into SketchUp is not the problem; the bigger problem is when you want to edit that model! The model becomes literally uneditable because of the slowness, and eventually a Bug Splat appears. In those moments I want to tear my hair out.
Well, yeah, after all, the final result is what matters. The watch example is a good one: I can’t decimate this high-poly model and expect to get the same result in such a close-up shot. You may say, “Then change your 3D modeling software! Why are you still whining about it?” Well, the reason is that I like SketchUp and I have learned my modeling methods in SketchUp over the years. These methods, of course, are different from 3ds Max’s and may be inefficient in that software. So…
You know, I’m not fluent in English and I had to read that article several times to understand it. Blinn’s render law is true.
Blinn’s Law asserts that rendering time tends to remain constant even as computers get faster: animators prefer to improve quality, rendering more complex scenes with more sophisticated algorithms, rather than using less time to do the same work as before.
I don’t know anything about coding and writing programs, but I hope someday SketchUp can calculate and process as fast as the other modeling packages.
I would guess that we users will never think that we have got enough performance. But, over the years, the improvement in SketchUp’s high polygon performance has been dramatic. I tend to think that a model with 100 000 faces is huge, but in practice that used to cause problems with SketchUp version 8. Nowadays a ten times larger model is still quite navigable. Of course, we have progress with hardware too.
Sometimes “quantum leaps” do happen. I am old enough to remember AutoCAD version 2.5. At that time the application would slow to a crawl if you turned running object snaps on. Then Autodesk rewrote the osnap system for version 10 (if I remember right), which led to the biggest performance increase in the history of that application.
It is in the sense that both are doing realtime rendering (like a game engine), not offline rendering (like raytracing). Of course, SketchUp’s rendering is optimized for 3D modeling tasks, whereas Enscape’s is optimized for photorealistic presentation. 3ds Max, Maya, Blender and Cinema4D include both realtime rendering (in their modeling viewports) and offline rendering (through their embedded rendering engines).
@MikeWayzovski’s invocation of Blinn’s Law is a good one. This question of performance in modeling and rendering is an old one, likely without a fully satisfying resolution in any known system.
Comparisons between the packages you mention are tough to make. How sure are you that you are actually comparing the same things? Model translation, for example, has a nasty habit of doubling or even quadrupling polygon count. In addition, unless you are carefully managing things like display style, you can end up with radical differences in raw performance. I’m not trying to dodge your claim (to be sure, people who come to SketchUp from one of those big DCC applications often complain of similar things) but I think the question is likely more complex than it may seem at first glance.
I’ve modeled every stud, door, anchor bolt etc… and even at that it is only coming in at about 5 Mb. The largest architectural model I’ve ever created is usually no larger than 10 Mb, and that includes surrounding terrain, trees and landscaping.
I guess if I was to try and model every shingle and siding board separately I might have a much different model on my hands.
How are we getting models that are in excess of 100 Mb?
I guess I could see that with a model containing a lot of curved geometry, the polygon count climbs dramatically.
I have imported vehicles and other 3rd party models into my models from the warehouse and there have been times where those models have been extremely large.
My philosophy is to keep your model as lean and mean as possible, even if you do have the horse power to handle millions of polygons. Why try to represent more information than you need?
Call it what you want: dead weight, bloat, overhead, etc. The point is that we have become lazy in our coding and modeling habits because of the “unlimited” computing power we have been given over the last 30 or so years.
Just because we have 16 GB of RAM installed in our desktop PC or Mac doesn’t mean we should try to max it out. In my mind there is nothing more beautiful than an efficient block of code or a detailed but streamlined model.