20 Reasons to consider learning Blender

Here’s a tutorial I did that may be of some help with dimensioning.

Export an OBJ from SketchUp to get your model into Blender.
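If you'd rather script that step, here is a minimal sketch using Blender 2.8's Python API. The file path is a placeholder, and it assumes the bundled OBJ importer add-on (enabled by default in 2.8x):

```python
# Run inside Blender 2.8x (Scripting workspace), or headless:
#   blender --background --python import_su_obj.py
import bpy

# Hypothetical path -- point this at the OBJ you exported from SketchUp
obj_path = "/path/to/model.obj"

# Import with the bundled OBJ importer; materials and UVs come along
# via the .mtl file SketchUp writes next to the .obj
bpy.ops.import_scene.obj(filepath=obj_path)

# The imported objects are selected after the operator runs
print([o.name for o in bpy.context.selected_objects])
```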

@chippwalters, do you have any advice on getting hard shadows like in SketchUp? At work I have great success with still images in SketchUp, but I also need to animate in the same style. I know it can be done in Blender, but I have not yet found a good way.

Blender has a line-rendering mode called Freestyle, available with Cycles, which can do what you suggest. I think it works in EEVEE too.
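As a rough sketch of switching Freestyle on from Python (property names per the 2.8x API; the thickness and crease values are assumptions, pick them to taste):

```python
import bpy  # run inside Blender 2.8x

scene = bpy.context.scene

# Freestyle draws ink-style outlines on top of the render
scene.render.use_freestyle = True
scene.render.line_thickness = 1.5  # line width in pixels

# Freestyle settings live per view layer
fs = bpy.context.view_layer.freestyle_settings
fs.crease_angle = 2.617994  # ~150 degrees, in radians
```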

You can create such effects via material/shaders with Eevee. Check this video:

Thanks @chippwalters @filibis, I will check both of those out at work today.

Just a heads up for anyone wanting to continue using SketchUp as their primary modeler: the new free Blender 2.8 ships with EEVEE, a real-time photoreal renderer. You can import your SU models (textures intact), add or update materials and lights, and create renders and animations. On my 3-year-old PC I can render a fairly complex interior scene at 8K in less than 15 seconds, and I can animate a 3-minute video in an hour. All pretty cool, and I can still continue to model in SketchUp.

Consider the image below. On the left is the original photo; on the right is the EEVEE render, done in under 15 seconds (after a 4-minute bake, which you do once).

This course is targeted at SketchUp users, and many beta testers, including big names from the SU community, were involved. It's called Definitely EEVEE: Definitive Interiors, and if you're a SketchUp user, you may want to take a look at it:

Hi,
Take a look at the Archipack 2.x add-on for Blender 2.8, a set of fully parametric architectural primitives! Besides EEVEE, it's probably the best reason for any arch-viz user to learn Blender.

Blender archipack official website

Yep! Truly a great resource!

This is cool:

Blender 2.80 official release. Woohoo. Very slick, and it doesn't cost a cent. Pretty impressive.

Chipp, is it possible to render full animations (for example, flying a camera through architecture) in EEVEE?

It’s possible!

Thanks a lot, monospaced!
Do you use Blender on your Mac? Can you tell me a little about the performance (in EEVEE, with your Radeon 560)?

I don’t, sorry. But there are countless videos, and one might argue that animating with EEVEE is exactly what Blender is for.

As @monospaced said, yes you can. In fact, you can enable motion blur on the camera as well. Each frame takes a few seconds to render. I cover all of it in my course, Definitely EEVEE: Definitive Interiors.
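A minimal EEVEE animation-render sketch, assuming Blender 2.8x property names; the frame range, fps, and output path are placeholders to adapt:

```python
import bpy  # run inside Blender 2.8x

scene = bpy.context.scene
scene.render.engine = 'BLENDER_EEVEE'

# EEVEE motion blur in 2.8 covers camera motion
scene.eevee.use_motion_blur = True
scene.eevee.motion_blur_shutter = 0.5  # shutter time, fraction of a frame

# Placeholder frame range and output settings
scene.frame_start = 1
scene.frame_end = 240          # e.g. 10 s at 24 fps
scene.render.fps = 24
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.filepath = "//flythrough.mp4"  # // = relative to the .blend

# Render the whole animation (writes the video to filepath)
bpy.ops.render.render(animation=True)
```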

It works on Mac as well, but probably not quite as fast. All depends on your hardware config. You can ask over at BlenderArtists.org.

Someone asked me with regard to this thread if I’ve tried Unreal or Twinmotion as possible rendering solutions and I thought I’d share my reply:

I have not used either of those but I have had a couple years experience with Unity before switching to Blender.

Here are my thoughts. I’d rather use an all-in-one solution that makes no tradeoffs, for a few reasons.

  1. I can be more iterative, going back and forth between design and final renders; in fact, I can even model in fully lit and rendered mode.

  2. Blender’s node-based shaders allow for more interactive realism. Consider this video:

This was rendered, AND the live characters composited, in Blender’s EEVEE real-time rendering system in a single pass. No separate compositing passes; think of the people as alpha-mapped trees in SketchUp, only with animation. It was basically modeled and rendered by a single individual.

Now of course that’s not for everyone, but it does show what’s possible.

  3. I prefer NOT to have to learn and keep up with several different software applications, not to mention the ever-changing subscription models and yearly maintenance fees.

  4. I can easily model anything I can think of in Blender. Consider this rather complex object, which was modeled in only a couple of hours, then rendered both SketchUp-style and in EEVEE.

Hope that helps you understand where I’m at.

I love the “Touch of Evil” style continuous shot. I’m really curious how the camera move is coordinated relative to the virtual world and relative to the live actor.

That’s a really good question. I believe this is all done by eyeball, as the characters themselves are actually “part of the scene.”

Check out the video below and pay particular attention to 2:27, where it shows an alpha-mapped character “in the scene” in Blender’s interface. Also watch how he cheats her walking across the walkway after the truck drives by; she literally takes only one step.

I believe doing things this way removed the need for all but the most basic motion tracking, which can also be done in Blender.

Excellent, thanks. One of the last “camera movements” is actually the actor turning, and then turning back again.
