The problem isn’t ultimately preventable; an application can either ignore it or choose a threshold small enough that it rarely kicks in.

The problem is that computer arithmetic is finite. To illustrate, suppose that your computer worked in decimal and could only represent values using three digits. Then the best possible representation for 1/3 is 0.333. But 3 * 0.333 = 0.999, which is not 1. So if SketchUp, during an intersection operation, calculated a point location as 3 * (1/3) = 0.999, there would be a problem if another point is located at exactly 1. Should they be the same, or is the calculated point different? Having a threshold for “same” is the only solution.
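The same effect shows up in binary floating point on a real computer. Here is a small Python sketch of the idea (the function name `same_point` and the tolerance value are purely illustrative, not anything from SketchUp's code):

```python
def same_point(a, b, tol=0.005):
    """Treat two coordinates as 'the same' if they differ by less than tol.

    The tolerance here is an arbitrary illustrative value, not SketchUp's.
    """
    return abs(a - b) < tol

# The three-digit decimal analogy, replayed in binary floating point:
# 1/3 cannot be represented exactly, so the round trip misses slightly.
calculated = 3 * 0.333   # stands in for 3 * (1/3) after rounding
exact = 1.0

print(calculated == exact)            # exact comparison: False
print(same_point(calculated, exact))  # tolerance comparison: True
```

Exact equality fails even though the two values are geometrically "the same" point for any practical purpose, which is exactly why a threshold is unavoidable.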

If nearby Vertices were not merged as being the same, all sorts of other issues would result. Models would bloat, because there would be potentially vast numbers of Vertices all very close to each other. Faces would fail to be created, because their “corners” would be split into two points instead of one. The messes would be far more frequent and uglier than the existing scaling problem.
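To make the merging idea concrete, here is a hypothetical sketch of tolerance-based vertex merging. Nothing here is SketchUp's actual code or API; the function name, the brute-force approach, and the tolerance value (chosen in the spirit of SketchUp's reported ~0.001-inch tolerance) are all assumptions for illustration:

```python
TOLERANCE = 1e-3  # illustrative; SketchUp's is reportedly about 0.001 inch

def merge_vertices(points, tol=TOLERANCE):
    """Collapse 2D points closer than tol (per coordinate) into one vertex."""
    merged = []
    for p in points:
        for q in merged:
            if all(abs(a - b) < tol for a, b in zip(p, q)):
                break  # p is "the same" as an existing vertex; drop it
        else:
            merged.append(p)  # no existing vertex nearby; keep p
    return merged

# Two of these four "corners" differ only by rounding noise:
corners = [(0.0, 0.0), (1.0, 0.0), (0.9995, 0.0004), (1.0, 1.0)]
print(merge_vertices(corners))  # the two near-identical corners become one
```

Without the merge step, the face-building code would see four distinct corners where the modeler intended three, and the face would fail to close.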

So, the real question for SU 2016 isn’t whether such a threshold is needed; it’s whether the value could be better chosen, or somehow made smarter, so that there are fewer situations where novices are confused and frustrated by this behavior.