The key thing to understand and remember is that “display precision” means exactly that: the number of decimal places, or the smallest fraction denominator, that will be shown in the display. It does not mean that your dimensions will be constrained or snapped to that increment; it only controls the smallest increment you want to see on screen.
Seeing ~ in places like dimensions, the Entity Info window, and reports such as CutList means the value shown is approximate: the true value deviates from the displayed one by less than the display precision. This is an almost certain indicator that you are not modeling accurately.
As an explicit example, you can set the units precision to 1/4 inch and then enter an edge length of 5.3768". The edge will actually be drawn at that odd length, but it will display as ~5 1/2".
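The rounding behavior in that example can be sketched in a few lines. This is just an illustration of the idea, not SketchUp's actual code; the function name and tolerance are invented for the sketch:

```python
from fractions import Fraction

def display_length(value_in, precision=Fraction(1, 4)):
    """Round a stored length (inches) to the display precision.
    Prefix '~' when the stored value deviates from what is shown."""
    steps = round(value_in / float(precision))
    shown = Fraction(steps) * precision
    approx = abs(float(shown) - value_in) > 1e-9  # any hidden deviation?
    whole = shown.numerator // shown.denominator
    frac = shown - whole
    text = f"{whole}" + (f" {frac}" if frac else "")
    return ("~" if approx else "") + text + '"'

# The stored edge is still exactly 5.3768" long; only the display rounds.
print(display_length(5.3768))  # → ~5 1/2"
print(display_length(5.25))    # → 5 1/4"
```

Note that the ~ appears whenever the stored value and the rounded display value differ at all, which is why it is such a reliable warning sign.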
There is a length snapping option in Model Info > Units, but it also does exactly and only what it says: the length of a line you draw will snap to that increment. It does not mean that you are working on a grid. If the start point of an edge was misaligned, the end point will snap to another misaligned point some multiple of the length snap away. That leads to accumulated errors that can make a model very difficult to repair. For that reason, most experts recommend that you turn off the length snapping option and leave it off! Instead, rely on typing exact values and inferencing from known good points to get accurate lengths.
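Why length snapping doesn't fix misalignment can be seen with a tiny sketch. The snap increment and function below are hypothetical, chosen only to illustrate that snapping rounds the length of the edge, never its endpoints:

```python
SNAP = 0.0625  # hypothetical 1/16" length snapping increment

def draw_edge(start, requested_length, snap=SNAP):
    """Length snapping rounds the *length* to the nearest increment;
    the endpoint is simply start + snapped length."""
    length = round(requested_length / snap) * snap
    return start + length

start = 5.013                 # start point is already off the intended grid
end = draw_edge(start, 1.02)  # the length snaps cleanly to 1.0 ...
print(end)                    # ... but the endpoint (6.013) is just as misaligned
```

Each new edge inherits the misalignment of the point it starts from, which is exactly how the errors accumulate.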