I investigated this - it’s getting a bit technical, but here’s what I’ve learned:
For those who aren’t aware, Nvidia Optimus uses both the integrated GPU (part of the Intel CPU chip) and the Nvidia RTX GPU. The idea is that the integrated GPU (tiny, weak, but efficient) handles office tasks to save battery life, while the RTX handles the heavy-duty work. That could be very useful, since a high-powered laptop has terrible battery life (battery size being something that is regulated by the airline industry).
Until the RTX 3000 series, only the lower-spec GPUs came with Optimus (2050 and below). Now it’s available on higher-end models, adding to the appeal…
With Optimus, the integrated GPU often becomes a bottleneck, because the RTX isn’t completely independent of it. In some applications there is a 50% performance penalty on a laptop with Optimus compared to the same hardware spec without it. That can make it not worthwhile to pay for the high-spec RTX in the first place!
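To make the “not worthwhile” point concrete, here’s a quick back-of-the-envelope sketch. The frame rates are made-up illustrative numbers (not real benchmarks), and the flat 50% penalty is the worst case quoted above — real overhead varies by application:

```python
# Illustrative only: hypothetical frame rates showing why a ~50% Optimus
# routing penalty can cancel out the benefit of a higher-tier GPU.
OPTIMUS_PENALTY = 0.5  # assumed worst-case penalty from the post above


def effective_fps(native_fps, optimus=False):
    """Return throughput after the (assumed) Optimus routing penalty."""
    return native_fps * (1 - OPTIMUS_PENALTY) if optimus else native_fps


# Hypothetical numbers, chosen only to illustrate the comparison:
high_end_native = 120  # a top-tier RTX, no Optimus in the render path
mid_range_native = 70  # a cheaper mid-tier RTX, no Optimus

print(effective_fps(high_end_native, optimus=True))    # 60.0
print(effective_fps(mid_range_native, optimus=False))  # 70
```

Under these assumed numbers, the expensive GPU behind Optimus ends up slower than the cheaper one without it — which is exactly why the benchmarks matter.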
What does this mean?
Information about specific laptop models - and performance testing/benchmarks - is critical. Even cable choice can have a big impact: using USB-C to connect an external monitor is apparently the best option, while HDMI can really hurt performance, since the HDMI port is usually routed through the iGPU.
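If you want to check for yourself which GPU is actually rendering, here’s a rough sketch for Linux laptops. It assumes the Mesa utilities (`glxinfo`, `xrandr`) and the proprietary Nvidia driver are installed; the `__NV_PRIME_RENDER_OFFLOAD` / `__GLX_VENDOR_LIBRARY_NAME` variables are Nvidia’s documented PRIME render-offload switches. Each command is guarded so the script exits cleanly on machines without these tools:

```shell
# Which GPU renders by default? (usually the iGPU on an Optimus laptop)
command -v glxinfo >/dev/null && glxinfo -B | grep "OpenGL renderer"

# List render providers - both the iGPU and the dGPU should appear
command -v xrandr >/dev/null && xrandr --listproviders

# Force the Nvidia GPU via PRIME render offload and compare the renderer string
command -v glxinfo >/dev/null && \
  __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
  glxinfo -B | grep "OpenGL renderer"

true  # keep the exit status clean when the tools are missing
```

Run a benchmark both ways (with and without the offload variables) and the difference is your laptop’s Optimus penalty.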