Relation between GPU utilization and graphics card's power consumption

The utilization figure tells you how much more of the same kind of work the GPU could take on, not what fraction of the chip's total processing capability that work is using.

For example, your 92% means that, on average, the GPU did something during 920,000 out of every 1,000,000 clock cycles. It doesn't mean that 92% of every single circuit in every single shader processor was active, let alone 92% of every single circuit on the whole board (the VRAM controller, DAC, shaders, raster units, texture lookup units, and so on).
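The arithmetic behind that kind of figure can be sketched as follows. The cycle counts here are invented for illustration; real drivers derive the number by sampling activity over a time window rather than counting individual cycles.

```python
# Hypothetical cycle counts illustrating how a utilization percentage
# is derived: the fraction of cycles in which *any* unit did work.
total_cycles = 1_000_000
busy_cycles = 920_000  # cycles during which at least one unit was active

utilization = busy_cycles / total_cycles
print(f"{utilization:.0%}")  # → 92%
```

Note that a cycle counts as "busy" even if only one unit out of the whole chip did anything during it, which is why the percentage says nothing about how much of the silicon is actually working.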

If your workload exercises only a few of the GPU's functional units, it may well run at 100% of those units' throughput while leaving half the chip asleep. But the half that's asleep couldn't have been used for that type of work anyway.
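This is also why power draw tracks utilization so loosely. A toy model makes the point; every unit name and wattage below is invented, and real power gating is far more fine-grained:

```python
# Toy model (all numbers invented): two workloads can both report "100%
# utilization" yet draw very different power, because they keep different
# subsets of the chip's units busy.
UNIT_POWER_W = {          # hypothetical active-power cost per unit type
    "shader_cores": 150,
    "texture_units": 40,
    "raster_units": 30,
    "vram_controller": 50,
}
IDLE_POWER_W = 30         # hypothetical board power with all units gated off

def board_power(active_units):
    """Board power when the given units are active and the rest sleep."""
    return IDLE_POWER_W + sum(UNIT_POWER_W[u] for u in active_units)

# A compute-only kernel keeps just the shaders and memory busy...
print(board_power({"shader_cores", "vram_controller"}))  # 230
# ...while a full rendering workload lights up every unit.
print(board_power(UNIT_POWER_W.keys()))                  # 300
```

Both runs would show full utilization, since some unit is busy on every cycle, yet the first leaves the texture and raster hardware asleep and draws correspondingly less power.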