Probably one of the worst times for Nvidia to go all power hungry, on top of the price bloat. It's not just additional power for the GPU but also additional cooling for the room, and power supplies aren't 100% efficient: 450 W / 0.9 = 500 W from the wall. If AIBs go wild with power up to 600 W, that's 600 / 0.9 ≈ 666 W (Nvidia's magic number). I'd like to see what AMD has to offer on price/performance. This may be the generation I skip, after over two decades of not skipping a graphics card release. Nvidia's MSRP is utterly unreliable; I'm not interested in the FE since it's a huge bulbous brick, the AIBs' air-cooled versions are worse, and only a hybrid has a chance, but the price may be way above Nvidia's laughable MSRP guidelines.

I invested in a significant solar/battery system last year as well (Yank in the UK). Energy costs have gotten ridiculous over here, even before the war. ~20% of my net output now gets sold back to the grid. It feels good being able to power my house (and soon my car), and I'm intimately aware of my energy gain/loss. I now understand why my father yelled at us for leaving the lights on. Running a 450 W GPU for as much as I work and game would put a big dent in the curve I have now.
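The wall-draw arithmetic above is easy to sanity-check. A minimal Python sketch, assuming the post's 0.9 efficiency figure (roughly an 80 Plus Gold PSU under load; real efficiency varies with load level):

```python
def wall_draw(gpu_watts: float, psu_efficiency: float = 0.9) -> float:
    """Approximate AC draw at the wall for a given DC load.

    psu_efficiency of 0.9 is the post's assumption, not a measured value.
    """
    return gpu_watts / psu_efficiency

# The two cases from the post:
print(round(wall_draw(450)))  # 450 W GPU -> ~500 W at the wall
print(round(wall_draw(600)))  # 600 W GPU -> ~667 W at the wall
```

Note the 600 W case works out to 666.7 W, so the "666" in the post is the value rounded down.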
The market dynamics are certainly changing, but I'm betting that if most people did the math, even at present prices, they'd be more interested in performance per watt and in heat loss and dissipation.