AMD and Nvidia leaks show we are drunk on power, and the hangover is going to be brutal

Over the past few weeks, a string of leaks has told us more about the upcoming next-generation graphics cards and processors, and if what we’ve heard is accurate, the industry seems to have concluded that energy efficiency and conservation are for newbies and suckers.

First off, there have long been rumors that Nvidia’s next-generation Lovelace graphics cards will be power hogs, but earlier this week the reliable Twitter leaker Kopite7kimi posted alleged specifications for a top-tier Nvidia RTX 4000-series card, possibly a Titan-class part, with as much as 800W of power draw.

Now we’re hearing from Wccftech that the soon-to-be-announced AMD Ryzen 7000-series desktop processors appear to be dropping any pretense of efficiency as well, with a reported 170W TDP for the top-tier Ryzen 9 chip.

Pair those two components together and nothing else, and the processor and graphics card alone would consume nearly a kilowatt of power; add anything more and the system will unquestionably cross the 1000W threshold.
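To put that math in perspective, here is a minimal back-of-the-envelope sketch in Python. The 800W and 170W figures come from the leaks discussed above; the estimate for the rest of the system is a rough assumption purely for illustration.

```python
# Back-of-the-envelope power budget for the rumored system.
# Only the GPU and CPU wattages come from the leaks above;
# the "rest of system" figure is an illustrative guess.
components = {
    "GPU (rumored top-tier RTX 4000-series)": 800,   # leaked figure
    "CPU (rumored Ryzen 9 7000-series TDP)": 170,    # leaked figure
    "Motherboard, RAM, storage, fans (assumed)": 75, # rough assumption
}

cpu_gpu_only = components["GPU (rumored top-tier RTX 4000-series)"] + \
               components["CPU (rumored Ryzen 9 7000-series TDP)"]
total = sum(components.values())

print(f"CPU + GPU alone: {cpu_gpu_only} W")       # ~970 W, about a kilowatt
print(f"With the rest of the system: {total} W")  # comfortably over 1000 W
```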

This would very likely be the best gaming PC ever built, but is it even worth it at this point?

Do we really need this much power?

Many of the top graphics cards are already energy hogs, such as the Nvidia RTX 3090 Ti with its 450W TGP. It is undeniably powerful and can make even the best PC games look amazing, but having had the chance to play those games on all of this top-tier hardware, I can honestly say that while the 4K eye candy you get from an RTX 3090 Ti is very real, an RTX 3070 or even an RTX 3060 Ti looks more than good enough for the vast majority of people.

Likewise, a 170W AMD Ryzen 9 7950X would undoubtedly be a highly powerful processor, but most of that power would be squandered in the consumer market. With this much headroom, multitasking might be a breeze, but it becomes the processing equivalent of trying to balance a bottle on your nose, ride a circus bear, and juggle six knives at once.

It’s a remarkable achievement, but ultimately it’s just for show. No one’s daily life actually involves enough tasks that call for this level of performance.

Prior to Alder Lake, Intel appeared to be making progress toward better CPU efficiency, but the 12th-gen chips seem to have undone much of that progress in order to reclaim the company’s best-in-class performance.

Accepting good enough

There is a belief that chipmakers should pull off this kind of feat every one to two years, and that performance gains of even 1.25x or 1.5x can be deemed a success. For Nvidia Lovelace, some are predicting a 2x performance boost, and who knows what Intel Raptor Lake will offer.

At some point, we’re amassing all this computing power at the consumer level simply because we can. Then we just go and use it to stream Netflix.

This is not to argue that performance improvements aren’t worthwhile, but we should try to match performance to our needs rather than delivering this kind of performance first and then searching for new applications for it. At the very least, that cannot always be the default assumption.

There would be nothing wrong with Nvidia publicly stating that the RTX 4090 isn’t any more powerful than the RTX 3090, but that it consumes 50% less energy or costs only 25% as much. Instead, value and efficiency seem to have been neglected entirely, which is less a mistake than something closer to unethical.

Performance at all costs imposes real, concrete costs

There are two major issues with performance being the only metric that seems to matter anymore.

First, energy is not free, and neither are its environmental and economic costs. Rising carbon emissions are already predicted to render large, densely populated areas of the Earth partially, if not completely, uninhabitable. Egregiously wasting finite energy resources means producing extra carbon emissions just to meet our genuine needs, and that trade-off simply isn’t worth it.

Most people assume the issue can be dealt with tomorrow because the effects are thought to be far off in the future. That is simply untrue, as evidenced by the recent heatwave in Europe and the ongoing wildfires in the western United States, not to mention one of the worst droughts in recent memory in parts of the Global South, which receives very little attention, if any, compared with the coverage of middle- and upper-class families fleeing their suburban homes in California.

What will it take?

If that doesn’t persuade us to think more critically about what we mean by “progress,” consider a plain and obvious economic fact: chasing this level of performance will only drive up the price of these products, pushing even more people out of the market as families struggle with inflation and rising energy costs.

The current generation of graphics cards is already out of reach for most people because they are simply too expensive, and that trend looks set to continue, turning technology essential to the modern economy into something only the well-off can afford. That applies whether it’s affluent gamers buying wildly overpowered showpieces or rich countries funding research into these increasingly expensive technologies while universities in poorer countries get pushed aside.

All of this is a recipe for widening social divides at a time when everyone is going to be under more pressure than ever from a changing climate for everything from vaccines to drinking water.

I completely understand, because I adore computers and have always played PC games. But I can also assure you that, as amazing as the RTX 3090 Ti is, the thrill of its performance wears off quickly. At some point it’s okay to say, “You know, 60 to 70 fps at 1440p is good enough,” because, in all honesty, it is.
