As AMD and Nvidia begin to roll out their latest graphics cards, one thing is clear: AMD is re-establishing itself as the industry leader when it comes to affordability, and I couldn’t be happier.
I’ve had the privilege of playing many of the best PC games over the years on the best graphics cards on the market, as well as a number of the best cheap graphics cards, and some things are coming into focus in ways that weren’t visible before the RTX era.
First, we all know that graphics cards are getting more expensive, especially high-end cards, and in the Ampere-and-Big-Navi era the price gap between the two major card makers narrowed (excluding the RTX 3090 and RTX 3090 Ti, which didn’t have a competing AMD Radeon RX card to price against).
We can also recognize that simply running these cards has become much more expensive, both in terms of the additional hardware required and your actual electricity bill. In fact, not long ago I wrote a damning opinion piece on this very subject.
Now that AMD has released its Ryzen 7000 series chips, and especially now that it has announced its Radeon RX 7000 series graphics cards, I realize I may have been too hasty in lumping AMD in with the worst offenders in this regard.
When good enough is good enough
One of the things that has always dogged the best graphics cards is that you quickly reach a point where you have far more power than you really need, and the RTX 4090 is a perfect example of that.
It’s without a doubt the most powerful consumer graphics card in the world, but unless you’re a creative pro who needs this level of raw performance, it’s definitely overkill for anything else.
Yes, it can play Cyberpunk 2077 at 4K with every setting maxed out and still get above 40 fps natively, but what’s the point? You can do much better with an RTX 3080 with DLSS set to Performance. And honestly, it looks just as good, especially if you don’t compare the two side by side.
And that’s taking into account that Cyberpunk 2077 is one of the most demanding games out there. Most PC games don’t push anywhere near that hard.
The RX 7900 XTX, meanwhile, seems to land somewhere between the RTX 3090 and RTX 4090 in terms of performance, which is just about everything you’ll ever need for gaming.
Beyond this point, you’re really just paying the extra $600/£600 for bragging rights. Even the Nvidia RTX 4080, which is not yet on sale, has a significantly higher suggested retail price, so even when you compare the Radeon RX 7900 XTX to its closest competitor, it comes out ahead.
Ultimately, if the RX 7900 XT and RX 7900 XTX come close to their promised performance, it will be very hard to recommend anything else to anyone outside the super-enthusiast set.
About those power cables…
There is also the issue of the 12VHPWR power cable that Nvidia has been using since the RTX 3000 series.
This cable, which takes four standard 8-pin connectors that come with all recent power supplies and converts them into a single 16-pin power connector, has been in the news lately. RTX 4090 customers have reportedly seen their very expensive graphics cards burned out due to faulty power adapters and, in at least a few cases, native 12VHPWR cables from ATX 3.0 power supplies.
We’ve seen nothing wrong with the power cable on our RTX 4090 review unit, and without the results of an official investigation by Nvidia and its partners, or independent testing that can verify the issue, it’s better to treat these as potentially isolated incidents involving those individual cables rather than a more systemic problem (for now).
But you know what is a systemic problem? Creating a proprietary power connector that requires additional investment from consumers who have already spent a lot of money on a graphics card. Sure, it comes with an adapter, but there’s something to be said for a graphics card that just uses the same 8-pin connectors everyone else uses, and that’s the route AMD took for the RX 7900 XTX. Point, AMD.
And those power requirements…
There’s a new benchmark indicating the RTX 4090 Ti is coming, and while it looks impressive by the numbers, the RTX 4090 already has a 450W power requirement and can be overclocked to draw well north of half a kilowatt, which is insane. What would an RTX 4090 Ti look like? Do we really want to know right now?
Ad campaigns are currently running to encourage people in the UK to flee this winter’s expected high energy costs by taking a 30-day trip to Europe, on the premise that it’s cheaper than heating their homes. Is that an exaggeration? I have no idea, but the resigned shrugs I hear from some British colleagues about the prospect of higher energy bills tell me it’s at least plausible, if not factual.
Aside from climate change and the many problems inherent in that nightmare, Nvidia and Intel seem to have decided that the way to stay on top is to brute-force their way to dominance by pushing as much power as possible through their transistors, which is an increasingly expensive proposition.
Even in the US, utility bills are higher than they used to be, and running an obscenely powerful graphics card, processor, or both for the pleasure of 30 to 40 fps more than the 90+ fps you’d get with a lower-power card just isn’t a worthwhile trade-off for the vast majority of people.
This was probably the biggest complaint in my aforementioned op-ed, and here, too, AMD is at least making progress. Keeping the RX 7900 XTX’s board power at just 335W is incredibly impressive, and if AMD is able to squeeze the kind of performance it claims out of that relatively low power consumption, I’d be sold.
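To put that 115W gap between the RTX 4090’s 450W and the RX 7900 XTX’s 335W board power in perspective, here’s a quick back-of-the-envelope sketch. The gaming hours and the price per kWh below are assumptions for illustration only, not figures from either vendor, and real-world draw varies with load:

```python
# Rough annual running-cost comparison between a card rated at 450W
# (the RTX 4090's board power) and one rated at 335W (the RX 7900 XTX's
# stated board power). Daily hours and price per kWh are assumptions.

def annual_energy_cost(board_power_w: float, hours_per_day: float,
                       price_per_kwh: float) -> float:
    """Yearly electricity cost of running a card at its full board power."""
    kwh_per_year = board_power_w / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

HOURS = 3.0    # assumed daily gaming time
PRICE = 0.34   # assumed price per kWh in GBP (roughly UK, late 2022)

cost_450w = annual_energy_cost(450, HOURS, PRICE)
cost_335w = annual_energy_cost(335, HOURS, PRICE)
print(f"450W card: £{cost_450w:.2f}/yr, 335W card: £{cost_335w:.2f}/yr, "
      f"difference: £{cost_450w - cost_335w:.2f}/yr")
```

Under those assumptions the gap works out to a few tens of pounds a year at stock power, and it only widens if you overclock the 450W card toward that half-kilowatt mark.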
Add to that the fact that the AMD Ryzen 7000 series, while not the most powerful out there and not exactly low-wattage, is a big step ahead of Intel’s throw-more-power-at-the-problem approach to better performance.
We’ve yet to see how well the RX 7900 XTX and RX 7900 XT actually perform, so time will tell, but at this point I’m already sold on AMD this generation, and I can’t imagine I’m the only one.