
Are Intel/AMD/NVIDIA the last overclockers?

--

Well-known overclocker Tobias “Rauf” Bergström speaks out, arguing that hardware manufacturers are pushing their components so close to the breaking point that overclocking has essentially become redundant.

This is a column by Tobias “Rauf” Bergström. Here he presents his personal opinion.

Overclocking, that is, tuning a computer component to perform beyond its factory specification, has existed for about as long as computers themselves. The heyday began somewhere around the turn of the millennium, however, when every other person with the slightest interest in computers overclocked their components.

The allure was in getting free performance and actually being able to play games that otherwise wouldn’t have run on your system. Back then, there weren’t many graphics settings, and 30 fps was considered smooth. The margins for overclocking were also greater. It was not impossible to buy a graphics card for SEK 2,000 and have it perform as well as, or better than, the top model on the market, which at the time cost around SEK 4,000.

In recent years, however, interest in and the benefit of overclocking have decreased significantly. There are still quite a few who enjoy testing limits, pushing their system and seeing what it can do. But everyday overclocking has largely ceased. Or has it, really?

Overclocking developed the industry

If we look back to the glory days around the turn of the millennium, it is very clear how big an impact overclocking had on the component industry. Just look at how the cooler size and power draw of graphics cards have evolved since the early 2000s.

What did the overclockers do? Well, we mounted a CPU cooler on the graphics card, with much greater cooling capacity. After that, we could tune voltage and performance to completely different levels, and without the card having to sound like a jet engine. Manufacturers soon followed suit, larger and more powerful coolers became standard, and that in turn opened the door to more power and more powerful hardware.

There are plenty of other examples where overclocking has shown the industry the way forward: water cooling, the overclocked RAM that led to XMP and EXPO profiles, better power regulation, more power connectors, and more. Above all, two things have really left their mark: better utilization of clock frequencies, and power draw. It wasn’t that long ago that a processor kept the same frequency regardless of whether one core or all cores were loaded, which of course left a lot of capacity unused. Then boost functions were invented, and they are now standard with all of the big three: Intel, Nvidia and AMD.

The boost functions became the new overclocking

The boost functions are in practice nothing more than automatic overclocking that kicks in when temperature, load and power allow it. They have since been developed and refined, and now push the components so close to their maximum capacity that there is no longer much margin left for manual overclocking.
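The core idea can be sketched as a simple control loop: raise the clock while temperature and power stay under their limits, back off otherwise. The limits, step sizes and telemetry values below are invented for illustration; real boost logic (Intel Turbo Boost, AMD Precision Boost, Nvidia GPU Boost) is far more elaborate and runs in firmware.

```python
# Toy model of a boost algorithm. All numbers are illustrative.
def boost_step(freq_mhz, temp_c, power_w,
               temp_limit=95.0, power_limit=250.0, step=25):
    """Return the next clock frequency given current telemetry."""
    if temp_c < temp_limit and power_w < power_limit:
        return freq_mhz + step            # headroom left: boost up
    return max(freq_mhz - step, 3000)     # at a limit: back off toward base clock

freq = 4500
# Simulated telemetry samples: (temperature in C, power in W)
for temp, power in [(70, 180), (85, 230), (96, 260), (92, 240)]:
    freq = boost_step(freq, temp, power)
print(freq)  # ends at 4550 MHz after one back-off at the 96 C sample
```

The point of the sketch is that the clock is governed by the same limits a manual overclocker works against, which is exactly why the automatic function eats the manual margin.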

Basically, that’s fine. The manufacturers have found ways to make better use of the hardware without constantly carrying large safety margins below what it can actually handle. A problem has arisen, however: the development of manufacturing processes has stalled, and Moore’s law can no longer be considered to apply. This has forced manufacturers to resort to new tricks to produce new, faster products. When manufacturing technology fails to deliver, manufacturers reach for the only lever they have left: power!

The manufacturers have pushed the “overclocking” too far

In recent times, manufacturers have pushed their products by constantly increasing power draw, in a way that can no longer be considered healthy. Those of us who do more extreme overclocking are of course no strangers to cranking up the power, but only for short bursts, to push the components and chase records.

Running components at very high power continuously simply does not make sense. Even if the cooling can handle it, it draws a lot of electricity and produces a lot of waste heat. It also tends to mean relatively loud fans to carry that heat away. My children’s rooms feel a little tropical when I walk in after they have been gaming for a while, and they don’t even have the most power-hungry components.


That graphics cards today can draw up to 600 W and processors over 300 W is, in my eyes, simply not reasonable. Above all, it is the last few percent of performance, squeezed out at a very large increase in power, that makes it so strange that manufacturers persist. Is 40 percent higher power really worth 5–10 percent higher performance, which is roughly the trade-off for, say, the RTX 4090?
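The arithmetic behind that question is worth spelling out. Using the column’s own rough figures (about 40 percent more power for 5–10 percent more performance), the absolute performance and power numbers below are placeholders, not measurements, but the efficiency ratio only depends on the two percentages:

```python
# Rough perf-per-watt comparison using the trade-off from the text.
base_power_w = 320.0                    # hypothetical capped power draw
base_perf = 100.0                       # normalized performance at the cap
pushed_power_w = base_power_w * 1.40    # +40% power
pushed_perf = base_perf * 1.07          # +7% performance (midpoint of 5-10%)

eff_base = base_perf / base_power_w
eff_pushed = pushed_perf / pushed_power_w
print(f"capped: {eff_base:.3f} perf/W")
print(f"pushed: {eff_pushed:.3f} perf/W")
print(f"efficiency loss: {(1 - eff_pushed / eff_base) * 100:.0f}%")  # ~24%
```

In other words, the last push costs roughly a quarter of the card’s energy efficiency for a single-digit performance gain.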

Undervolting has taken over overclocking

A clear sign that manufacturers really have gone too far in pushing their products is that enthusiasts no longer overclock. Not just because the margins no longer exist, but because they actually prefer slightly lower-performing products.


The goal is of course not for the products to perform worse; that is often just a side effect. The goal is to find settings that give, for example, the best energy efficiency, by lowering the voltage, power limit or frequency. By experimenting with these settings, it is possible to get significantly lower energy consumption and cooler, quieter components, with performance affected only marginally.
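The tuning loop an undervolter runs by hand can be sketched like this: try a handful of voltage offsets, measure performance and power at each, discard anything unstable, and keep the most efficient setting. The candidate offsets and “measurements” below are made-up numbers, not real hardware data.

```python
# Hypothetical undervolting sweep. Each entry:
# (label, performance score, power draw in W, passed stress test)
candidates = [
    ("stock",   100.0, 300.0, True),
    ("-50 mV",   99.0, 255.0, True),
    ("-100 mV",  97.5, 220.0, True),
    ("-150 mV",  96.0, 195.0, False),  # crashed under load: unusable
]

stable = [c for c in candidates if c[3]]
best = max(stable, key=lambda c: c[1] / c[2])  # best performance per watt
print(best[0])  # "-100 mV": ~2.5% slower, ~27% less power
```

The pattern matches what the text describes: the winning setting is not the fastest one, but the one that trades a marginal performance loss for a large drop in power.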

It may be that the pendulum has swung a little too far, and it is time for the manufacturers to take a step back and give us more well-balanced products. NVIDIA’s 5000 series is not far away; time will tell whether they have addressed the many shortcomings of the 4090 cards, or whether we get another generation pushed so hard to its limits that even the power connectors melt.


The original article is in Swedish.

Tags: Intel, AMD, NVIDIA, overclockers
