People don't understand that those temperature limits come from the engineers who designed the chips. Engineers know more about semiconductors than a bunch of hardware enthusiasts.
The worst are people confusing heat generation with temperature. A CPU at 95°C drawing 200W generates less heat than a CPU running at 85°C drawing 350W.
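To make the point concrete, here's a minimal sketch (the wattages and duration are just the illustrative numbers from above, not from any datasheet): at steady state, essentially all the electrical power a CPU draws leaves it as heat, so heat output tracks power draw, not die temperature.

```python
def heat_energy_joules(power_watts, seconds):
    """Energy dissipated as heat over an interval, assuming steady state
    where electrical power in equals heat out (temperature plays no role)."""
    return power_watts * seconds

# One hour of load for each hypothetical CPU from the comment above:
cooler_chip = heat_energy_joules(350, 3600)  # 85 °C CPU drawing 350 W
hotter_chip = heat_energy_joules(200, 3600)  # 95 °C CPU drawing 200 W

# The chip running at the HIGHER temperature dumps less heat into the room,
# because it draws less power.
print(cooler_chip, hotter_chip)
```

The die temperature only tells you how hot the silicon sits given that power and the cooling attached to it; the wattage is what heats the room.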
Temperatures matter for maintaining the integrity of the hardware. If it operates within the limits the engineers defined, the hardware will outlast your upgrade cycle.
People need to stop SPECULATING about things they don't have a clue about. Thanks, YouTube, for being part of this problem.
Problem? What problem? The only problem is that you believe everything employees of large corporations say.
Engineers are just employees. If the CEO of Intel or AMD says that a CPU has to run at X temperature to achieve Y performance in order to reach Z profit margin, instead of making a bigger CPU that is more balanced in consumption and temperature, engineers are expected to comply with that directive. They don't hold the ultimate decision-making power in the company.
The higher the operating temperature of a silicon-based CPU, the faster the silicon degrades. This degradation is driven by diffusion of impurities and by defects introduced during manufacturing steps such as thermal oxidation and doping. Degradation is faster at higher temperatures, but it still occurs below 100°C; there are studies showing this.
Degradation rate scales directly with operating temperature. But CPUs are known to last for decades, so even if that lifespan is cut in half, you still wouldn't complain, because the product would have become obsolete or failed for some other reason long before.
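A common (though simplified) way to model how temperature accelerates wear-out is an Arrhenius acceleration factor. This is a hedged sketch, not a model of any specific CPU: the activation energy `ea_ev = 0.7` eV is an illustrative assumption, and real degradation mechanisms each have their own parameters.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    """Rough factor by which wear-out speeds up at t_stress_c relative
    to t_use_c, using the Arrhenius model. ea_ev (activation energy)
    is an assumed illustrative value, not from any vendor spec."""
    t_use = t_use_c + 273.15      # convert °C to kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Example: running at 95 °C instead of 65 °C, under these assumptions
print(acceleration_factor(65, 95))
```

Under these assumed numbers the factor comes out to several times faster wear at 95°C than at 65°C, which is consistent with the point above: degradation accelerates with temperature, yet a part rated for decades can lose a large fraction of its lifespan and still outlive its usefulness.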