s8n
varishangout.com
Hey dudes, I'm spreading the word. Make sure you're aware.
Imagine this scenario. You've been saving up for a long time for computer upgrades and, for the first time in at least ten years, you have the opportunity to get the biggest, baddest computer that money can buy. You need it to last a good long while and to give you a return on your investment. Maybe you'll spend a lot of time playing video games, or you'll generate AI images, make music, or whatever else you do with your primary computer. You build it, everything works great for a couple of weeks, then your house burns down and destroys everything inside.
That's the scenario we're looking at with at least the 5090 series card.
When designing computer peripherals we always leave a 20% power margin on any connector. If you have a 1,000-watt connector you pull 800 watts maximum. If you have a 500-watt connector you pull 400 watts maximum. This is an industry standard practice to cover issues like momentary load or DC inrush that could otherwise overheat a cable and start a fire. We do this at every stage of the development process and at every point where there is a power interconnect: you are only allowed to pull 80% of the rated maximum of the wall outlet, 80% of the rating of any extension cord in the chain, 80% of the rated output of the power supply, and so on.
Early reports from multiple YouTubers show the RTX 5090 pulling 570 watts against a combined derated maximum of 480 watts.
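To make the arithmetic concrete, here's a minimal sketch in Python of the derating check described above. The 600-watt connector rating is my assumption (480 watts is 80% of 600, which lines up with the figure above), and the wall outlet and power supply ratings are illustrative placeholders, not measured values.

# Minimal sketch of the 80% derating rule described above.
# Assumed: GPU connector rated 600 W (480 W is 80% of that, matching
# the number in the post); outlet and PSU ratings are illustrative.

def derated_limit(rated_watts, margin=0.20):
    """Maximum continuous draw allowed on a connector after derating."""
    return rated_watts * (1.0 - margin)

# Every link in the power chain gets the same 80% treatment:
chain = {"wall outlet": 1800, "power supply": 1000, "GPU connector": 600}
for name, rating in chain.items():
    print(f"{name}: {rating} W rated -> {derated_limit(rating):.0f} W max draw")

# Reported draw vs. the connector's derated limit:
reported_draw = 570
limit = derated_limit(600)  # 480 W
print(f"Reported 570 W is {reported_draw / 600:.0%} of the connector rating, "
      f"{reported_draw - limit:.0f} W over the 80% limit of {limit:.0f} W")

Run it and the last line shows the problem: 570 watts is 95% of the assumed connector rating, 90 watts past the limit the derating rule allows.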
There are two possible scenarios here:
1. You buy it, you put it together, everything goes as well as it possibly can, and your computer catches fire.
2. Nvidia leaves the card as-is until launch day, then releases a BIOS update that underclocks the card by 20%.
Neither situation is good. Caveat emptor: do not buy the RTX 5000 series at launch.