Nvidia 5000 series extreme fire hazard

s8n

Hey dudes, I'm spreading the word. Make sure you're aware.

Imagine this scenario. You've been saving up for a long time for computer upgrades and you have the opportunity to get the biggest, baddest computer that money can buy for the first time in at least ten years. You're going to need it to last a good long time and you're going to need this computer to give you a return on your investment. You want to play some video games and will spend a lot of time doing it, or you're going to create AI images, or make music, or whatever else you do with your primary computer. You build it, everything works great for a couple weeks, then your house burns down and destroys everything inside.

That's the scenario we're looking at with at least the 5090 series card.

When designing computer peripherals we always leave a 20% power margin on every connector. If you have a 1,000-watt connector you pull 800 watts maximum. If you have a 500-watt connector you pull 400 watts maximum. This is an industry-standard practice that covers issues like momentary load spikes or DC inrush that could otherwise overheat a cable and start a fire. We apply it at every stage of the development process and at every point where there is a power interconnect: you are only allowed to pull 80% of the outlet's maximum. If there is an extension cord, you may use 80% of the extension cord's rating. If there is a power supply, you may use 80% of the power supply's rated output, and so on.

Early reports from multiple YouTubers show the RTX 5090 pulling 570 watts against a combined derated maximum of 480 watts.
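If you want to sanity-check that math, here's a quick back-of-the-envelope sketch in Python. The 600-watt connector rating is my assumption (it's the only rating that makes the 480-watt figure work out under the 80% rule), and the chain ratings in the example are hypothetical:

DERATE = 0.8  # rule of thumb: continuous draw <= 80% of any rating

def safe_limit(rated_watts):
    """Maximum continuous draw a single link should carry."""
    return rated_watts * DERATE

def chain_limit(ratings):
    """A power chain is capped by its weakest derated link."""
    return min(safe_limit(r) for r in ratings)

# Hypothetical chain: 15 A wall outlet (~1800 W), extension cord, PSU.
print(chain_limit([1800, 1500, 1000]))   # 800.0 W

# The thread's numbers: assumed 600 W connector vs. reported 570 W draw.
limit = safe_limit(600)                  # 480.0 W
print(570 - limit)                       # 90.0 W over the derated limit
print(570 / 600)                         # 0.95 -> 95% of absolute rating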

There are two possible scenarios here:

1. You buy it, you put it together, everything goes as well as it possibly can, and your computer catches fire.

2. Nvidia leaves the card as-is until launch day, then releases a BIOS update that underclocks the card by 20%.


Neither situation is good. Caveat emptor: do not buy the RTX 5000 series at launch.
 

Alucardthegreat

It's too expensive relative to the number of new games that actually demand more performance and aren't woke (pretty much none). Nvidia should just make a 16GB GTX card with midrange pricing and 250-300W power draw. It would sell like hotcakes.
 

s8n

Or just make some more 24GB cards with 300W power draw; those would fly off the shelves like you wouldn't believe, even if it's a previous generation of GDDR.
 

John Doe

Only mindless cattle buy these overpriced high-end/enthusiast GPUs for gaming. They deserve the housefires.
 

Ravielsk

Or just make some more 24GB cards with 300W power draw; those would fly off the shelves like you wouldn't believe, even if it's a previous generation of GDDR.
The lack of VRAM is quite literally a deal-breaker for these cards. Anyone who uses them at a professional level needs VRAM more than raw performance. By not increasing the VRAM even slightly, NVIDIA essentially ensured there is no reason to even contemplate upgrading.
 

Alucardthegreat

They don't support PhysX anymore. Not gonna buy, period. AMD may be the only choice once the RTX 4000 series becomes obsolete for me.
 

s8n

AMD's new card is 300W at a reasonable price; if it works and is actually available, it looks like the new de facto industry standard. If they can do it three times in a row they replace Nvidia as the primary vendor for consumer graphics hardware.

It looks like it has 16GB of VRAM, which is enough (not perfect, but enough) for most current-gen Stable Diffusion work.
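For context on what that 16GB has to cover, here's a minimal sketch of a current-gen Stable Diffusion (SDXL) run using Hugging Face diffusers; fp16 weights plus CPU offload is the standard way to keep peak VRAM comfortably under 16GB. The model ID is just the stock SDXL base, not anything specific to this card:

import torch
from diffusers import StableDiffusionXLPipeline

# fp16 roughly halves weight memory vs fp32; model CPU offload parks
# idle submodules in system RAM so peak VRAM stays well under 16 GB.
# (Offload requires the accelerate package to be installed.)
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
)
pipe.enable_model_cpu_offload()

image = pipe("a lighthouse at dusk, oil painting").images[0]
image.save("out.png")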
 

Lusty Daicon

If they can do it three times in a row they replace Nvidia as the primary vendor for consumer graphics hardware.
Jensen won't let that happen unless he's already decided to exit the gaming market. AMD's opportunity only exists right now because Nvidia messed up so badly, and that's an external factor AMD has no control over.
 

s8n

They did basically announce they no longer wish to support the gaming market, but I don't know how serious they are. Their pricing makes it look like they've abandoned the consumer market. $1,000 for a video card is attainable for at best the top 25% of income-earners in the PC space, and even then only once every 5 or 6 years. The average gamer should be running a $300 video card at 1080p, and the high-end gamer should be running a sub-$600 card at 1440p. 4K gaming is for when you're making over $120k a year and also hate yourself and want stuttering and obnoxious display issues on the desktop.
 

Lusty Daicon

Personally I want a 5080 because I have a hobbyist interest in AI and don't currently have a GPU. For weeks I've been trying to get one AT MSRP. There are used 3090s, and although they have more VRAM, I hear they're slower, they don't support AV1 encoding, and I don't really want to buy something that was potentially used for mining. AMD just isn't there yet for that use case; maybe that'll change when UDNA comes out.
 

s8n

The 5080 has a physically damaged processor with some of its circuitry unintentionally disabled, and the MSRP is over a thousand dollars. I hate to say it, but if you want one you're stupid.
 

Lusty Daicon

The 5080 has a physically damaged processor with some of its circuitry unintentionally disabled, and the MSRP is over a thousand dollars. I hate to say it, but if you want one you're stupid.
Suggest a better GPU that I could buy brand new with CUDA support, at least 16GB of VRAM, and a price of less than $1,000 (the 5080's MSRP is $1,000, not under it). I know the issues and I have the money for it. If it has some problem, I can RMA it.
 

s8n

Suggest a better GPU that I could buy brand new with CUDA support, at least 16GB of VRAM, and a price of less than $1,000 (the 5080's MSRP is $1,000, not under it). I know the issues and I have the money for it. If it has some problem, I can RMA it.
AMD's new card is 300W at a reasonable price; if it works and is actually available, it looks like the new de facto industry standard. If they can do it three times in a row they replace Nvidia as the primary vendor for consumer graphics hardware.

It looks like it has 16GB of VRAM, which is enough (not perfect, but enough) for most current-gen Stable Diffusion work.
 

s8n

Ngl, I don't know fuck-all about CUDA. Never used it; I've always had more CPU than GPU. A GPU is just a floating-point accelerator for my desktop. I only really care whether it plays video games and whether prebuilt AI tools work on it. If PyTorch works, that's all I need. It can run an LLM, and https://github.com/lshqqytiger/stable-diffusion-webui-amdgpu-forge ought to work fine with ZLUDA.

I think spending $1,000 on a video card is insane, and only getting 16GB of VRAM for it is insulting at best. I paid $500 for this card with 24GB last year.
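For the record, "if PyTorch works" is an easy thing to verify before buying. This little check runs identically whether the GPU shows up through a real CUDA build or through a ROCm/ZLUDA build that presents itself as a cuda device:

import torch

# ROCm and ZLUDA builds of PyTorch expose AMD GPUs through the same
# torch.cuda API, so one check covers both vendors.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print("device:", props.name)
    print("vram:  ", round(props.total_memory / 2**30, 1), "GiB")
else:
    print("no usable GPU backend found")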
 

Lusty Daicon

I agree that for most people AMD is the right choice, and I really hope they gain a lot of market share. If someone has no interest in AI, it's not even a question.
 

s8n

The Gamers Nexus review of the Radeon 9070 (non-XT) is absolutely glowing. It hits a metric I would have said was impossible; I would have bet money against it:

the Radeon 9070 matches the Nvidia 5080 in performance-per-watt, which is completely insane.
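Performance-per-watt is simple enough to check against a review's own charts: average FPS divided by board power. The numbers below are made-up placeholders to show the shape of the comparison, not Gamers Nexus data:

def perf_per_watt(avg_fps, board_power_w):
    """Frames rendered per second per watt of board power."""
    return avg_fps / board_power_w

# Placeholder figures only -- substitute real review measurements.
for name, fps, watts in [("Radeon 9070", 100.0, 220.0),
                         ("RTX 5080",    130.0, 285.0)]:
    print(f"{name}: {perf_per_watt(fps, watts):.3f} fps/W")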

 

Lusty Daicon

Ended up buying a ProArt 4070 OC edition for $700 because I'm sick of waiting. When AMD or Intel catches up, I'll get one of theirs.
 