NVidia GeForce RTX 40x0 cards - x2 the power consumption, x2 the performance

spronk

FPS noob
22,624
25,683
The problem appears to be that it burns if it's slightly out of its pin connector, and bending the cable lets it slip out slightly (THAT'S WHAT SHE SAID) without you noticing unless you have a voltmeter attached.

I'm sure they will figure out cable and PCB layouts that don't "make" you bend the cable, stronger connectors that lock the pins in place, etc., but at this point you are basically alpha testing
 
  • 1Like
Reactions: 1 user

Jovec

?
738
284
People with ATX 3.0 PSUs aren't having this problem, right?

Cable quality surely matters, but ATX 3.0 users should still have the same problem.

The size of the cards requires tight-radius bends in the cable at the card connector to fit most cases. The bend means less contact area for the pins on the outer radius, so those pins push the same 8+ amps but have less area to spread the heat they generate. Keep in mind that the 12VHPWR connector uses six 12V pins for up to 600 watts, whereas the old 8-pin PCIe plug used three for 150 watts, so high-end cards had 3x 8-pin, or nine pins, for 12V delivery. IOW, the power and heat are more localized on the new cable (rough per-pin math below).

The spec says to give the cable 35mm of straight length coming off the card before starting any bend. Given the height of the card, this is impractical in most cases. If concerned, one workaround is to leave the side panel off to give the cable that 35mm of space.
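For reference, a back-of-the-envelope comparison of per-pin current under the nominal ratings above (this assumes an even current split across pins and ignores connector losses):

```python
# Rough per-pin current comparison: 12VHPWR vs. legacy 8-pin PCIe.
# Assumes nominal ratings (600 W / 150 W) and an even split across pins.

def amps_per_pin(watts, volts, pins):
    """Total current divided evenly across the 12 V pins."""
    return watts / volts / pins

# 12VHPWR: six 12 V pins rated for up to 600 W total.
new = amps_per_pin(600, 12.0, 6)   # ~8.3 A per pin

# Legacy 8-pin PCIe: three 12 V pins rated for 150 W
# (3x 8-pin = nine pins for a 450 W card, same per-pin load).
old = amps_per_pin(150, 12.0, 3)   # ~4.2 A per pin

print(f"12VHPWR: {new:.1f} A/pin, 8-pin PCIe: {old:.1f} A/pin")
print(f"Per-pin load is {new / old:.1f}x higher on the new connector")
# Heating at a contact scales with I^2, so ~2x the current means
# roughly 4x the heat dissipated in each (physically smaller) pin.
```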
 
Last edited:

Big Phoenix

Pronouns: zie/zhem/zer
<Gold Donor>
44,679
93,389
The problem appears to be that it burns if it's slightly out of its pin connector, and bending the cable lets it slip out slightly (THAT'S WHAT SHE SAID) without you noticing unless you have a voltmeter attached.

I'm sure they will figure out cable and PCB layouts that don't "make" you bend the cable, stronger connectors that lock the pins in place, etc., but at this point you are basically alpha testing
Jay just did a video doing exactly that, trying to replicate these burning cables, and couldn't get it to happen.
 

Brahma

Obi-Bro Kenobi-X
11,987
42,511
Jay just did a video doing exactly that, trying to replicate these burning cables, and couldn't get it to happen.

Is it a quick and easy replication of the issue? Or just throwing numbers out there, when it could take a few days of cumulative use... melting before burning?
 

Jovec

?
738
284
I do not think anyone truly knows yet. Buildzoid suspects that the bend is putting excessive strain on the female receptacle, bending it at the seam and making the connection loose. That, plus the bend pulling the pins on the outer radius out, makes for a loose connection with less contact area. I would be interested in seeing close-ups of the connector on the card along with the melted cable connector.
 

mkopec

<Gold Donor>
25,407
37,497
Why can't they just relocate the fucking plug on these cards to another area? It's just shit design and has been for a long time. Or make a 90-degree connector, or some sort of add-on 90-degree piece that goes between the plug from the PSU and the card? It can't be more than like a $0.25 piece.
 

Hekotat

FoH nuclear response team
12,052
11,528
I do not think anyone truly knows yet. Buildzoid suspects that the bend is putting excessive strain on the female receptacle, bending it at the seam and making the connection loose. That, plus the bend pulling the pins on the outer radius out, makes for a loose connection with less contact area. I would be interested in seeing close-ups of the connector on the card along with the melted cable connector.

Any time an electrical connector/pin is loose, it generates a ton of heat. We routinely check our electrical cabinets for hotspots using an infrared camera to find all the wires that weren't tightened down correctly. More than likely this is the case.

However, this could also be down to pins not being within spec and not mating properly, with the stress on the wire compounding the issue. With the wattage these cards pull, if the card senses it's not getting enough juice, it's probably pulling even more current through the loose connection, making the issue even worse.
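A hedged sketch of why that runs away: contact heating goes as I²R, and because the card is effectively a constant-power load, voltage sag at the connector raises the current, which raises the heat further. The resistance values here are illustrative assumptions, not measured figures:

```python
# Illustrative numbers only: why a loose pin gets hot, and why a
# constant-power load makes it worse. Resistances are assumptions,
# and the even current split across pins is itself a simplification.

P_CARD = 600.0    # W drawn by the card (constant-power load)
V_SUPPLY = 12.0   # V on the PSU side
PINS = 6          # 12 V pins in the 12VHPWR connector

R_GOOD = 0.001    # ohm: assumed resistance of a well-seated contact
R_LOOSE = 0.050   # ohm: assumed resistance of a partially-mated contact

def pin_heat(r_contact):
    i_pin = P_CARD / V_SUPPLY / PINS   # ~8.3 A per pin
    return i_pin ** 2 * r_contact      # I^2 * R dissipated in the contact

print(f"Seated pin: {pin_heat(R_GOOD) * 1000:.0f} mW")  # ~69 mW, harmless
print(f"Loose pin:  {pin_heat(R_LOOSE):.1f} W")         # ~3.5 W in a tiny contact

# Worse: a constant-power load compensates for voltage sag by drawing
# more current (I = P / V), so heat grows faster than I^2 alone suggests.
v_sagged = 11.0
i_sagged = P_CARD / v_sagged / PINS
print(f"Current per pin at {v_sagged} V: {i_sagged:.1f} A vs "
      f"{P_CARD / V_SUPPLY / PINS:.1f} A at 12 V")
```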
 
  • 1Like
Reactions: 1 user

Daidraco

Golden Baronet of the Realm
9,230
9,335
Why can't they just relocate the fucking plug on these cards to another area? It's just shit design and has been for a long time. Or make a 90-degree connector, or some sort of add-on 90-degree piece that goes between the plug from the PSU and the card? It can't be more than like a $0.25 piece.
I've been asking for fucking ages why they haven't put that fucking connector down at the bottom, on the END of the card, facing the front of your PC. Running that cable up to the top of the card looks fucking ugly no matter how well you cable manage. Or even just the top of that end, if they can't have the current travel from that direction, because then it's within relative range. It's not like Nvidia gives a fuck about their board partners' designs, etc.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,423
22,234
The current itself generates a ton of heat, so running the cable directly to where the current is needed, rather than routing that current through the body of the card, is preferable.

Also, given the sheer amount of current we're talking about, one would think that having a custom-length cable and running the shortest possible distance (including removing the PSU shroud, or putting a hole in it, to make that happen) would be a good idea.
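Rough math on why length matters, assuming 16 AWG conductors at roughly 13 mΩ/m (typical for these cables, but an assumption here):

```python
# Rough estimate of resistive loss in the 12VHPWR cable itself.
# Assumes 16 AWG copper (~13.2 mOhm per meter) and a 600 W load.

R_PER_M = 0.0132   # ohm/m, approximate for 16 AWG copper
PINS = 6           # six 12 V conductors (ground returns double the loss)

def cable_heat(watts, volts, length_m):
    i_per_wire = watts / volts / PINS              # ~8.3 A per conductor
    per_wire = i_per_wire ** 2 * R_PER_M * length_m
    # Factor of 2: the ground-return conductors carry the same current.
    return per_wire * PINS * 2

for length in (0.3, 0.6, 1.0):
    print(f"{length:.1f} m cable: ~{cable_heat(600, 12.0, length):.1f} W "
          "dissipated along the wires")
# ~3.3 W at 0.3 m vs ~11 W at 1.0 m: spread along the run it's modest,
# but every extra meter is more copper quietly cooking in the case.
```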
 

Droigan

Trakanon Raider
2,496
1,164
So, regarding the PC I ordered. I already changed it from a 12900 to a 13900K, but I asked them if they knew when ATX 3.0 PSUs would be available, and they didn't have any or know when they would get any. They said that if I wanted one, I'd have to cancel my order and purchase again when it's available.

So, should I do that?
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,423
22,234
So, regarding the PC I ordered. I already changed it from a 12900 to a 13900K, but I asked them if they knew when ATX 3.0 PSUs would be available, and they didn't have any or know when they would get any. They said that if I wanted one, I'd have to cancel my order and purchase again when it's available.

So, should I do that?
I'm not building anything until some company that doesn't suck puts out a good Titanium ATX 3.0 PSU.
 
  • 4Like
Reactions: 3 users

Big Phoenix

Pronouns: zie/zhem/zer
<Gold Donor>
44,679
93,389
Titanium gonna be worth it with these gpus, especially paired with a 13900k.
 
  • 1Like
Reactions: 1 user

Droigan

Trakanon Raider
2,496
1,164
I'm not building anything until some company that doesn't suck puts out a good Titanium ATX 3.0 PSU.

I canceled it right after I wrote the above. Not taking the chance even if the chance is minimal. Hopefully it won't be too long until they're available.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,423
22,234
Titanium gonna be worth it with these gpus, especially paired with a 13900k.
You can cut the power consumption on the 13900k by a fuckload, same with the 4090, if you're willing to give up the last single digits of performance you'll never need in any game or in most applications.

A 13900k choked to 90W is more powerful than nearly anything ever produced, and at 120W it completely dominates.
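For what it's worth, on Linux the CPU cap can be set through the RAPL powercap interface and the GPU cap through nvidia-smi. A minimal sketch, assuming an intel-rapl powercap node is present and you run as root; the 90 W / 300 W targets are just examples, not recommendations:

```python
# Minimal sketch: cap a 13900K via the Linux RAPL powercap interface
# and a 4090 via nvidia-smi. Needs root; paths and values are examples.
import subprocess

RAPL = "/sys/class/powercap/intel-rapl:0"  # package 0 power domain

def set_cpu_power_limit(watts):
    # constraint_0 is the long-term (PL1) limit, expressed in microwatts.
    with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(int(watts * 1_000_000)))

def set_gpu_power_limit(watts):
    # nvidia-smi -pl sets the board power limit (enable persistence
    # mode first with `nvidia-smi -pm 1` so it survives driver unloads).
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

if __name__ == "__main__":
    set_cpu_power_limit(90)    # the "choked to 90 watts" case above
    set_gpu_power_limit(300)   # example 4090 cap; stock limit is ~450 W
```

On Windows the equivalent is setting PL1/PL2 in the BIOS or Intel XTU, plus the power slider in MSI Afterburner for the GPU.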
 

Kirun

Buzzfeed Editor
<Gold Donor>
18,705
34,875
if you're willing to give up the last single digits of performance you'll never need in any game or in most applications.
That begs the question of why you'd need a 13900k at all then, no?
 
  • 1Solidarity
Reactions: 1 user

Mist

Eeyore Enthusiast
<Gold Donor>
30,423
22,234
That begs the question of why you'd need a 13900k at all then, no?
Not really. Even so, the same logic applies to a 13700K or even a 13600K. I've seen people running a 13600K at <30W and still getting ridiculous performance out of it, basically turning their desktop CPU into something that draws less than an H-class mobile workstation CPU.

Further, unlike previous gens, the 4090 is likely to be ~40% more powerful than the 4080; an 8-15% gap between the flagship and the xx80 card is the norm. The 4090 is not that much more money, really, and power-throttled it'll likely still beat the 4080 by >30%.
 

Kirun

Buzzfeed Editor
<Gold Donor>
18,705
34,875
Further, unlike previous gens, the 4090 is likely to be ~40% more powerful than the 4080; an 8-15% gap between the flagship and the xx80 card is the norm. The 4090 is not that much more money, really, and power-throttled it'll likely still beat the 4080 by >30%.
Power-throttled to what though? You're power throttling to the same wattage as a 4080 and beating its performance by 30%? Same VRAM and everything? I'd like to see the math on that one.
 

Fucker

Log Wizard
11,580
26,216
I've been asking for fucking ages why they haven't put that fucking connector down at the bottom, on the END of the card, facing the front of your PC. Running that cable up to the top of the card looks fucking ugly no matter how well you cable manage. Or even just the top of that end, if they can't have the current travel from that direction, because then it's within relative range. It's not like Nvidia gives a fuck about their board partners' designs, etc.
That end of the card is all cooling stuff. They could run an extension down to the end, but $.

An external power pack would be a good idea at this point. $, though.

I'm of the opinion that the whole mess inside needs to be tossed out and redesigned, including all the dumb plugs and piles of wire everywhere.