I think the 50A is a bigger issue than the 600W. Consider that the latest USB-C spec allows up to 240W in a much smaller cable, but does so at 48V, meaning the current is only 5A.
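To put numbers on that: for a fixed cable resistance, conduction loss goes as I²R, so cutting the current by 10x cuts the cable heating by 100x. A minimal sketch (the 10 mΩ round-trip cable resistance is an illustrative assumption, not a spec value):

```python
# Resistive loss in a cable is P_loss = I^2 * R, independent of supply voltage.
R = 0.010  # ohms, assumed round-trip cable resistance (illustrative only)

def conduction_loss(power_w, voltage_v, r_ohms=R):
    current = power_w / voltage_v     # amps drawn at this voltage
    return current ** 2 * r_ohms      # watts dissipated in the cable itself

usb_c = conduction_loss(240, 48)  # 5 A  -> ~0.25 W lost in the cable
hpwr = conduction_loss(600, 12)   # 50 A -> ~25 W lost in the same cable
print(usb_c, hpwr)
```

Same order of magnitude of delivered power, but the 12V cable turns roughly 100x more of it into heat per ohm of resistance.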
To be clear - the typical temperature rating you’ll find on mains power cabling in a house is 60°C.
The highest you’ll typically be able to find is 75°C.
70°C is NUTS for that usage, not ‘oh, that’s ok then’.
It's easier to spec out heat-resistant materials for a 5cm cable than for the entirety of a house's mains power cabling. Sure, 70°C is a lot, but I think the point is that it would work if it had to.
It’s a sign of a lot of resistive heat losses, and at that level it requires specialized insulation and connectors (same for mains). Typical insulation starts to weaken or even melt at that point.
It’s well outside normal expected operating temperatures for wire.
Crazy high resistive heating when driven at multiple times the rated power isn't cause for alarm, though. If you put 50A through your wall outlet, you wouldn't be surprised when it got hot. The only application where pulling 1500W through a 12VHPWR connector is even close to possible is under cryogenic cooling, where conductive cooling keeps the power cable below ambient no matter how many amps are crammed through it.
This data point isn't exhaustive, but it does indicate that the actual safety margin is not as tight as this news cycle assumes.
You and I have very different concepts of ‘safe’. I don’t think it’s saying what you think it’s saying.
That brand-new, custom-specced connectors and wire wouldn’t catch fire (but came close!) at only 2x the rated power draw means it’s very likely that any damage to a connector or wire - especially with normal wire and connectors - would cause damage and a fire at much less than that power draw.
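The scaling behind that worry: at a fixed 12V, heat at every wire and contact goes as I²R, so 2x the power means 4x the heating. Equivalently, a contact whose resistance has crept up a few times over (corrosion, partial mating) reaches that same near-fire heat at normal rated power. A rough sketch, where the 1 mΩ clean-contact resistance is an assumption for illustration:

```python
# Heat dissipated at a single contact: P = I^2 * R_contact.
# The 1 mOhm clean-contact resistance is assumed, not measured.
def contact_heat_w(current_a, r_contact_ohms):
    return current_a ** 2 * r_contact_ohms

clean_rated = contact_heat_w(50, 0.001)     # 2.5 W at a 600 W / 12 V load
clean_double = contact_heat_w(100, 0.001)   # 10 W at the 2x-power stress test
degraded_rated = contact_heat_w(50, 0.004)  # 10 W: 4x-resistance contact, rated load
print(clean_rated, clean_double, degraded_rated)
```

In other words, a connector that merely got close to failing at 2x power has the same thermal margin as a mildly degraded one at 1x.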
Which is what we’ve been getting reports on.
And that is just from minor physical damage, it seems,
ignoring corrosion or fraying wires over time, which is usually the bigger problem.
That is not an issue with the connector though. That is the adapter. That is my entire point. The connector is safe. Shoddy adapters are the problem. Putting multiple supplies in parallel is always very shady, doubly so when it's high current through a small piece of metal. That's what's dangerous, not the 12VHPWR connector.
That has yet to be shown in the field. We know of one incredibly dodgy adapter, but the failure modes I’m calling out take time to show up.
The truth is the plug is more demanding and expensive than lower density options and it's dangerous when corners are cut. This is always true in power electronics, but the risk surface area is higher with a new and demanding connector.
I'm not quite sure what your point was in the first sentence - that is always better, yes. We don't have access to analysis done by the EEs in charge of making the plug decisions, but we're already seeing high profile failures, which is unusual.
Even with connectors and cables without those early problems, there are often longer term problems that show up over time - cables fraying at the connectors due to movement, plugs and connectors building up corrosion (and hence having higher resistance), connections loosening or getting bent, etc.
The link you posted, and your later statement, seem to support the same view. I'm just repeating it for the folks dismissing this as just an issue with the badly designed adapters Nvidia distributed (in some cases?). Overall, it's a connector and spec that's at the edge of the performance envelope, with some 'obvious' field failure modes that don't seem to be properly addressed.
Honestly, I'm a bit shocked they went with a 'mountains of parallel power feeds' solution instead of... I don't know, 6-gauge stranded wire, which seems like the obvious choice to me? Parallel power feeds like this are always a source of endless fussing and headaches due to exactly the problems we're seeing. Trying to save a couple cents by using more of the same materials they already have on hand, I'm guessing?
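The current-sharing problem with parallel feeds can be shown with a simple current divider: each wire carries a share inversely proportional to its resistance, so one bad crimp or unseated pin silently shifts its load (and I²R heat) onto the rest. Wire count and per-wire resistances below are illustrative assumptions, not 12VHPWR measurements:

```python
# Current divider among parallel wires feeding one 12 V rail.
# Per-wire resistances are assumed for illustration, not measured values.
def per_wire_currents(total_a, resistances):
    conductances = [1 / r for r in resistances]
    g_total = sum(conductances)
    return [total_a * g / g_total for g in conductances]

good = [0.010] * 6               # six matched wires, assumed 10 mOhm each
one_open = [0.010] * 5 + [1e9]   # one pin effectively disconnected

print(per_wire_currents(50, good))      # ~8.3 A in each wire
print(per_wire_currents(50, one_open))  # ~10 A in each remaining wire
```

Six matched wires split 50A at about 8.3A each; lose one and the remaining five jump to 10A each - roughly 44% more I²R heating per wire - with nothing in the connector to notice. A single heavy conductor can't misbalance against itself.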