
One problem with low-voltage DC is that, for the same power (P = V × I), a lower voltage means a higher current, and resistive heating in the wire is proportional to the current squared (I²R). So you can't send a lot of power at low voltage without losing a lot of it to heat. That means low-voltage runs have to be kept fairly short, or very low power, limiting their usefulness.
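A rough sketch of the I²R point, comparing wire losses for the same load at different supply voltages (the 0.2 Ω round-trip wire resistance and the 60 W load are assumed numbers, not from the comment):

```python
# Joule heating: same power delivered, same wire, different voltages.
wire_resistance = 0.2  # ohms, assumed round-trip resistance of a short run
power = 60.0           # watts delivered to the load, assumed

for volts in (5, 20, 230, 380):
    current = power / volts                # I = P / V
    loss = current ** 2 * wire_resistance  # heat in the wire, I^2 * R
    print(f"{volts:>4} V: {current:6.2f} A, {loss:7.2f} W lost in the wire")
```

At 5 V the wire dissipates about 29 W to deliver 60 W; at 230 V the loss for the same wire is around a hundredth of a watt.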


Funnily enough, USB's inability to work over longer cable lengths works in its favour here. Generally we don't rely on anything working over 3 metres.

Wikipedia says "the maximum power supported is up to 60 W at 20 V, 36 W at 12 V and 10 W at 5 V" [1]. For a typical 3 metre 20 gauge USB cable 10 W power delivery will cause a voltage drop from 5 to 4.6 V. This is within the +0.25/-0.55 specified for USB 3 [2].
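A back-of-envelope check of that drop figure (the ~33.3 mΩ/m resistance for 20 AWG copper is a typical handbook value I'm assuming, not taken from the comment):

```python
# Voltage drop on a 3 m, 20 AWG USB cable carrying 10 W at 5 V.
ohms_per_m = 0.0333          # 20 AWG copper, approximate handbook value
length_m = 3.0               # cable length, one way
r_round_trip = ohms_per_m * length_m * 2  # current flows out and back
current = 10.0 / 5.0         # 10 W at 5 V -> 2 A
drop = current * r_round_trip
print(f"voltage at the device: {5.0 - drop:.2f} V")
```

That lands at roughly 4.6 V, matching the comment's figure and staying inside the USB 3 tolerance cited.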

Since nobody can rely on the 5 V from USB being exactly the charging voltage they need, there will probably be switch-mode regulators in-line anyway. Device manufacturers will just have to spec these up a bit to accept a higher input voltage if they want more than 10 W.

[1] http://en.wikipedia.org/wiki/USB_Power_Delivery_Specificatio... [2] http://en.wikipedia.org/wiki/USB


Yeah, it's fine for USB cables. The difficult bit would be if you're planning on cabling an entire office building from one low voltage DC source, like the solar panels the article mentions.

60 watts at 20 V is 3 A per port, and if you want to supply a hundred ports with that you're looking at 300 A on the trunk, which means copper cables the size of your finger.
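A quick sketch of the finger-sized-cable claim; the ~3 A/mm² current density is a conservative rule of thumb I'm assuming, not something from the comment:

```python
import math

# Trunk cable sizing for 100 ports at 60 W / 20 V each.
ports = 100
amps_per_port = 60 / 20              # 60 W at 20 V -> 3 A per port
total_amps = ports * amps_per_port   # 300 A on the shared trunk

amps_per_mm2 = 3.0                   # assumed safe current density for copper
area_mm2 = total_amps / amps_per_mm2
diameter_mm = 2 * math.sqrt(area_mm2 / math.pi)
print(f"{total_amps:.0f} A trunk -> ~{diameter_mm:.0f} mm diameter copper")
```

That works out to a conductor on the order of 11 mm across, i.e. roughly finger-sized, before you even consider the return conductor.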


A standard is currently being developed for DC distribution in datacenters and commercial buildings, at 380 volts.

One of the proposals for residential DC is to supply 380 volts and 24 volts. The higher voltage would be for things like space heaters and hairdryers, while the lower voltage would be for electronics.



