It's going to be interesting, from a security standpoint. One of the original cracks on the PS3 was via the USB stack. I'd not go plugging my computers or phones into jacks that I don't necessarily control.
I can see a market for buffering devices that allow power through, and perhaps do power negotiation for you, but that do not allow data traffic. I believe these devices already exist, though I don't know how sophisticated they are.
(I'm not an engineer.) I read a similar article; IIRC Android phones could be owned by plugging them into a hostile USB connection. I wonder, though, if you couldn't produce a simple adapter that just drops some of the pins? Maybe I don't know enough about USB.
Speaking as someone who writes embedded software using USB stacks for a living, there are a number of solutions to this. Your solution is possible, and is how many iPhone USB charging cables are designed. If limited communication is required, the device being charged can also contain a separate processor for the interface. The ideal solution is for the device to contain some security measures, to prevent these issues; this solution is also the least versatile or most costly depending on implementation, and may not be achievable.
Most hosts will provide at least 500mA without any signal from a device. Hosts and chargers vary in their current capability; those capable of providing more than standard USB spec vary in their implementation. In general, high current hosts always provide their full capability, and the devices are the ones which require some signal to indicate maximum current available. Other devices increase their current draw until the host's voltage drops, and hold at that level (or less).
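The "increase draw until the host's voltage drops" strategy described above can be sketched like this. Everything here is hypothetical and illustrative: `read_bus_voltage` stands in for a real measurement, and the step sizes, limits, and droop threshold are made-up numbers (real implementations live in charger ICs).

```python
# Sketch of the "ramp until droop" strategy: keep stepping up current draw
# until the measured bus voltage sags, then hold at the last good level.
# read_bus_voltage(ma) is a hypothetical function returning the bus voltage
# observed while drawing `ma` milliamps.

def negotiate_current(read_bus_voltage, step_ma=100, max_ma=2400, droop_v=4.75):
    draw_ma = 100  # start at a safe baseline draw
    while draw_ma + step_ma <= max_ma:
        if read_bus_voltage(draw_ma + step_ma) < droop_v:
            break  # host voltage sagged: stop ramping
        draw_ma += step_ma
    return draw_ma

# Toy model of a host that holds 5 V up to 1.5 A, then sags:
host = lambda ma: 5.0 if ma <= 1500 else 4.5
print(negotiate_current(host))  # settles at 1500 mA
```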
Even worse than USB is Firewire, which, if I am not mistaken, gives direct memory access to whatever device is connected.
However, manufacturers are already realising that blindly trusting every USB host is a bad idea. For example, iPhones now ask whether you want to trust a previously unknown computer when you connect via USB. If you choose not to, your iPhone will take power, but won't allow data access. Therefore it's no longer possible to create a device that looks like a charger but that actually downloads all the photos from your phone.
> Even worse than USB is Firewire, which, if I am not mistaken, gives direct memory access to whatever device is connected.
Both FireWire and Thunderbolt offer DMA. There are publicly available tools that let you extract passwords, remove lock screens, and find keys in the memory of running devices using either type of port.
Plus PCMCIA and ExpressCard, so most laptops are vulnerable out of the box[1], but there are some ways to mitigate those attacks. See https://en.wikipedia.org/wiki/DMA_attack
[1]...if you plug untrusted devices into your expansion slots or leave your laptop unattended.
My favorite FireWire hack was when a guy took a friend's laptop, which did not even have FireWire. He plugged in a FW card at the boot prompt. The computer auto-installed the driver, and the attack proceeded...
Yes, FireWire is bad in this regard, because it normally allows devices DMA into other devices' memory space. This is also what makes/made it so popular for high-bandwidth/low-latency applications.
But keep in mind that this is something that a host actually has to turn on for every single device on the bus[1], but probably most drivers are lazy and just unconditionally enable this globally.
Surely this could be a software fix: tell the USB driver to use a special mode that only uses the data lines to negotiate power levels before you plug into an untrusted power source.
Now you have to trust the driver. And depend on there not being "test modes" in the hardware that can be turned on (USB 3.0 is a pretty complex beastie; frankly, if I were a spook agency I'd want to have doors in via seldom-examined ports like USB).
That special mode code and negotiation code is what malware authors will target.
For hostile / high security environments, I'd rather have it in a special charging-only hub (good luck pwning such a device) that physically lacks the data wires on the output socket.
This is interesting as a contribution to the solar net metering debate. If you have a local low-voltage DC network, perhaps backed up by a UPS, you can charge the UPS battery using solar, and that means that solar electricity generated on-site automatically displaces grid electricity at the retail rate. That means that solar would only need to compete with the retail price of conventional electricity, and not the generation cost. And solar cost is already at or near parity with retail price in many parts of the world:
One problem with low-voltage DC is that to deliver the same power (P = IV) at a lower voltage, the current has to be higher. And resistive heating in the cable is proportional to the current squared (P_loss = I²R). So you can't send a lot of power at low voltage without losing a lot of it to heat. That means low-voltage runs have to be kept fairly short, or very low power, limiting their usefulness.
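A quick back-of-the-envelope illustration of the point above. The cable resistance here (10 m round trip at 0.05 ohm/m) is an arbitrary assumed value, chosen only to show how losses scale with voltage:

```python
# Resistive loss for the same delivered power at different voltages,
# over a hypothetical cable with 0.5 ohm total round-trip resistance.

def cable_loss_watts(power_w, voltage_v, resistance_ohm):
    """Heat dissipated in the cable: P_loss = I^2 * R, with I = P / V."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

R = 0.5  # assumed round-trip cable resistance, ohms
for volts in (5, 12, 48, 240):
    loss = cable_loss_watts(100, volts, R)
    print(f"{volts:>4} V: {loss:7.2f} W lost delivering 100 W")
```

At 5 V the cable would dissipate more heat than the load receives, while at mains-like voltages the loss is negligible, which is exactly why long runs use high voltage.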
Funnily enough USB's inability to work over longer cable lengths works in its favour here. Generally we don't rely on anything working over 3 metres.
Wikipedia says "the maximum power supported is up to 60 W at 20 V, 36 W at 12 V and 10 W at 5 V" [1]. For a typical 3 metre 20 gauge USB cable, 10 W power delivery will cause a voltage drop from 5 to 4.6 V. This is within the +0.25/-0.55 V tolerance specified for USB 3 [2].
Since nobody can rely on the 5 V from USB to be exactly the charging voltage they need, there will probably be switchmode regulators in-line anyway. Device manufacturers will just have to spec these up a bit to accept higher input voltage if they want more than 10 W.
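The 10 W / 4.6 V figure above checks out with a quick calculation. The 20 AWG resistance used here is a handbook approximation (roughly 0.033 ohm/m for copper):

```python
# Sanity check: 10 W at 5 V over a 3 m, 20 AWG cable.
OHM_PER_M_20AWG = 0.033  # approximate handbook value for copper

length_m = 3.0
round_trip_r = 2 * length_m * OHM_PER_M_20AWG  # current flows out and back

current_a = 10.0 / 5.0             # I = P / V = 2 A
drop_v = current_a * round_trip_r  # V = I * R
print(f"Voltage at device: {5.0 - drop_v:.2f} V")  # ~4.6 V
```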
Yeah, it's fine for USB cables. The difficult bit would be if you're planning on cabling an entire office building from one low voltage DC source, like the solar panels the article mentions.
60 watts at 20 V is 3 A, and if you want to supply a hundred ports with that, you're going to need copper cables the size of your finger.
A standard is currently being developed for DC distribution in datacenters and commercial buildings, at 380 volts.
One of the proposals for residential DC is to supply 380 volts and 24 volts. The higher voltage would be for things like space heaters and hairdryers, while the lower voltage would be for electronics.
I find it notable that this article entirely ignores the EU's common external power supply rules (http://en.wikipedia.org/wiki/Common_External_Power_Supply), which effectively mandated that all smartphones sold in the EU use micro USB for power. This had a swift and noticeable effect on the diversity of connectors in phones (basically Apple is the only maker that doesn't use micro USB, and this change happened at the exact time of the rules). The emergence of USB as The Way Phones Are Charged didn't happen as a magic emergent property, but via considered government regulation. Government: it can actually work.
This is great. Five years ago I carried on trips a phone with a proprietary USB cable, a GPS with mini-USB, a pocket camera with a like-sized charger (and 2m lead, which I refactored), and a AA battery charger for the rest of the stuff.
These days, bike lights charge from mini-USB (or even have integrated USB A plugs), phones are semi-required to use micro-USB, computer mice have USB charging and data, and I recently learned that even pocket UV water treatment devices use micro-USB and integrated Li-ion instead of AAs now. I think the only use I have for AA batteries anymore is a camera flash. One of the requirements when I bought a pocket camera recently was that it charge via USB (Sony gets this; Olympus sort of does but their cables are "special"; Canon has one or two models).
USB power can also do wonders for places where mains power is provided only part of the day. Those USB "power banks" are already taking off, and the more things we can use them with, the better.
We've basically killed off the C and D-size battery, the 6V lantern, and a bunch of other unnecessary form factors. Now let's finish the job--we can make AAAs as obscure as AAAAs if we get USB charging remote controls, and there's no reason we can't make smoke detectors last a full year if we get rid of their antiquated and inefficient 9V packs.
P.S.: America, think about migrating to 220-240VAC outlets someday. The fewer standards, the better. You can make USB wall outlets standard at the same time!
It's a 37.7 MB zip, with the USB PD specification (328 pages) packaged within.
In order to ensure shit doesn't melt, they'll have detectable cables for >5V and >1.5A operation. I'm reading the spec to find out how they plan to do cable detection. IC based, or electrical connections, or maybe something else?
Network engineers in campus and enterprise environments have been building a DC network overlay for years in the form of Power over Ethernet. All of those VOIP phones, access points, and security cameras all need DC power with UPS backup and the network closet has become where that power is provided.
On our campus it's reaching the point where every switch we'll be buying will soon be PoE. I imagine many places are far ahead of us on this.
100 W at 5 V means 20 A. We usually recommend 4 A/mm² max for copper conductor cross-sections, so it would require two 5 mm² wires in the cable here. It would feel more like a rod than a cable IMO. Anyone got an idea how they plan to address this? Higher voltages?
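To make the tradeoff concrete, here is the same 100 W target at a few voltages under that 4 A/mm² rule of thumb (the rule itself is just the conservative guideline quoted above, not anything from the USB spec):

```python
# Conductor cross-section needed under a 4 A/mm^2 current-density guideline.

def required_area_mm2(power_w, voltage_v, a_per_mm2=4.0):
    current_a = power_w / voltage_v
    return current_a / a_per_mm2

for volts in (5, 12, 20):
    area = required_area_mm2(100, volts)
    print(f"100 W at {volts:>2} V -> {100/volts:5.1f} A -> {area:.2f} mm^2 per conductor")
```

At 20 V the required cross-section drops to a quarter of the 5 V case, which is presumably why the spec moves to higher voltages for the high-power profiles.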
Well they're pretty limited in what they can do. Most things you'd plug a USB cable into are not thing you'd want to plug higher voltages into, so that would require device-side voltage conversion which is wasteful in energy, space and complexity.
I suspect that the 100W figure is actually just a pulsed maximum specification. The thing is that current ratings for wires are actually specified as a max. continuous current for a given temperature rise in the conductor, per unit length. So blowing 3-4x the current through a conductor for a very short time (think of flashing a bulb or moving a servo) is not a big deal. You just get a transient heat rise. Also, most applications simply don't require 20A. Even microwaves and kettles stay below the 15A residential fuses. (Although they do come close. I once had a shitty basement suite with an underrated fuse. If I ran my toaster and my kettle at the same time the breaker would trip!)
Almost all your USB-powered devices have voltage converters with varying inefficiencies.
It should also be noted that a switched mode voltage converter can have well over 90% efficiency, even with large changes in voltage.
You should also remember that those microwaves and kettles are getting up to 15A @ 120VRMS continuously, which works out to 1800W. You can verify the actual power output of a kettle by timing how long it takes to boil a liter of water, and calculate power from this time and the specific heat capacity of water.
There are some practical limits to voltage conversion if you want to keep that high efficiency. Probably most important is that your switching frequency shouldn't be as high as it is in most small devices (because high freq. allows you to use smaller components).
In any case, as dfox mentions, most internal voltage level conversions won't be switched, because that adds complexity. They will be some form of linear regulation, such that they can move between logic levels. That's different from moving from whatever high voltage is on your 100W USB line down to a level that won't fry CMOS circuitry. There's a reason that the wall-->DC conversion usually happens in a brick on your power cable. Switched mode will be used as sparingly as possible: when you also need AC rectified, when you need both buck and boost depending on a battery or something, or when you need to be able to modify the control loop dynamically.
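The boil-time power check suggested above works out like this (the specific heat of water is the standard handbook value; the 3-minute boil time is just an example input):

```python
# Estimate kettle power from how long it takes to heat 1 L of water
# from 20 C to 100 C. Heat losses are ignored, so this is a lower bound.

SPECIFIC_HEAT_WATER = 4186  # J/(kg*K); 1 L of water is ~1 kg

def kettle_power_watts(litres, delta_t_k, seconds):
    energy_j = litres * SPECIFIC_HEAT_WATER * delta_t_k
    return energy_j / seconds

# e.g. 1 L heated by 80 K in 3 minutes:
print(f"{kettle_power_watts(1.0, 80, 180):.0f} W")  # ~1860 W
```

A result around 1.8 kW is consistent with the 15 A @ 120 V figure mentioned above.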
> those microwaves and kettles are getting up to 15A @ 120VRMS continuously, which works out to 1800W
Well that's kind of my point. Even at 1.8kW those devices don't need to draw 20A continuous (or even pulsed, because of the fuse). Basically no matter what you're doing, the copper losses are roughly fixed by the hardware. What you can control are heat dissipation and current levels, and it's a lot more fun to play with Ohm's law than try to fight against thermodynamics.
USB delivers 5V power and has 3.3V signaling levels, so any reasonable bus-powered device has to contain some kind of voltage conversion. The most common solution is an LDO; the voltage limits in the USB specification even seem to imply that solution. Designing a switched-mode power supply that meets all the USB 2.0 requirements (don't know about 3.0, but it's probably mostly similar) on minimum input voltage, current draw in various states, and so on is not exactly trivial.
Who usually recommends 4 A/mm²? Using the wire parameter calculator below, a 2 mm diameter wire is capable of carrying 20 A, with a 0.1 V drop over a 1 m (3 ft) cable. A 2 mm diameter stranded copper wire is not entirely unwieldy.
The reason those cables get hot is the same reason that you got them so (relatively) cheap: metal is a significant fraction of the cost of a cable. The manufacturer who produced the cable knew exactly what their customer was looking for, and provided a low cost cable with thin conductors.
Yeah, higher voltages. Can't remember off-hand if this tops out at 12V or 20V at the higher charge wattages, but they're definitely not using 5V for them.
In 2006 China required all mobile telephone manufacturers to standardise on USB connections for charging and data transfer. South Korea did so a year earlier, requiring 'standardized charging' without explicitly stating USB. [1]
I'm curious whether, or how much, this requirement contributed to charger standardisation. The Chinese market, combined with economies of scale for common production models, could have outweighed any benefits a market for proprietary chargers might have offered.
Anyone know how this standard actually works? I think right now Android phones tend to short the data lines and Apple uses some system of voltages to communicate that it's high power, which means that the other device usually gets stuck pulling 0.2 amps. How does this new standard tell the device it can supply 100 watts? And does this mean that iOS and Android will be stuck on 0.2 amps? Can you plug a legacy device on at all?
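For reference, the legacy (pre-Power-Delivery) detection schemes described above can be sketched roughly like this. The D+/D- short for dedicated chargers comes from the USB Battery Charging spec; the Apple divider voltages are community reverse-engineered values, not from any official document, so treat the whole thing as illustrative:

```python
# Rough sketch of legacy USB charger detection, as seen from the device side.
# dp_volts / dm_volts are the voltages observed on the D+ / D- lines.

def classify_charger(dp_volts, dm_volts, dp_dm_shorted):
    if dp_dm_shorted:
        # BC 1.2 dedicated charging port: D+ shorted to D-.
        return "BC1.2 dedicated charger (device may draw up to ~1.5 A)"
    # Reverse-engineered Apple voltage-divider signatures (unofficial values):
    apple_table = {
        (2.0, 2.0): "Apple 0.5 A charger",
        (2.0, 2.7): "Apple 1 A charger",
        (2.7, 2.0): "Apple 2.1 A charger",
        (2.7, 2.7): "Apple 2.4 A charger",
    }
    key = (round(dp_volts, 1), round(dm_volts, 1))
    if key in apple_table:
        return apple_table[key]
    return "Standard host: enumerate, else fall back to ~0.1-0.5 A"

print(classify_charger(2.7, 2.0, False))  # "Apple 2.1 A charger"
```

Devices that don't recognise any of these signatures are the ones that end up stuck at the conservative fallback draw, which matches the 0.2 A behaviour described above.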
USB has both power and data lines. This new standard can use the data lines to negotiate power delivery, I believe. I'd expect it will still be backwards-compatible, of course. I would think the only allowable voltage would be 5V, as it is currently.
With the new USB Power Delivery spec this can finally make sense to really start thinking about. Being able to power things that are "non-trivial" as far as power goes would make this go a long way. Imagine your sound system being powered off of a very clean DC power source, would be an audiophile's dream.
20V and 100W would make for a lot of nice options for powering things.
> 20V and 100W would make for a lot of nice options for powering things.
Requires a pair of 17AWG stranded wires for a 2m run, allowing 3% cable loss.
For comparison, cat6 is 23 or 24AWG stranded, and US residential power wiring is typically 12 or 14AWG solid core. Smaller numbers are bigger, less flexible wires.
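The 17 AWG claim above can be checked with Ohm's law. The per-metre resistance is an approximate handbook value for copper, which varies a little with stranding:

```python
# Check: 100 W at 20 V (5 A) over a 2 m cable of 17 AWG copper.
OHM_PER_M_17AWG = 0.0166  # approximate handbook value

length_m = 2.0
round_trip_r = 2 * length_m * OHM_PER_M_17AWG  # out and back

current_a = 100.0 / 20.0          # 5 A
loss_w = current_a ** 2 * round_trip_r
print(f"Cable loss: {loss_w:.2f} W ({loss_w:.1f}% of 100 W)")  # under the 3% budget
```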
I guess it won't work with microUSB contacts, or at least not at full power.
So I think RJ45 is a better standard to rely on, since it has the option to carry network data and intelligently deliver power to charge USB-powered devices.
you can already get RJ45 hubs on the cheap too, ones with 8 ports and such.
Disregarding the flamboyant visions of an electrical revolution, just having a flippable physical interface (and hopefully not in three different sizes, two of which always get mixed up) would be a huge enough leap forward for everyday life.
They forgot to mention the most important thing about the system from Moixa: It's variable voltage and devices generate the needed stable voltages themselves.