It was slightly less than a year ago that I vented some doubts about USB chargers, and I have a few more now. As I said last week, I changed the ROM on my Milestone and, thanks to Robert, I also re-calibrated the battery; the phone now lasts over a day on a single charge (terrific!).
For the calibration, it was suggested to use Battery Monitor to check the status of the battery during the process. The widget itself is quite nice, actually, and has one handy feature that estimates the current flow in the device: negative while discharging, positive while charging. This feature is what made me even more doubtful about the general usefulness of USB chargers.
I mostly use two USB chargers for my phone: the original one from Motorola, rated at 800mA, and the one I got for my iPod when I bought it a few years back, rated at 1000mA (1A). When I use the Motorola one, the widget shows just shy of 500mA of positive flow… when I do the same on the iPod charger, it shows around 200–300mA. Given the iPod one should be able to deliver more current than the Motorola one, something’s clearly wrong.
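To put those readings in perspective, here is a back-of-the-envelope sketch of what the difference means for charging time. The battery capacity is an assumption on my part (the Milestone’s battery is rated at roughly 1400mAh), and real charging tapers off near full, so treat these figures as idealized lower bounds:

```python
# Rough charge-time estimate from the widget's current readings.
# The ~1400 mAh capacity is an assumption; charging also tapers
# near full, so real times will be longer than these ideal figures.

def charge_hours(capacity_mah, current_ma):
    """Idealized hours to charge from empty at a constant current."""
    return capacity_mah / current_ma

motorola = charge_hours(1400, 500)  # reading on the Motorola charger
ipod = charge_hours(1400, 250)      # reading on the iPod charger
print(f"Motorola: {motorola:.1f} h, iPod: {ipod:.1f} h")
# → Motorola: 2.8 h, iPod: 5.6 h
```

In other words, the nominally more powerful iPod charger would take roughly twice as long to fill the same battery.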
I remember reading a technical article a few months ago about how Apple enforces its “Made for iPhone” brand on chargers: the device limits the amount of current it will draw from a charger depending on specific resistance values across the data lines of the USB port, so that a number of chargers don’t even reach the power of a standard USB port (500mA) when used with an iPhone. Now I’m wondering whether the problem here is that Motorola did the same, or whether it’s the iPod charger that also tries to “validate” the presence of an iPod on the USB connection. Either way, the situation sucks.
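As a sketch of how that scheme reportedly works: the charger holds the D+ and D− lines at fixed voltages via resistor dividers, and the device picks its maximum draw from those voltages. The specific voltage/current pairs below are figures commonly reported by hardware hackers who reverse-engineered Apple chargers, not from any official document, so treat them as purely illustrative:

```python
# Hypothetical sketch of Apple-style charger identification.
# The (D+, D-) voltage signatures are commonly reported
# reverse-engineered values, not an official specification.

APPLE_SIGNATURES = {
    (2.0, 2.0): 500,   # mA: "500 mA" charger signature
    (2.8, 2.0): 1000,  # mA: "1 A" charger signature
    (2.0, 2.8): 2000,  # mA: "2 A" (iPad-class) signature
}

def allowed_draw_ma(d_plus, d_minus, tolerance=0.2):
    """Current the device would allow itself, or 0 when the charger
    presents no recognized signature -- the failure mode described
    above, where the device falls back to a low or zero draw."""
    for (dp, dm), ma in APPLE_SIGNATURES.items():
        if abs(d_plus - dp) <= tolerance and abs(d_minus - dm) <= tolerance:
            return ma
    return 0

print(allowed_draw_ma(2.8, 2.0))  # a recognized 1 A charger
print(allowed_draw_ma(0.0, 0.0))  # unrecognized: reduced draw
```

If the Motorola phone runs a similar (but incompatible) detection scheme, that would neatly explain why it refuses to draw full current from the iPod charger.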
It is funny to think that there are so many specifications nowadays that call for a universal charging solution – just look at this Wikipedia article – and yet nothing seems to stop manufacturers from imposing artificial limitations with the sole purpose of selling you their own charger!
Of course, simply relying on two chargers, and even more importantly on the estimates reported by a software application, is no way to draw proper conclusions. The proper course of action, which I wish I already had the time to pursue, would be to add an ammeter to the chain, discharge the phone, and then look at what’s really going on in terms of current flow during the charging process. My original intention was to add the ammeter after the charger and before the adapter, using male and female USB Type A ports, but nowadays I’m doubtful. Since the European common EPS requirements don’t mandate a USB Type A charger, but simply a microUSB connector, it seems Samsung took the opportunity to provide its users with an old-fashioned charger, where the cable is captive and ends in a microUSB connector.
Given both Samsung and Motorola use Android these days, it wouldn’t be a fair comparison unless the two chargers were cross-tested with the other manufacturer’s phone, but that also requires the ammeter to be added to the microUSB chain – an option that would rule out testing the charging of iPhone and iPod devices, since they use the dock connector and not microUSB.
Any suggestion on how to realise the hardware needed is very welcome, as I’ve already demonstrated I’m not that good an electronics person.
http://www.fluke.com/fluke/… a meter that uses the Hall effect for non-intrusive measurements. They aren’t cheap, but something like this should be available as one option.
The USB 2.0 specification limited the power that can be delivered over a USB cable to 500mA, and 100mA without negotiation. Various manufacturers (Apple included) created their own extensions to USB to permit charging at 500mA or higher from dumb chargers (i.e. those which do not present a USB host to negotiate with). Some devices just ignored the USB specifications and drew 500mA if the host didn’t negotiate (and sometimes even if it did, causing the port to shut down).

Modern USB chargers should all conform to the USB Battery Charging Specification, which permits a charging port to supply anywhere from 0.5A up to 5A at 5V (though a device may draw no more than 1.5A). This is also implemented by many modern hosts; for example, recent Macs implement 1A host ports by using the charging specification. My understanding, though I am having trouble finding information elaborating on this, is that the maximum power draw is reduced while communications are in progress, presumably for noise reasons.

My understanding is that modern iDevices follow the USB Battery Charging Specification, just like everyone else does. They additionally support legacy chargers, however.
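The current limits described above can be summarized as a small sketch; this is my reading of the port types defined by USB 2.0 and the Battery Charging spec, simplified (it ignores suspend states and the high end of what a charging port may source):

```python
# Simplified summary of per-port-type device draw limits under
# USB 2.0 and the USB Battery Charging Specification, as I
# understand them; suspend states are ignored for brevity.

def max_draw_ma(port_type, enumerated=False):
    """Maximum current (mA) a portable device may draw."""
    if port_type == "SDP":   # Standard Downstream Port (plain USB 2.0)
        return 500 if enumerated else 100
    if port_type == "CDP":   # Charging Downstream Port: data + charging
        return 1500
    if port_type == "DCP":   # Dedicated Charging Port: dumb charger
        return 1500
    raise ValueError(f"unknown port type: {port_type}")

print(max_draw_ma("SDP"))                   # before enumeration
print(max_draw_ma("SDP", enumerated=True))  # after enumeration
print(max_draw_ma("DCP"))                   # dedicated charger
```

A device typically detects a DCP by noticing that D+ and D− are shorted together inside the charger, which is how a dumb charger advertises itself without any negotiation.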
From what I gathered, iPhones and iPods still don’t follow those specs — and most definitely my iPod (being the “first generation” Classic) doesn’t.