Posted by _MT_
- June 02, 2020, 15:36:51
Calling electrodes just connectors is misleading. They participate in the chemical reactions. That's why specific materials are used and not just any old conductors. However, your description of what happens during charging and discharging clearly illustrates you have no clue.
And the same is true for your description of why charging slows down. What I'm going to write applies mainly to Li-ion chemistries as they belong on the fussier end of the spectrum. Cells are typically charged in two stages: first constant current and then constant voltage. When a cell is discharged, it has a relatively low voltage. If you were to apply its maximum charging voltage like you can with a lead-acid cell, the result would be a huge inrush of charge and probably destruction and fire. This is because Li-ion cells have low internal resistance (they allow very large currents). So, you need to initially limit the current. The bigger the current you allow, the faster the charging, but also the more stressful and damaging. As a cell charges, its voltage rises. The bigger the current, the faster the voltage rises. Sooner or later, you're going to hit the maximum voltage limit. That's the point where charging starts slowing down. You now have to limit the voltage, and the current is going to die down as charging continues. You're essentially forcing electrons into a cell against its internal pressure. The higher the state of charge, the higher the internal pressure. That means the difference between charging pressure and internal pressure decreases in the second stage, resulting in the slowdown of flow. The hydraulic analogy to electricity isn't perfect, but it's easy to grasp.
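If you want to see the two stages play out, here's a minimal toy simulation. It is a sketch only, not how a real charge controller works: the linear open-circuit voltage curve, the internal resistance and every number in it are assumptions picked purely for illustration.

```python
# Toy CC/CV charge simulation. Cell model (assumed): terminal voltage =
# open-circuit voltage (linear in state of charge) + current * R_internal.

CAPACITY_AH = 4.0            # 4 Ah cell, as in the phone example below
V_EMPTY, V_FULL = 3.0, 4.2   # open-circuit voltage range (typical Li-ion)
R_INT = 0.05                 # internal resistance in ohms (assumed)
V_MAX = 4.2                  # maximum charging voltage
I_CC = 2.0                   # stage 1 constant current: 2 A (0.5C)
I_CUTOFF = 0.1               # stop once the stage 2 current dies down to this

def ocv(soc):
    """Open-circuit voltage as a linear function of state of charge (0..1)."""
    return V_EMPTY + (V_FULL - V_EMPTY) * soc

soc, t, dt = 0.05, 0.0, 1.0  # start at 5 %, step one second at a time
while True:
    # Stage 1: constant current, as long as the voltage limit isn't hit.
    i = I_CC
    if ocv(soc) + i * R_INT > V_MAX:
        # Stage 2: constant voltage. The current is whatever the remaining
        # "pressure difference" allows, and it dies down as soc rises.
        i = (V_MAX - ocv(soc)) / R_INT
    if i <= I_CUTOFF:
        break
    soc += (i * dt / 3600.0) / CAPACITY_AH   # integrate charge into the cell
    t += dt

print(f"charged to {soc:.1%} in {t / 60:.0f} minutes")
```

Run it and you can watch the shape everyone knows from phone charging screens: fast, linear progress at first, then a long tail as the current dies down against the voltage limit.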
No, Li-ion cells are generally happiest somewhere between 45 and 50 %. That's where they should be for storage: low self-discharge, slowest aging. The further you go from there, the more they suffer. It's not a linear relationship; it's one of those bathtub curves. 40-60 is great. 30-70 isn't much worse. 20-80 is roughly where you should aim to stay in normal operation. Still good, but you're roughly at the knee of the curve. 10-90, if you really need that extra capacity and can't get a bigger cell. Anything beyond that really should be exceptional use. These are just rule-of-thumb guidelines. One problem is that manufacturers decide what those numbers mean. And while it's not desirable to fully utilize the capacity, manufacturers assume full use. The cell should last as long as the manufacturer declares. It's just that you can significantly prolong its lifespan, preserving capacity and potentially reducing waste and expense.
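Those bands are easy to encode if you want a reminder on your own device. A small sketch with the thresholds taken straight from the rules of thumb above; remember these are guidelines, not a spec, and the percentages mean whatever the manufacturer decided they mean.

```python
def soc_zone(soc_percent):
    """Rough wear zone for a state of charge in percent (rule of thumb only)."""
    distance = abs(soc_percent - 50)      # distance from the ~50 % sweet spot
    if distance <= 10:
        return "great (40-60)"
    if distance <= 20:
        return "not much worse (30-70)"
    if distance <= 30:
        return "normal operation, knee of the curve (20-80)"
    if distance <= 40:
        return "only if you need the capacity (10-90)"
    return "exceptional use only"

for soc in (50, 65, 15, 95):
    print(soc, "->", soc_zone(soc))
```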
Also, time matters. You should aim to minimize the amount of time spent outside the recommended range. You can discharge your phone to, say, 5 %, but you should charge it immediately. And you shouldn't keep a phone sitting on a charger at 95+ % for half a day or night. If you need a full charge, it's best to charge just before you leave. This is one danger of wireless charging. When I'm working, I have a wireless charger available to me at least 15 hours a day, sometimes over 20. I have a charger at home, in my office, in my car. And when I'm out, I'm usually not playing with my phone. For me, it makes no sense to charge over 80 %, or even 70 %. With such frequent opportunities to charge, you need a different approach to charging management.
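The "charge just before you leave" arithmetic is trivial, by the way. A sketch, assuming a constant average charging rate; it deliberately ignores the constant-voltage slowdown near the top described earlier, which in reality adds time, so treat the result as optimistic.

```python
def hours_before_departure(capacity_wh, avg_charge_w, soc_now, soc_target):
    """How long before leaving to plug in, at a constant average charge rate."""
    energy_needed_wh = capacity_wh * (soc_target - soc_now) / 100.0
    return energy_needed_wh / avg_charge_w

# e.g. a ~15 Wh phone battery at 40 %, charging at an average of 10 W:
h = hours_before_departure(15.0, 10.0, soc_now=40, soc_target=100)
print(f"plug in roughly {h * 60:.0f} minutes before you leave")
```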
Actually, if you're draining your phone close to dry and fully charging it practically daily, you can easily use up all the declared cycles within three years. Even with a top-notch 1,000-cycle cell. That should actually last you about three years in those circumstances, as long as you avoid abusing it. And certainly with crappy 300-cycle cells. Those should be ready for replacement within a year in this scenario. If you belong to this group, you should avoid cells with lower cycle counts like the plague. Or be prepared for relatively frequent replacements.
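The arithmetic behind those numbers: one full discharge plus recharge per day is roughly one full cycle per day.

```python
def years_to_use_up(rated_cycles, full_cycles_per_day=1.0):
    """Years until the declared cycle count is used up at a given daily usage."""
    return rated_cycles / full_cycles_per_day / 365.0

print(f"{years_to_use_up(1000):.1f} years")  # ~2.7 years for a 1,000-cycle cell
print(f"{years_to_use_up(300):.1f} years")   # ~0.8 years for a 300-cycle cell
```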
Since we're talking 30+ W these days with fast chargers, they certainly can have an impact on lifespan. 30 W would be fine for a big laptop battery. But it's a different story for a 4 Ah cell in a phone. Essentially, charging stress is relative to capacity. Simply put, as long as you can supply enough power, a big battery should take just as long to charge as a small one. Yes, a cell can be built to better withstand fast charging, but that doesn't change the fact that it's more stressful and damaging. And don't count on manufacturers accounting for the use of fast chargers in their lifespan numbers. Watch out for the small print. I imagine that's one reason even some flagships don't ship with a fast charger (or the fastest charger supported). If they supplied the charger, they would have to adjust the lifespan numbers or be open to litigation. It's hard to claim it was the user's choice and now they have to pay the price when the charger was provided as standard.
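"Relative to capacity" is what the C-rate expresses: charging current as a multiple of the cell's capacity. A quick sketch, assuming a ~3.85 V nominal cell voltage and ignoring conversion losses (both assumptions); the laptop pack is modeled, crudely, as four times the phone cell's capacity.

```python
def c_rate(charge_power_w, capacity_ah, nominal_v=3.85):
    """Charging current as a multiple of capacity (C-rate)."""
    current_a = charge_power_w / nominal_v
    return current_a / capacity_ah

print(f"phone:  {c_rate(30, 4.0):.1f}C")      # 30 W into a 4 Ah cell: ~1.9C
print(f"laptop: {c_rate(30, 4.0 * 4):.1f}C")  # same 30 W, 4x the capacity: ~0.5C
```

Same 30 W, roughly a quarter of the stress. That's the whole point.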
It's called a charger because it's used to charge a battery. Technically speaking, it's an external power supply. How it actually works internally (voltage conversion, rectification) is irrelevant. We usually talk of power supplies when it comes to devices with a more permanent placement. Mobile, battery-powered devices are intended to be just that: battery powered. The power supply serves to charge the device, hence the name charger. Charger doesn't imply logic; it implies use. There are plenty of "dumb" chargers that are just power supplies. Some chemistries tolerate simple minds better than others. And some have a short temper. If you're charging a device, the device typically contains any charging logic. Chargers contain it if they're used to charge cells or batteries directly, not devices.
Oppo's use of two cells allows them to supply a higher voltage. Higher voltage at the same power means lower current and hence less resistive heating. However, higher charging rates are stressful even with good cooling. Yes, if heat is a problem, it should be better relative to other similarly fast solutions. But there is no way they can match a slow charger.
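The heating argument is just P = I²R. A toy comparison at the same power, with a made-up fixed path resistance; the absolute watts don't matter here, the ratio does.

```python
def resistive_loss_w(power_w, voltage_v, resistance_ohm=0.1):
    """I^2 * R loss for delivering a given power at a given voltage."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

# 50 W delivered at single-cell vs two-cells-in-series voltage:
print(f"{resistive_loss_w(50, 5.0):.1f} W lost")   # 10 A -> 10.0 W
print(f"{resistive_loss_w(50, 10.0):.1f} W lost")  # 5 A  -> 2.5 W
```

Double the voltage, half the current, a quarter of the resistive heat.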
As already explained, all phones charge their cells using constant current in the first stage. Charging circuits, like all power supplies, produce heat, because they're not 100 % efficient. The idea is to do as much of the work outside as possible (thereby producing the heat outside the device). External power supplies are typically voltage sources: they supply a voltage and let charge flow, up to a limit. But what you actually need in the first stage of charging is a current source: a power supply that gives you a given flow (and no more) whatever it takes. Again, up to a limit. In practice, it can be a voltage source with current limiting. This is something a device normally has to do internally. I'm not familiar with VOOC, but I imagine this is where it steps in. Rather than doing the work internally, they tell the charger how much current is needed. But that's only a guess. There is more than one way to skin a cat, as the saying goes, but the general idea is the same: move work outside. I believe PD 3.0 and QC4+ also have variable current limits, so you should be able to do the same thing with them.
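Here's what "a voltage source with current limiting" means in the simplest (purely resistive) case. A sketch of the behavior only; real chargers regulate a switching converter, and the VOOC tie-in above is, as I said, a guess.

```python
def operating_point(v_set, i_limit, load_ohm):
    """Where a current-limited voltage source settles: (voltage, current)."""
    i = v_set / load_ohm                 # what the load would draw unrestricted
    if i <= i_limit:
        return v_set, i                  # voltage mode: limit not reached
    return i_limit * load_ohm, i_limit   # current mode: voltage sags to hold the limit

print(operating_point(5.0, 3.0, 10.0))  # light load: (5.0, 0.5) -> acts as voltage source
print(operating_point(5.0, 3.0, 0.5))   # heavy load: (1.5, 3.0) -> acts as current source
```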
The cable always has to be rated for the voltage and current the brick can supply. Always. That's basic physics. Otherwise, you're creating a fire hazard. Unless you know what you're doing, the cable should be intended for the particular charging standard. Manufacturers are typically keen to brag about compatibility, so it shouldn't be a problem. I believe VOOC in particular uses an extra pin to facilitate communication with the charger, so it requires a special cable; otherwise, it's going to run with a lower current limit. There is nothing inherently more dangerous about "controlling current", whatever that means, whether we're talking about cables or cell life. In a closed circuit, there is always voltage and current, however you got there. Magnitude is what matters.
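Since magnitude is what matters, the rating check is nothing more than a comparison. A trivial sketch; the numbers are just examples.

```python
def cable_is_safe(brick_max_v, brick_max_a, cable_rated_v, cable_rated_a):
    """The cable must be rated for everything the brick can supply."""
    return cable_rated_v >= brick_max_v and cable_rated_a >= brick_max_a

print(cable_is_safe(5.0, 4.0, 5.0, 3.0))    # False: a 3 A cable on a 4 A brick
print(cable_is_safe(10.0, 5.0, 20.0, 5.0))  # True
```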