Why Have Non-Zero Timing on a BLDC?

Advancing timing is a practice common to electric motors and internal combustion engines. The purpose is to increase efficiency. In other words, to maximize the power output for a given power input.

In electric motors, the torque produced depends on the angle of the rotor field vector with respect to the stator field vector:

\$\tau = \tau_{max} \sin \theta\$

Where:

\$\theta =\$ the angle between the two field vectors

When \$\theta = 0°,~\tau = 0\$ (no torque means no movement) and when \$\theta = 90°,~\tau=\tau_{max}\$. For all other angles between 0° and 90°, \$\tau\$ is some percentage of \$\tau_{max}\$.
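To make the relationship concrete, here is a minimal sketch (my own, not from any motor-control library) that evaluates the torque fraction \$\sin \theta\$ at a few field angles:

```c
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

int main(void) {
    /* torque fraction tau/tau_max = sin(theta) at a few field angles */
    const double angles_deg[] = {0.0, 30.0, 45.0, 60.0, 90.0};

    for (size_t i = 0; i < sizeof angles_deg / sizeof angles_deg[0]; ++i) {
        double frac = sin(angles_deg[i] * PI / 180.0);
        printf("theta = %4.1f deg -> tau = %.2f * tau_max\n",
               angles_deg[i], frac);
    }
    return 0;
}
```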

The problem here is that as the rotor spins, the interaction between its magnetic field and the stator's causes both fields to distort and shift from their static, non-rotating positions. The faster the rotor spins, the more the fields distort. The best picture I could find of this phenomenon actually comes from the Wikipedia article on brushed DC motors, but the principle is the same for brushless motors:

[figure: distorted magnetic fields]

By advancing the timing, you ensure that commutation occurs when the two fields are at 90° to one another, maximizing torque production at maximum speed. However, since the position of the fields changes with speed, a fixed timing advance is only correct for one particular speed in one particular direction. At all other speeds your efficiency will be less than optimal as the angle between the two fields decreases from 90°. And in the reverse direction you will be far from optimal, requiring much more current to produce the same amount of torque.

Depending on your requirements, a 0° timing advance may not be such a bad thing. If you need to be able to reverse direction, but don't care as much about power consumption, maximum speed, or maximum torque, then a 0° timing advance may be a good compromise. However, if you need to produce maximum torque at maximum speed without drawing excessive current, then advanced timing is a must.
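As a rough illustration of the trade-off, the toy model below assumes the field lag grows linearly with speed (the 10°-per-1000-RPM rate is purely my assumption, not a property of any real motor) and compares a fixed 30° advance in each direction:

```c
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* torque fraction from tau = tau_max * sin(theta) */
static double torque_fraction(double theta_deg) {
    return sin(theta_deg * PI / 180.0);
}

int main(void) {
    const double lag_per_krpm = 10.0;  /* assumed: 10 deg of lag per 1000 RPM */
    const double advance_deg  = 30.0;  /* fixed advance, set for forward use  */

    for (double rpm = 0.0; rpm <= 4000.0; rpm += 1000.0) {
        double lag = lag_per_krpm * rpm / 1000.0;
        /* forward: advance partly cancels the lag; reverse: they compound */
        double fwd = torque_fraction(90.0 - lag + advance_deg);
        double rev = torque_fraction(90.0 - lag - advance_deg);
        printf("%5.0f RPM: forward tau = %.2f, reverse tau = %.2f (of tau_max)\n",
               rpm, fwd, rev);
    }
    return 0;
}
```

With these numbers the forward direction peaks near 3000 RPM while the reverse direction steadily loses torque, which is exactly the asymmetry described above.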


A note on what causes the distortion

The distortion occurs because of the laws discovered by our friends Lenz and Faraday. In a simple motor, you have a coil rotating in a magnetic field:

[figure: simple motor]

As current passes through the coil, it generates a magnetic field around the wire. As this generated field interacts with the static magnetic field, the two push on one another and the fields distort:

[figure: field interaction]

As the coil rotates, it moves in and out of the magnetic field. When the wire is in the magnetic field, the field distorts. When the wire is out, the field snaps back to normal. This snapping back takes some amount of time. As the coil rotates faster and faster, the field has less time to snap back to normal. So the faster the motor turns, the more distorted the field remains.
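If it helps to see numbers, the following sketch models the snap-back as a simple first-order decay; the time constant is an arbitrary assumption chosen only to show the trend, not a measured property of any motor:

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    const double tau_s = 0.005;  /* assumed relaxation time constant, seconds */

    for (double rpm = 500.0; rpm <= 8000.0; rpm *= 2.0) {
        double period_s = 60.0 / rpm;              /* time per revolution     */
        double residual = exp(-period_s / tau_s);  /* distortion not yet gone */
        printf("%6.0f RPM: %4.1f%% of the distortion remains\n",
               rpm, 100.0 * residual);
    }
    return 0;
}
```

The residual distortion is negligible at low speed and grows rapidly as the rotation period approaches the relaxation time, matching the qualitative picture above.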


Somewhat related

I sometimes find that people have an easier time understanding internal combustion engines as opposed to electric motors. Maybe it's because people have a better understanding of explosions versus rotating magnetic fields. Or perhaps because gasoline cars are still so much more common. If you're one of those people, have a look at this How Stuff Works article. It explains the reasons behind advancing the timing in an internal combustion engine. There are a lot of similarities between the two and the analogy may be helpful to your understanding.


The torque generated by a motor is a function of the angle between the magnetic field generated by the coils and the field generated by the magnets. Because the coils' magnetic field cannot respond instantly to changes in voltage, the angle of the field generated by the coils essentially represents what the controller was requesting a short time previously. As the motor speed increases, that lag represents an increasing angle, to the point that the angle between the coils and the magnets diminishes and, with it, the ability to generate torque.

Adding an offset of, e.g., 5 degrees to the sensors would have the effect of increasing the angle between the magnets and the coils by five degrees when the motor was moving in one direction, and decreasing it when moving in the other. This may thus make the motor work more effectively in one direction, but less effectively in the other. Note that because the coils are switched in discrete steps, the angle difference at rest may vary between 30 and 90 degrees when there is no offset. Adding a 30-degree offset would cause the angle difference to vary between 60 and 120 degrees in one direction (good), but between 0 degrees and 60 degrees in the other (bad). Note that if the angle difference is zero degrees, your motor will try to stay in its current position instead of moving--oops.
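The window arithmetic above can be sketched in a few lines (the 60-degree step and the window centered on 60° at rest are taken from the example numbers in this answer):

```c
#include <stdio.h>

int main(void) {
    const double step_deg   = 60.0;  /* six-step commutation window        */
    const double center_deg = 60.0;  /* window centered on 60 deg at rest  */
    const double offset_deg = 30.0;  /* sensor offset from the example     */

    /* the offset shifts the window one way forward, the other way reverse */
    printf("no offset: angle varies %g..%g deg\n",
           center_deg - step_deg / 2.0, center_deg + step_deg / 2.0);
    printf("forward  : angle varies %g..%g deg (closer to 90 -> good)\n",
           center_deg - step_deg / 2.0 + offset_deg,
           center_deg + step_deg / 2.0 + offset_deg);
    printf("reverse  : angle varies %g..%g deg (reaches 0 -> can stall)\n",
           center_deg - step_deg / 2.0 - offset_deg,
           center_deg + step_deg / 2.0 - offset_deg);
    return 0;
}
```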


To literally answer the question, a "ton of current" would be drawn in the direction where the timing offset was pathologically backwards from what was needed. Instead of synchronizing with the rotational state and doing "work" only to overcome losses and shaft load, the mis-synchronized drive would end up to a large degree fighting its own efforts at maximum power input - not entirely unlike trying to drive the motor with the rotor locked so that it cannot turn. The motor might still turn, but it would be extremely inefficient since most of the power applied at any instant in time would be fighting the existing state, rather than modifying that state only slightly to transmit power to the mechanical load.
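This follows from the torque equation in the first answer: for a fixed torque, current must scale as \$1/\sin \theta\$. A quick sketch of that proportionality (ignoring saturation and other real-world effects):

```c
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

int main(void) {
    /* current needed for constant torque, relative to the 90-degree case */
    const double angles_deg[] = {90.0, 60.0, 30.0, 10.0, 5.0};

    for (size_t i = 0; i < sizeof angles_deg / sizeof angles_deg[0]; ++i) {
        double ratio = 1.0 / sin(angles_deg[i] * PI / 180.0);
        printf("theta = %4.1f deg -> %.1fx the current for the same torque\n",
               angles_deg[i], ratio);
    }
    return 0;
}
```

At 5° of field angle you would need over eleven times the current you would at 90°, which is where the "ton of current" goes.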

Offset sensors (or, to a degree, even having sensors rather than measuring back EMF) tend to point towards an older, less internally sophisticated controller. A modern microcontroller-based design could handle the offset in software and apply it appropriately for either direction of rotation.
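A hypothetical fragment of such a controller (the names and structure are mine, not from any particular product) might simply flip the sign of the advance based on the commanded direction:

```c
#include <stdint.h>
#include <stdio.h>

typedef enum { DIR_FORWARD, DIR_REVERSE } direction_t;

/* Hypothetical helper: combine the sensed rotor angle with a signed
 * timing advance, flipping the sign for reverse rotation. */
static int32_t commutation_angle(int32_t sensed_angle_deg,
                                 int32_t advance_deg,
                                 direction_t dir) {
    int32_t angle = sensed_angle_deg +
                    ((dir == DIR_FORWARD) ? advance_deg : -advance_deg);

    /* wrap into 0..359 degrees */
    angle %= 360;
    if (angle < 0)
        angle += 360;
    return angle;
}

int main(void) {
    /* the same 10-degree advance helps forward and is inverted in reverse */
    printf("forward: %d deg\n", (int)commutation_angle(90, 10, DIR_FORWARD));
    printf("reverse: %d deg\n", (int)commutation_angle(90, 10, DIR_REVERSE));
    return 0;
}
```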