Staying in Phase on the Grid

Before connecting a generator to the grid, the operators spin it up to roughly the right speed. Then they connect what is essentially a voltmeter between a generator phase and the corresponding line phase, and adjust the generator drive until the observed voltage is
a) changing very slowly (the frequency difference is below some threshold), and
b) below some low voltage threshold (the phase difference is close enough that the power flow which results when they close the breaker is manageable).
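The two matching conditions above can be sketched numerically. The frequencies, thresholds and the `ok_to_close` helper below are invented for illustration; they are not real synchroniser settings:

```python
import math

# Toy sketch of the synchronization check described above.
# Grid at exactly 50 Hz; generator slightly off-frequency and off-phase.
GRID_F = 50.0        # Hz
GEN_F = 50.02        # Hz, generator almost matched (assumed)
V_PEAK = 1.0         # per-unit peak voltage
FREQ_OK = 0.05       # Hz: "very slowly changing" threshold (assumed)
VOLT_OK = 0.1        # per-unit: "low voltage" threshold (assumed)

def beat_voltage(t, gen_phase=0.3):
    """Instantaneous voltage across the 'voltmeter' between the two phases."""
    v_grid = V_PEAK * math.sin(2 * math.pi * GRID_F * t)
    v_gen = V_PEAK * math.sin(2 * math.pi * GEN_F * t + gen_phase)
    return v_gen - v_grid

# The difference voltage beats at |f_gen - f_grid|; its envelope passes
# through a minimum whenever the phases align.  Closing the breaker is
# safe when the frequency difference is small AND the envelope is low.
def ok_to_close(freq_diff, envelope):
    return abs(freq_diff) < FREQ_OK and envelope < VOLT_OK

print(ok_to_close(GEN_F - GRID_F, 0.02))   # phases aligned: safe
print(ok_to_close(GEN_F - GRID_F, 0.8))    # phases still far apart: not safe
```

A synchroscope (mentioned in a later answer) presents exactly this information as a rotating pointer instead of a voltage reading.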

Once the generator is connected to the grid, it always stays in phase. If not driven mechanically, it will act as a motor. The amount of power it draws from or exports to the grid is controlled by how hard it is driven mechanically.

Each generator is connected to its local part of the grid and synced to the local frequency. There will be a slight phase difference between the generator and the local grid: if the generator is supplying power to the grid, its phase will be slightly in advance. The larger the mechanical power input to the generator, the larger the phase difference, and the larger the power exported to the grid.
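The "more phase lead, more exported power" relation is conventionally captured by the power-angle equation for a lossless link, P = (V1·V2/X)·sin(δ). A small sketch, using made-up per-unit values rather than real plant data:

```python
import math

# Power-angle relation for a lossless link of reactance x:
#   P = (v1 * v2 / x) * sin(delta)
# where delta is the phase of the sending end ahead of the receiving end.
# All figures below are illustrative per-unit values (assumptions).
def power_transfer(v1=1.0, v2=1.0, x=0.5, delta_deg=0.0):
    """Power (per unit) flowing from bus 1 to bus 2 across reactance x."""
    return (v1 * v2 / x) * math.sin(math.radians(delta_deg))

for delta in (0, 5, 10, 20):
    print(f"delta = {delta:2d} deg -> P = {power_transfer(delta_deg=delta):.3f} pu")
```

For small angles the flow is nearly proportional to the phase difference, which is why a slightly advanced generator exports a correspondingly modest amount of power.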

This 'power flow follows phase difference' extends to whole areas of the grid. If there is a large load in the south, the generators in the south will slow down initially, retarding their phase with respect to the north. This phase difference will create a power flow from north to south.

Where you have a nationwide grid, the grid operators strive very hard never to let any significant part become 'islanded' from the rest. Once two parts drift apart in phase, it may take a long time before they can be brought back together, as the phase matching must be exquisitely accurate to avoid a huge power flow at the moment of reconnection.

Where two separately controlled grids are to be connected, say by the Anglo-French undersea cable, it is done with a DC (HVDC) link. It is easy at the receiving end to synchronise the inverters to the local grid.

Keeping the grid at an average of 50 cycles per second over the course of a day is done simply by feeding in more or less power to speed up or slow down the grid frequency, usually at night when there is a bit more slack in demand.


You're confusing an accurate number of cycles over a 24-hour period with very rigid instantaneous frequency control. That's not how it's done in most places.

The frequency is maintained near its nominal value by matching generation to load: whenever the load is greater than the generation, the frequency will be (very) gradually falling, and whenever the load is less than the generation, the frequency will be rising.

The inertia is enormous and, in general, both load and generation change fairly gradually, so there's lots of time to make adjustments to generators (or loads, where people have contracted to control their loads in this way) to keep the system balanced. The frequency is allowed to drift between various limits (operational and regulatory).
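The balance described above can be caricatured with a much-simplified swing equation; the inertia constant and power figures below are assumptions for illustration, not data for any real grid:

```python
# Toy model of the load/generation balance described above.  Frequency
# drifts according to a much-simplified swing equation:
#   df/dt = (p_gen - p_load) * F_NOM / (2 * H * S_BASE)
# H (inertia constant, seconds) and the power figures are made up.
F_NOM = 50.0      # Hz, nominal frequency
H = 5.0           # s, system inertia constant (assumed)
S_BASE = 1.0      # per-unit system base

def simulate(p_gen, p_load, seconds, dt=0.1, f0=F_NOM):
    """Integrate the frequency drift for a constant generation/load imbalance."""
    f = f0
    for _ in range(int(seconds / dt)):
        f += (p_gen - p_load) * F_NOM / (2 * H * S_BASE) * dt
    return f

# Generation 1% short of load: the frequency sags slowly, leaving time
# for operators (or automatic controls) to bring more plant on line.
print(simulate(p_gen=0.99, p_load=1.00, seconds=10))
```

Even a 1% imbalance takes seconds to move the frequency a few tenths of a hertz, which is the "lots of time to make adjustments" in practice.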

In the UK at least, the correct number of cycles per day is maintained by keeping track of 'real time' and 'grid time', and the grid is run a bit fast or a bit slow to make sure they don't get too far apart.
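That bookkeeping might be sketched as follows; the error threshold, trim offset and the `frequency_setpoint` helper are hypothetical, not National Grid's actual scheme:

```python
# Sketch of the 'grid time' vs 'real time' bookkeeping described above.
# A count of elapsed cycles is compared with real elapsed time; if the
# grid clock lags, the target frequency is nudged up, and vice versa.
F_NOM = 50.0           # Hz, nominal frequency
MAX_TIME_ERROR = 10.0  # seconds of grid-time error before correcting (assumed)
TRIM = 0.01            # Hz offset applied as the correction (assumed)

def frequency_setpoint(total_cycles, real_seconds):
    """Pick the target frequency from accumulated cycles vs real time."""
    grid_seconds = total_cycles / F_NOM   # time a clock driven at 50 Hz shows
    error = grid_seconds - real_seconds
    if error < -MAX_TIME_ERROR:
        return F_NOM + TRIM   # grid clock slow: run a bit fast
    if error > MAX_TIME_ERROR:
        return F_NOM - TRIM   # grid clock fast: run a bit slow
    return F_NOM

# A day averaging 49.99 Hz leaves the grid clock about 17 s slow:
print(frequency_setpoint(49.99 * 86400, 86400))
```

This is why mains-synchronous clocks keep good time over days even though the instantaneous frequency wanders.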

There are accurate frequency references in use within the grid control system - that's what they're comparing with/measuring against, but the grid itself isn't phase/frequency-locked to them in any direct way.

At the bottom left of the big display in this image is a graph with a wiggly vertical yellow trace - that's the frequency of the UK National Grid for a while before the photo was taken. As you can see, it's not locked to anything very tightly, though the full span of the graph is probably only about ±0.3 Hz.

[image: National Grid control-room display with the frequency trace at bottom left]


They use a Synchroscope. I have seen this done in power plant control rooms.

https://en.wikipedia.org/wiki/Synchroscope
