Transitioning to Fully Renewable Electricity: A Feasible Goal?
The journey toward a fully renewable electricity supply in the UK is rooted in a rich history of scientific and engineering advancements. Pioneering efforts in the 1800s laid the groundwork for a reliable electricity system that is now integral to daily life. Today, blackouts and brownouts are rare occurrences in the UK, thanks in part to the management of the national grid, which must now evolve to accommodate emerging technologies.
A glimpse into the history of science and engineering reveals that the initial attempts to electrify residential areas began in Surrey and London during the 1880s. Surrey's generator harnessed water to drive a turbine, while London's Holborn Viaduct station became the first globally to produce electricity via coal-fired steam turbines. As additional generators came online, the landscape grew complex; by 1918, London had ten distinct electricity suppliers, resulting in varied and inconsistent electricity outputs, necessitating that devices be compatible with specific supplies.
These early generators built on Michael Faraday's 1831 discovery of electromagnetic induction: a coil of copper wire exposed to a changing magnetic field generates an electric current. In a generator, the rotating turbine provides this relative motion between coil and field, with the voltage depending on the field's strength. Debates in the early electricity era centred on the choice between alternating current (AC) and direct current (DC). A coil rotating past alternating magnetic poles naturally produces AC; extracting DC required a commutator to rectify that alternating output. The decision ultimately favoured AC for its efficiency in long-distance transmission.
To transmit power efficiently over long distances, resistive losses must be minimized. The power dissipated as heat in a line grows with the square of the current (P_loss = I²R), so for a given power P = IV, it pays to transmit at high voltage and low current. Transformers perform this voltage-current conversion readily with AC. However, the adoption of AC still left different independent generators operating at varying frequencies and voltages.
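The payoff from high-voltage transmission can be sketched numerically. The snippet below compares line losses for the same delivered power at two voltage levels; the 10 MW load and 5 Ω line resistance are illustrative assumptions, not real grid figures.

```python
def line_loss(power_w, voltage_v, resistance_ohm):
    """Power dissipated in the line: P_loss = I^2 * R, where I = P / V."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

POWER = 10_000_000   # 10 MW to deliver (assumed)
RESISTANCE = 5.0     # assumed line resistance in ohms

low = line_loss(POWER, 11_000, RESISTANCE)    # distribution-level voltage
high = line_loss(POWER, 400_000, RESISTANCE)  # transmission-level voltage

print(f"Loss at 11 kV:  {low / 1e6:.2f} MW")
print(f"Loss at 400 kV: {high / 1e3:.2f} kW")
```

Because loss scales with the inverse square of voltage, stepping up by a factor of ~36 here cuts the loss by a factor of over 1,300.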
Eventually, electricity standards were established. The choice of AC frequency was shaped by electric lighting, which emerged at the same time. Early incandescent bulbs heated a filament resistively, and at frequencies much below 50 Hz the filament cooled perceptibly between cycles, producing visible flicker. The choice of 50 Hz also matched steam turbines running at 3000 rpm, a speed that balanced output against mechanical wear and stability. The frequency follows from multiplying the turbine speed in rpm by the number of magnetic poles and dividing by 120: the factor of 60 converts minutes to seconds, and the factor of 2 reflects that poles come in pairs, each pair contributing one AC cycle per revolution. Today, Europe hosts the largest synchronous grid, delivering 50 Hz electricity to approximately 400 million users across 24 nations.
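The rpm-to-frequency relationship above is simple enough to express directly:

```python
def grid_frequency_hz(rpm, poles):
    """f = (rpm * poles) / 120.

    The 120 combines 60 (converting rpm to revolutions per second)
    with 2 (poles come in pairs, each pair giving one AC cycle
    per revolution).
    """
    return rpm * poles / 120

print(grid_frequency_hz(3000, 2))  # 2-pole turbine at 3000 rpm -> 50.0 Hz
print(grid_frequency_hz(1500, 4))  # 4-pole machine at 1500 rpm -> 50.0 Hz
```

The second call shows why the pole count matters: a slower machine with more pole pairs can produce the same grid frequency.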
Managing the frequency of the National Grid is vital to balancing electricity supply and demand. A deviation of more than 1% (below 49.5 Hz or above 50.5 Hz) signals instability, often occurring during peak demand periods, such as evenings when households power various appliances.
Despite advancements, steam turbines remain substantial machines; once they reach the desired rpm, their inertia becomes advantageous. This inertia buffers against sudden demand spikes, maintaining a steady 50 Hz supply and allowing the National Grid time to adjust by activating additional generators or drawing from storage or imports.
However, renewable sources like solar panels and wind turbines lack this inertia. They generate electricity only when conditions permit, leading to potential instability in the grid when coupled with fluctuating demand. Consequently, innovative solutions for providing inertia are in development.
Currently, approximately 60% of the UK's electricity is derived from steam or gas turbine generators, which supply the necessary inertia for short-term demand fluctuations. Solutions such as flywheels, large spinning discs that store rotational energy, are being explored. When electricity supply surpasses demand, the excess energy can be used to accelerate the flywheel, and recovered again when it spins down. Grid-forming converters, power electronics that mimic the behaviour of a spinning generator, are also being tested to provide synthetic inertia.
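A rough sense of how much energy a flywheel can buffer comes from the rotational kinetic energy formula E = ½Iω². The figures below (a 5-tonne solid steel disc of 1 m radius spun to 3000 rpm) are hypothetical, chosen only to illustrate the scale.

```python
import math

def disc_moment_of_inertia(mass_kg, radius_m):
    """Moment of inertia of a solid disc: I = 1/2 * m * r^2."""
    return 0.5 * mass_kg * radius_m ** 2

def stored_energy_j(inertia, rpm):
    """Rotational kinetic energy: E = 1/2 * I * omega^2."""
    omega = rpm * 2 * math.pi / 60  # convert rpm to rad/s
    return 0.5 * inertia * omega ** 2

# Hypothetical flywheel: 5-tonne disc, 1 m radius, 3000 rpm.
inertia = disc_moment_of_inertia(5000, 1.0)
energy = stored_energy_j(inertia, 3000)
print(f"Stored energy: {energy / 3.6e6:.1f} kWh")
```

A few tens of kilowatt-hours per unit is modest, which is why flywheels are suited to smoothing second-to-second fluctuations rather than storing bulk energy overnight.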
Today, strategies for managing the electricity grid include keeping steam turbine power stations running continuously, even through periods of low demand. Some industrial facilities, like sewage treatment plants, run overnight when electricity costs are lower, effectively soaking up surplus capacity. Variable pricing encourages this shift of consumption to off-peak times.
This surplus can also replenish pumped hydro storage systems, which have been in operation for about a century. These systems store water in reservoirs, releasing it to generate electricity during peak demand periods. Most viable reservoir sites are already utilized, limiting further expansion.
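The scale of pumped hydro storage follows from gravitational potential energy, E = mgh, derated for round-trip losses. The reservoir volume, head, and 75% efficiency below are assumed illustrative values, not the figures for any real site.

```python
G = 9.81  # gravitational acceleration, m/s^2

def stored_energy_kwh(volume_m3, head_m, efficiency=0.75):
    """Recoverable energy from a pumped hydro reservoir.

    E = m * g * h, with water at 1000 kg/m^3, scaled by an
    assumed round-trip efficiency (pumping plus generating losses).
    """
    mass_kg = volume_m3 * 1000
    return mass_kg * G * head_m * efficiency / 3.6e6  # J -> kWh

# Illustrative reservoir: 6 million m^3 of water with a 500 m head.
print(f"{stored_energy_kwh(6e6, 500) / 1e6:.1f} GWh")
```

Several gigawatt-hours from a single upper reservoir is why pumped hydro remains the dominant form of grid storage, and why the shortage of suitable new sites matters.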
Looking ahead, adapting a national grid founded on Victorian scientific principles to accommodate intermittent renewable sources will require time and technological innovation. If the early pioneers could foresee the future, would they have opted for different infrastructure, perhaps favoring DC distribution over AC? Given that solar panels produce DC and many modern devices also operate on DC, a local grid system utilizing DC, supplemented by a larger grid, could enhance efficiency by reducing conversion losses.
This narrative stems from discussions featured on the podcast "Technically Speaking," where scientific and engineering topics are explored through engaging conversations that blend factual insights with imaginative speculation. New episodes are released bi-weekly on various platforms including Apple, Spotify, and Google.
To stay connected and contribute to the podcast, follow us on Twitter. If you wish to support this blog and podcast, consider becoming a Medium member for $5 monthly, which grants unlimited access to all Medium content, helping fund further episodes and discussions.