Tuesday, March 14, 2017

DEEPMIND ALGORITHMS JUST THE START OF A CONSUMER-FOCUSED REVOLUTION IN THE POWER SECTOR.



DeepMind algorithms to manage the Grid could be just the start of a consumer-focused revolution in the power sector. The need to manage much more complex low carbon systems creates strong incentives to manage consumer demand more proactively. This could be good news for consumers, offering them more choice, and it could also defuse some of the concerns around supply security.

Yesterday’s FT reports[1]: Google’s DeepMind is in discussions with the UK’s National Grid to use artificial intelligence to help balance energy supply and demand in Britain. “… It would be amazing if you could save 10 per cent of the country’s energy usage without any new infrastructure, just from optimisation. That’s pretty exciting,” Demis Hassabis, DeepMind’s chief executive, told the Financial Times. National Grid’s role in balancing the system has become more difficult in recent years, however, as intermittent renewable sources of electricity — such as wind and solar power — have become a bigger part of Britain’s energy mix. DeepMind’s algorithms could more accurately predict demand patterns and help balance the national energy system more efficiently.

This is currently a task that is at least partially delegated to the market. The principle behind most “spot” wholesale markets is that generators declare their marginal costs of generation (per kWh of energy produced) and are then selected to run in ascending order of cost (the so-called “merit order”), with the cheapest chosen first, and the most expensive plant that runs setting the price. That principle will be increasingly dysfunctional or inapplicable in the real world, partly because such a high proportion of current and future generating plant has zero or negative marginal costs of operation, and partly because the operational efficiency constraints on the power system are becoming more complex, involving considerations of plant inflexibility, intermittency, and energy storage, rather than just a simple stacking by ascending cost. Sophisticated algorithms are prima facie exactly what is needed to replace a defunct merit order.
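
As a rough illustration of the merit order principle just described, the sketch below stacks a set of offers by marginal cost; the plant names, costs and capacities are invented purely for illustration, not real market data.

```python
# Minimal sketch of merit-order dispatch: generators are stacked in
# ascending order of declared marginal cost, and the most expensive
# unit needed to meet demand sets the clearing price.
# All plant names, costs and capacities below are illustrative.

def merit_order_dispatch(offers, demand_mw):
    """offers: list of (name, marginal_cost_per_mwh, capacity_mw) tuples."""
    dispatched = []
    remaining = demand_mw
    clearing_price = None
    for name, cost, capacity in sorted(offers, key=lambda o: o[1]):
        if remaining <= 0:
            break
        output = min(capacity, remaining)
        dispatched.append((name, output))
        clearing_price = cost  # the last (most expensive) unit to run sets the price
        remaining -= output
    return dispatched, clearing_price

offers = [
    ("wind",    0.0, 3000),   # zero marginal cost
    ("nuclear", 8.0, 6000),
    ("ccgt",   45.0, 8000),
    ("ocgt",   90.0, 2000),
]
schedule, price = merit_order_dispatch(offers, demand_mw=12000)
print(schedule, price)  # the gas plant is marginal, so it sets the price
```

Once a large share of the stack sits at zero (or negative) marginal cost, and constraints such as inflexibility, intermittency and storage have to be respected, this simple stacking no longer produces sensible prices or schedules, which is the dysfunction described above.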

This implies moving beyond prediction of demand patterns, for which fairly sophisticated approaches already exist, to prediction of intermittent supply as well. It also means developing algorithms to make operational decisions that make sense in terms of both efficiency and the secure operation of the system. The promised 10% saving in energy may be an exaggeration, not least because of the dominance of capital costs, and the relative insignificance of fuel, in low carbon generation. But the bigger contribution of an algorithmic approach lies in the broader options it creates for how the power system is operated and how consumption is managed. This could allow leaner systems and also transform the way we think about electricity as a service.

Future Options

The conventional utility model allows consumers to treat electrical energy supply as “on tap”, with little or no differentiation between applications (e.g. between lighting, heating or mechanical power). Tariffs and prices for the most part approximate an averaging of the costs of supplying electricity, with limited ability to differentiate on the grounds of differing incremental costs, and with a common security standard for all consumers and all applications.

Consumer behaviour needs to be incorporated as a much more active component of system operation. What is needed is to redefine the “consumer offering”, with electricity as a set of services rather than a homogeneous commodity. This requires starting with a clean sheet in defining the nature of the services that consumers will want, and the basis on which they pay. To take a particularly dramatic example, a consumer wanting to charge electric vehicle batteries might request 75 kWh to be delivered in a specified period, over several hours or even several days (e.g. a weekend), and the consumer’s terms of supply might specify that this requirement will be met in full but with timing that is “at the supplier’s discretion”. Different arrangements and different tariffs could apply to the purchase of power for heat, and for some other uses, reflecting in each case the nature of the load, the extent to which it could be time-shifted without inconvenience, and the level of reliability for which the consumer was willing to pay. Commitments to individual consumers would be made by energy service companies, who would be able to aggregate consumer requests and feed them in to become part of the Grid’s system optimisation routines. Such services might even be packaged with the provision of appropriate equipment (e.g. storage heaters).
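
To make the electric vehicle example concrete, here is a minimal sketch of the kind of scheduling such a contract implies, assuming the supplier simply fills the cheapest forecast hours within the agreed window; the function name, the 7 kW charger rating and the price figures are all assumptions made for illustration.

```python
# Sketch of "at the supplier's discretion" charging: deliver the requested
# energy within the agreed window, placing it in the cheapest forecast hours.
# The prices, the 7 kW charger rating and the helper names are assumptions.

def schedule_flexible_load(energy_kwh, max_rate_kw, hourly_prices):
    """Return a {hour: kWh} plan that fills the cheapest hours first."""
    plan = {}
    remaining = energy_kwh
    for hour in sorted(hourly_prices, key=hourly_prices.get):
        if remaining <= 0:
            break
        delivered = min(max_rate_kw, remaining)  # one-hour slots, so kW maps to kWh
        plan[hour] = delivered
        remaining -= delivered
    if remaining > 0:
        raise ValueError("window too short to deliver the requested energy")
    return plan

# A weekend window of 48 hourly prices (arbitrary units), cheapest overnight.
prices = {h: 60 - 25 * (h % 24 in range(1, 6)) for h in range(48)}
plan = schedule_flexible_load(energy_kwh=75, max_rate_kw=7, hourly_prices=prices)
print(sorted(plan))  # charging lands in the cheap overnight hours
```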

The role of suppliers would then be to act as aggregators: their essential function would be to manage the complex interaction between consumer loads and system balancing requirements, including shaping and managing the pattern of consumption. This provides a major opportunity for a much more innovative approach to all aspects of metering and to the terms on which consumers purchase power. Suppliers could at the same time enter into individual contracts with generators, or with a system operator or other agency, which would reflect the economic benefits of their ability to shape consumer loads. They would also take responsibility for managing loads within network constraints at lower voltages, i.e. within local distribution networks.
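
In the same hedged spirit, the sketch below illustrates the aggregator role: several flexible requests are pooled and scheduled within a shared local network limit. The function, the feeder limit and the request figures are illustrative assumptions rather than any real supplier or Grid interface.

```python
# Sketch of the aggregator role: pool several flexible requests and schedule
# them within a shared local network (feeder) limit, cheapest hours first.
# The feeder limit and the request figures are illustrative assumptions;
# a real implementation would also flag requests it cannot fully meet.

def aggregate_and_schedule(requests, hourly_prices, feeder_limit_kw):
    """requests: list of (consumer_id, energy_kwh, max_rate_kw) tuples.
    Returns {consumer_id: {hour: kWh}} respecting the shared feeder limit."""
    headroom = {hour: feeder_limit_kw for hour in hourly_prices}
    plans = {}
    for consumer, energy, max_rate in requests:
        plan, remaining = {}, energy
        for hour in sorted(hourly_prices, key=hourly_prices.get):
            if remaining <= 0:
                break
            delivered = min(max_rate, remaining, headroom[hour])
            if delivered > 0:
                plan[hour] = delivered
                headroom[hour] -= delivered
                remaining -= delivered
        plans[consumer] = plan
    return plans

requests = [("ev_1", 75, 7), ("heating_2", 30, 3)]
prices = {h: 60 - 25 * (h % 24 in range(1, 6)) for h in range(48)}
plans = aggregate_and_schedule(requests, prices, feeder_limit_kw=8)
```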

This has some powerful advantages. First, it allows consumers to purchase power for particular usages in ways more akin to their purchase of other goods and services, rather than perpetuating the “instantaneous commodity” characteristics that have hitherto been a unique and constraining feature of the power sector. This can reflect what consumers actually want and need from a utility. At the same time it would help make the services more affordable. Consumers could still choose to take some power “on tap”, and would normally pay a higher price for this.[2] Many of the issues associated with the administrative setting of security standards would become much less significant: security standards would be chosen in a market, not dictated by a central authority.[3]

This change is enabled by one set of technologies – those surrounding metering, remote control and system optimisation (DeepMind). But it also helps to resolve the problems posed by another set of technologies: those linked to intermittent or inflexible sources of non-fossil generation and to distributed generation.

………..

These ideas have also been explored by the author in an earlier post, Double standards for reliability in power supplies. Not such a bad idea. That post was a defence of a controversial proposal from Andrew Wright of OFGEM that consumers should be able to choose the level of reliability they want. They have been presented in a broader context in the paper Markets, policy and regulation in a low carbon future, produced by the author for the Energy Technologies Institute (ETI), which published a number of perspectives on low carbon futures in 2016.

[1] “DeepMind and National Grid in AI talks to balance energy supply”, Financial Times, 12 March 2017.
[2] “Electricity Markets and Pricing for the Distributed Generation Era”, John Rhys, Malcolm Keay and David Robinson. Published as Chapter 8 in Distributed Generation and its Implications for the Utility Industry, ed. F. Sioshansi, Elsevier, August 2014.
