Six big data challenges for the power industry


The digital transformation of energy will step up a gear in 2018, with advances in the Internet of Things and big data providing utilities with the chance to reinvent themselves


If you bet your business on natural resources, your costs will invariably go up over time. If you bet on technology, they will go down.

A wise product manager once gave me that advice and it’s as applicable today as it was then. The utility and power industry is in the midst of its biggest disruption in decades, if not the biggest since the ‘War of the Currents’, and it has been ushered in by changes in technology.

Renewables and smart grid technologies have upended assumptions about capital planning, centralized vs. decentralized generation and the underlying foundation of the business. In 2004, the International Energy Agency’s annual report on the future of energy didn’t mention solar at all and predicted renewables would constitute only 6 per cent of capacity by 2030. The latest report predicts that solar will lead in new capacity and that wind could become Europe’s leading source of power.

The change has, of course, been driven by advances in technology: every percentage increase in efficiency or performance leads to a doubling or tripling of demand. Solar now costs a quarter of what it did in 2009 and is slated to drop another 66 per cent by 2040. Offshore wind is slated to drop 71 per cent (47 per cent for onshore). Hardware and software, meanwhile, allow power providers to squeeze out as much potential as possible. In wind, that means capacity factors rising from the 25 per cent range to 41 per cent and beyond.

Innovation isn’t exclusive to renewables either. The cost of natural gas-generated power has declined by 30 per cent in the last decade as leading turbine efficiencies have climbed from 58 per cent to 64 per cent, said Mitsubishi Hitachi Power Systems Americas chief executive Paul Browning at POWERGEN International last month.

But technology also creates its own challenges. Solar and wind are intermittent, which means they require additional resources for stabilization. Distributed energy resources (DERs) also create competition. Companies can accidentally waste millions because of a wrong turn or an incorrect assumption. Below I’ve listed some of the challenges that we will face as an industry in the future.

There will be far more connected devices and more data than you think

The IoT market is perhaps the first where everything seems to be measured in the trillions. Gartner predicts that IoT will deliver $1 trillion in economic value annually by 2022. UC Berkeley’s Alberto Sangiovanni-Vincentelli sees a world populated by 7 trillion sensors by 2025. IDC predicts that by 2019, 51 per cent of the nodes on the Internet will belong to machines talking to other machines, with a million new devices logging on every hour.

While these numbers might sound astronomical, they are all likely low. The cost of sensors and communications continues to plummet thanks to Moore’s Law, and as it drops, developers are finding new ways to get value out of talking to these systems. The virtuous cycle of development will be similar to what we saw with the distributed computing boom of the 1980s.

Likewise, the data generated by these devices will grow exponentially. A ‘smart’ building generates on average 250GB a day. A single household smart meter can generate 400MB a year. Multiply that by the 135 million meters in the US and it comes to 54 petabytes, or a little more than half of the data uploaded to YouTube in a year. And that is for readings every 15 minutes: if you started reading every 30 seconds or even more frequently to better fine-tune power forecasts or conduct demand response at the appliance level, you’re moving into exabyte territory. Critics will say you can throw most of it away, but it’s impossible to determine in advance what should get thrown away. Bigger will be better.
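To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The meter count and per-meter figure are the ones quoted above; the assumption that volume scales linearly with read frequency is mine.

# Back-of-the-envelope estimate of US smart meter data volumes
# at different read intervals. Figures quoted in the article:
# ~135 million meters, ~400MB per meter per year at 15-minute reads.

METERS = 135_000_000
MB_PER_METER_YEAR_AT_15_MIN = 400  # megabytes

def annual_volume_pb(read_interval_seconds: float) -> float:
    """Total annual volume in petabytes, assuming volume scales
    linearly with read frequency relative to the 15-minute baseline."""
    scaling = (15 * 60) / read_interval_seconds
    total_mb = METERS * MB_PER_METER_YEAR_AT_15_MIN * scaling
    return total_mb / 1_000_000_000  # MB -> PB (decimal units)

for interval in (15 * 60, 30, 1):
    print(f"{interval:>4}s reads: ~{annual_volume_pb(interval):,.0f} PB/year")
# 900s reads: ~54 PB/year
#  30s reads: ~1,620 PB/year (roughly 1.6 exabytes)
#   1s reads: ~48,600 PB/year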

That will prompt new ways of looking at analytics

A rising tide of data will also, potentially, mean a rising tide of broadband expenses. How and where to use cloud analytics versus local engineers and computing resources will become one of the major challenges in the near term. Serving up all of your data to the cloud, in most cases, won’t make any sense. Going ‘all cloud’ can also increase latency and the risk of network outages. At the same time, the cloud is inevitable: being able to spin up thousands of servers at will opens the door to solving uncertainties around capacity planning or downtime that we just took as a fact of life a few years ago.
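One common way to split the difference, sketched below under my own assumptions (a generic stream of readings, no particular vendor stack), is to aggregate at the edge and send the cloud only summaries, escalating raw data when something looks abnormal.

# Minimal edge-aggregation sketch: summarize high-frequency readings
# locally and forward only periodic aggregates to the cloud.
# The reading source and escalation threshold are illustrative placeholders.

from dataclasses import dataclass
from statistics import mean
from typing import Iterable

@dataclass
class Summary:
    sensor_id: str
    count: int
    minimum: float
    maximum: float
    average: float

def summarize(sensor_id: str, readings: Iterable[float]) -> Summary:
    values = list(readings)
    return Summary(sensor_id, len(values), min(values), max(values), mean(values))

def should_escalate(s: Summary, limit: float) -> bool:
    """Send the raw window to the cloud only when something looks abnormal."""
    return s.maximum > limit

# Example: one minute of 1-second voltage readings collapses to one record.
window = [239.7 + 0.1 * (i % 5) for i in range(60)]  # stand-in for live telemetry
summary = summarize("feeder-12", window)
print(summary)
print("escalate raw data:", should_escalate(summary, limit=245.0))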

The good news is that we’re going to discover that people are often more talented than we give them credit for. A large percentage of ‘analytics’ problems can be solved by giving a good team of engineers and technicians access to information. If you plot a set of outages against a map, there’s a good chance they will come up with a faster and better action plan than a bank of servers running the latest AI applications.

In the future, we need to support the past

A data centre server has an average lifespan of three to seven years. Notebooks get replaced on four-year lifecycles. By contrast, the average age of a transformer is around 40 years. The bulk of oil refinery capacity in the US harks back to the 1970s. (And for the water industry, that would be young: half of Philadelphia’s water mains date back to the 1930s.)

To take full advantage of digital transformation, large power consumers and utilities will have to develop strategies that allow them to add IoT gateways and new sensors without ripping out their old networks. Frankly, getting the old and the new to work together will be one of the easier challenges to meet, but there will be speed bumps. Many IT vendors, for instance, don’t always fully appreciate the security and safety issues faced by operational departments. Standards bodies and industry associations will also likely mandate things like “chain of custody” standards that allow end-users to trace the full genealogy of any product they buy.
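As one hypothetical illustration of that gateway pattern, the sketch below polls a legacy device through a stubbed read function and forwards readings to a cloud collector over plain HTTP. The read_legacy_register() function and the collector URL are placeholders, not any specific vendor’s API.

# Hypothetical IoT gateway sketch: poll a legacy device on its existing
# network and forward readings to a cloud collector without touching the
# device itself. read_legacy_register() and COLLECTOR_URL are placeholders.

import json
import time
import urllib.request

COLLECTOR_URL = "https://collector.example.com/ingest"  # placeholder endpoint

def read_legacy_register() -> dict:
    """Stand-in for a read over the plant's existing protocol
    (serial, Modbus, proprietary bus, etc.)."""
    return {"tag": "XFMR-07.oil_temp_c", "value": 61.4, "ts": time.time()}

def forward(reading: dict) -> None:
    """Push one reading to the cloud collector as JSON."""
    req = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    while True:
        try:
            forward(read_legacy_register())
        except OSError as err:
            # Buffer or log locally on network failure; don't stop polling.
            print("forward failed:", err)
        time.sleep(60)  # poll once a minute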

Data sharing will become the norm

Historically, operational companies have not liked to share their data. It can create security risks and, in some fields like oil and gas, operational data can provide the key to unlocking competitive advantage.

Data sharing, however, has its advantages. Companies that sell or provide equipment can monitor their products for maintenance issues as long as they can get a feed of vibration or performance data. Ongoing monitoring like this can even serve as the basis for switching from purchasing equipment with capital to leasing it through “as-a-service” contracts.
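As a simple illustration of what a vendor might do with such a feed, here is a minimal sketch that flags a machine when its recent vibration level drifts well above its rolling baseline; the window size and threshold are arbitrary assumptions, not anyone’s published method.

# Minimal condition-monitoring sketch: flag a machine when a new RMS
# vibration reading drifts well above its rolling baseline.
# Window size and sigma multiplier are arbitrary illustration values.

from collections import deque
from statistics import mean, stdev

class VibrationMonitor:
    def __init__(self, window: int = 500, sigma: float = 3.0):
        self.history = deque(maxlen=window)
        self.sigma = sigma

    def update(self, rms_mm_s: float) -> bool:
        """Return True if the new reading looks anomalous versus the baseline."""
        anomalous = False
        if len(self.history) >= 30:  # need enough history for a baseline
            baseline = mean(self.history)
            spread = stdev(self.history)
            anomalous = rms_mm_s > baseline + self.sigma * spread
        self.history.append(rms_mm_s)
        return anomalous

monitor = VibrationMonitor()
for reading in [2.1, 2.2, 2.0, 2.3] * 10 + [4.8]:  # stand-in for a live feed
    if monitor.update(reading):
        print("maintenance alert at RMS", reading)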

Data sharing will also pave the way for things such as supply chain optimization, more efficient demand response networks and peak shaving programs that will make the smart city a reality. In California, for instance, around ten per cent of the load in some regions comes from DERs, but the information about performance or current output can’t be readily accessed by the local utility to develop its own plans. The free flow of information would uncork huge benefits.

Digital communities will happen in stages. In the first stage, operational data will flow to internal peers in IT or data science. Companies will then start to share it with trusted parties providing cloud analytics or other services. Ultimately, you’ll see data flowing on a persistent basis to insurers and others. It will just take time.

But data ownership could be more difficult

Let’s say a utility has launched a program to optimize its operations by uploading and analyzing commercial and residential meter data on a nightly basis in conjunction with an analytics company. To further complicate the matter, imagine that some of the data being captured comes from behind-the-meter solar and storage deployed through power purchase agreements.

Who owns what? The utility might claim it owns it all because it gets generated on its network. Solar providers might counter that they have an ownership stake in their portion and should be compensated for giving access to it. At a minimum, they should get something in exchange, such as the historical usage data that utilities have been less inclined to give them in the past.

Meanwhile, smart equipment vendors and cloud providers will likely argue, with some justification, that while utilities and power providers might have an interest in the raw data, the vendors own the more valuable analyzed data, which didn’t exist until they applied algorithms to it.

And don’t expect consumers and businesses to stay out of the debate. “Why should we be paying for the data? Why shouldn’t OEMs pay us, the operators, for the data?” said Gavin Hall of Petronas Carigali, the Malaysian oil company, at a recent event we held in London. “Perhaps we need to change the business model.”

The audience applauded.

Ultimately, we might have to take a page out of real estate property law to resolve these problems. In real estate, ownership is never absolute. When you buy a home, you generally buy it subject to easements and other restrictions. A lease in some jurisdictions is stronger than in others. If data weren’t valuable, this debate would likely not come up, but everyone at this point understands the value of what can be achieved through information. We have to come up with ways to use it that at least aspire to be fair and transparent.

Finally, get ready to explore new sources of revenue

Remember a few years ago when certain analysts were talking about the ‘utility death spiral?’ It’s turning out to be more like a reincarnation, with utilities developing new lines of business. Some of the more notable examples:

·       Tennessee’s EPB has become a broadband provider. “They are making a tonne of money off of fiber. They are using it to pay for their smart grid improvements,” said EnerNex’s Neil Placer during a panel at POWERGEN.

·       Uniper, the German mega utility, has received the go-ahead from management to begin to explore interest in Tiresias, an in-house application for predictive maintenance. If successful, this would enable Uniper to become a software developer. PJM has stated it similarly wants to explore commercializing DIMA, a field maintenance application.

·       Financial Services. Blockchain is being discussed as a way to facilitate energy transactions between individuals. But some analysts and utilities are exploring ways in which utilities could serve as a neutral clearing house for transactions.

·       Tepco is leveraging its intellectual property developed over the years to deliver efficiency services to customers in the Philippines and other areas outside of its normal service territory.

We don’t know how these diversification efforts will work out. What we do know is that utilities possess a great deal of knowledge and insight.

And ultimately, that might be more valuable than electrons. 

VIDEO: Utilities in the cloud: OSIsoft founder Pat Kennedy highlights the launch of OSIsoft Cloud Services, trends in the energy sector, and how big data can boost the integration of renewables.

 

 
