Can Digital Twins Help Modernize Electric Grids?
Recent failures of electric grids in Texas and California have highlighted the difficulties that energy companies and regulators face in ensuring the delivery of reliable and affordable power while simultaneously switching to less-polluting energy sources. One possible solution is digital twin software, which uses AI technology to model physical assets that live in the real world. But can digital twins deliver the goods when it comes to next-generation grids that mix renewables and traditional energy sources? The answer from GE Digital’s Colin Parris may surprise you.
Parris wears two hats when it comes to digital twins. On the one hand, Parris, who’s the senior vice president and chief technology officer at GE Digital, oversees the infusion of digital twin and AI technology into the industrial products that GE makes, such as jet engines and wind turbines.
“We have steam turbines that are as old as 82 years,” Parris says. “Because if you replace the parts and you keep it going, nobody cares how it looks. It just delivers value.”
On the other hand (or hat), Parris is involved in taking that digital twin and AI technology that GE Digital develops for use within GE, and packaging and selling that technology to other companies to help them drive more efficiency and value out of their investments in physical infrastructure.
“Inserting AI into those streams is the thing I spend most of my time here doing,” says Parris, who has a PhD in electrical engineering from UC Berkeley and joined GE about six years ago after two decades with IBM. “I build products that GE Digital can then sell to other companies not in these industries, and they can do the very same thing: Use the AI to operate and do these services to capture value.”
General Electric, which was co-founded by Thomas Edison way back in 1892, is practically synonymous with electricity in the United States. So it should come as no surprise that GE Digital, which is one of 10 subsidiaries of GE, is heavily involved in electric grids, although not necessarily at the digital twin level.
According to Parris, 40% of the world’s electrons touch GE Digital software in some way, shape, or form. That includes GE Digital software for managing the transmission grid (i.e. high-voltage lines for long-distance transmission), software for managing the distribution grid (i.e. lower voltage lines handling the last-mile connections to homes and businesses), and software for managing electricity markets.
Considering the familiarity that the GE company has with electric grids and digital twin technology, one might think that GE Digital is better positioned than any other company to bring those two things together. And yet, that’s not happening at this point, at least at the macro level, according to Parris.
When asked about the recent grid failure in Texas and whether a digital twin could have better prepared Texas grid operators for the cold snap that left more than 4 million homes without power in February, Parris said that digital twins could have been used either to design a better system or in a real-time manner to operate the grid more efficiently.
However, the application of digital twin technology to large, complex systems like an electric grid is not as common as you might think, Parris said.
Complex System of Systems
“Even though I talk about the twin of the entire network, most people don’t do that,” he said. “Most people buy twins for where they have problems, or they’ll have a twin on one thing they’re trying to optimize. Very few people are trying to use twins, because it’s a new technology, to optimize systems of systems.”
That’s not to say that digital twins cannot be used to optimize these complex systems of systems, Parris said. Such an application can be built. But someone attempting to build such a digital twin application runs into some practical considerations for which there are no easy solutions.
“You can do a system of systems,” he told Datanami. “[But] the two things you’ve got to ask yourself is, have I collected data to do it, first of all, and can that data actually be merged so I can aggregate it so the twin can use it? In many cases people will say, Yeah, I’ve got data on turbines. I’ve got a lot of data on combined cycle plants. Can you integrate that so that a twin can pull data to do what it needs to do? And in many cases, the answer is no, I have not.”
Just getting one’s hands on the right data is a big challenge, because the raw data required to feed the AI model underlying the digital twin is spread across multiple systems, Parris said. It could reside in the manufacturing execution system (MES) application, the enterprise resource planning (ERP) application, or a separate system designed for managing repairs.
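The aggregation hurdle Parris describes can be made concrete with a short sketch. Everything here is hypothetical: the record shapes, the `asset_id` join key, and the field names are assumptions for illustration, not GE Digital’s actual schemas.

```python
# Hypothetical sketch: folding asset records scattered across an MES, an
# ERP system, and a repair-tracking system into one feed a digital twin
# could consume. The schemas and the "asset_id" join key are invented.

def merge_asset_records(mes, erp, repairs):
    """Join per-asset records from three systems on a shared asset ID.

    Returns one dict per asset. Assets missing from a source simply lack
    that source's fields -- mirroring the real-world problem that the
    data often cannot be fully aggregated.
    """
    merged = {}
    for source_name, records in (("mes", mes), ("erp", erp), ("repairs", repairs)):
        for rec in records:
            asset = merged.setdefault(rec["asset_id"], {"asset_id": rec["asset_id"]})
            for key, value in rec.items():
                if key != "asset_id":
                    # Prefix fields by source so colliding names survive the merge.
                    asset[f"{source_name}_{key}"] = value
    return list(merged.values())

# Example: a turbine known to the MES and ERP, but with no repair history.
mes = [{"asset_id": "T-100", "runtime_hours": 61_300}]
erp = [{"asset_id": "T-100", "install_year": 1984}]
repairs = []

combined = merge_asset_records(mes, erp, repairs)
print(combined[0])
```

Even this toy version shows why the answer is so often “no, I have not”: the join only works if every system agrees on a common asset identifier, which in practice is rarely the case.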
The cost of building such a digital twin model, and the perceived payback, did not match up, according to Parris. But that may change.
“I’ll bet you they try to do it now, because the bill is $16 billion. Once the bill becomes $16 billion, now you can turn back around and say, guess what? We think we can pull out half a billion dollars to figure out how we don’t get another $16 billion bill.”
The actual cost in Texas was higher than that. CBS News reported that the damage could exceed $200 billion, which is more than the state government’s annual budget, and more than 110 people are said to have died, mostly from hypothermia. But Parris’s point about addressing costs is still valid.
Renewables in the Mix
The challenges associated with modeling a grid are greater than the recent failures in Texas would have us believe. In Texas, the grid was largely dependent on traditional energy sources, such as coal, natural gas, and nuclear, which failed in the single-digit temperatures to varying degrees. Despite significant investments in renewables–primarily wind turbines–the grid operator in Texas was not relying on renewables to get through that February cold snap.
That makes last September’s grid failure in California more apropos to the current challenge. In California, rolling blackouts were ordered to prevent wider blackouts during a heatwave that sent temperatures soaring past 110 degrees. A dome of high pressure over much of the West not only led to calm air, which significantly cut the output from wind power, but it also maxed out other states’ generating capacity, eliminating the ability to buy out-of-state power (which was cited as a major fault in Texas’ grid design). When a large solar plant went offline, the reserve supplies in California were insufficient to meet demand, which led to blackouts.
The challenge for California–and the opportunity for digital twins–revolves around how the state navigates its energy mix for the future. California’s legislature has mandated that the state’s electric grid run on 100% carbon-free energy by 2045. The state is actively moving toward that goal, and the looming 2025 retirement of the state’s last nuclear power plant, Diablo Canyon, which currently supplies about 10% of the state’s power, will be a big step toward achieving it.
Digital twins could help guide decision-making as California completes its transition to 100% renewables, according to Parris, who points out that GE Digital is working with Southern California Edison, one of the state’s three largest investor-owned utilities, to help model its operations. However, the mix of renewables in the Golden State, not to mention Gov. Gavin Newsom’s ban on gasoline- and diesel-powered cars starting in 2035, will make it much harder to find a balance than in the Lone Star State.
“It’s not just the heating [and cooling] of the buildings, but the cars,” Parris says. “It will be more distributed energy resources, like EVs [electric vehicles]. How do I bring them in? They add another complexity, because I don’t know when you’re going to charge your EV. I don’t know how much you’re going to use your car.”
Backers of renewable energy are banking on large battery plants being able to handle short-term spikes in energy demand that have traditionally been handled by natural gas-powered “peaker” plants in California. But grid-scale battery technology is still unproven, and it also introduces more variables into the grid equation that will have to be accounted for.
“How long does that battery live [is] based on how often you charge and discharge it, so the life of the battery is a factor,” Parris says. “So there’s complexity of temperature. There’s the complexity of the maintenance cost. There’s the complexity of the longevity of the battery. All of this has to be factored in and that’s where AI comes into play.”
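One way to picture the battery bookkeeping Parris lists (cycles, temperature, longevity) is a toy degradation model. The linear fade model below and every constant in it are illustrative assumptions, not a real grid-battery model.

```python
# Hypothetical sketch: a battery's remaining life depends on how often it
# is cycled and at what temperature. The linear degradation model and its
# constants are invented for illustration only.

def remaining_capacity(rated_kwh, cycles, avg_temp_c,
                       fade_per_cycle=0.0002, temp_penalty=0.00001):
    """Estimate usable capacity after `cycles` full charge/discharge cycles.

    Capacity fades a small fraction per cycle, and cycling above 25 C
    adds an extra per-cycle penalty proportional to the excess heat.
    """
    per_cycle_fade = fade_per_cycle + temp_penalty * max(0.0, avg_temp_c - 25.0)
    fraction_left = max(0.0, 1.0 - per_cycle_fade * cycles)
    return rated_kwh * fraction_left

# A 100 kWh pack cycled daily for three years in hot weather:
print(f"{remaining_capacity(100.0, cycles=3 * 365, avg_temp_c=35.0):.1f} kWh left")
```

A real twin would replace these made-up constants with parameters fitted from field data, which is exactly the kind of tuning Parris describes below.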
Today’s grid operators likely have a physics model of their grid that, to some extent, can replicate the physical forces at play in their electrical system. But these models lack the additional data inputs that would be required to model what the grid actually faces on a day-to-day basis, Parris says.
“The digital twin we use at GE is a combination of physics model and actual data,” Parris says. “The real world is changing so fast that what you’ve got to do is use real world data to tune some of the parameters in that physics model so you get as close to reality as possible.”
In addition to physics events, like the rate of degradation of a combined-cycle plant, the digital twin must also incorporate other data points, like the weather. Perhaps the area has suffered from intense Santa Ana wind events that have spread contaminants across the area. How does that impact the grid?
“All of that needs to be modeled as well,” Parris says. “That’s why your digital twin has to increase fidelity by adding AI and that kind of data to the physics data to get better and better. That, I know they don’t have.”
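A minimal sketch of that physics-plus-data tuning, under invented assumptions: the “physics model” here is a plant whose output falls linearly with operating hours, and field measurements are used to fit its degradation rate by least squares. Neither the model form nor the numbers come from GE.

```python
# Hypothetical sketch of the hybrid approach: keep the physics model's
# form fixed, and tune one of its parameters against observed data so the
# twin tracks reality. Model and constants are illustrative assumptions.

def predicted_output(rated_mw, hours, degradation_per_hour):
    """Physics-style model: output falls linearly with hours run."""
    return rated_mw * (1.0 - degradation_per_hour * hours)

def tune_degradation(rated_mw, observations):
    """Least-squares fit of the degradation rate from (hours, observed_mw)
    pairs. Minimizing sum((R - mw - R*d*h)^2) over d gives the closed form
    d = sum(h*(R - mw)) / sum(R*h^2)."""
    num = sum(h * (rated_mw - mw) for h, mw in observations)
    den = sum(rated_mw * h * h for h, mw in observations)
    return num / den

# Field measurements (hours run, MW observed) from a hypothetical 500 MW plant:
obs = [(1_000, 497.5), (5_000, 487.5), (10_000, 475.0)]
rate = tune_degradation(500.0, obs)
print(f"fitted degradation: {rate:.2e} per hour")
```

The design point is the one Parris makes: the physics model supplies the structure, and the real-world data supplies the parameter values, so the twin’s fidelity improves as more measurements arrive.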