Boulder researcher probes facial recognition technology as hail forecasting tool

NCAR machine learning scientist leads ongoing effort

Field manager Kyle Johnson looks over hundreds of hail damaged crops, including tomatoes, zucchini and cucumbers, on June 20 at Kilt Farm between Longmont and Niwot.

Since the dawn of time, people have tried to gauge the potential of incoming weather simply by looking at storm clouds gathering on the horizon. Far more precision is now being brought to the task.

Thanks to research led by a scientist at Boulder’s National Center for Atmospheric Research, forecasters may soon be able to better gauge the potential of hail storms by putting to use the same artificial intelligence technology typically found in facial recognition systems.

Rather than focusing on human faces, researchers are developing a deep learning model known as a convolutional neural network that can recognize the features of specific storms which determine whether hailstones form and how large they grow, characteristics that have historically been difficult to gauge.

The results of the research team, which is led by NCAR machine learning scientist David John Gagne, have been published in the American Meteorological Society’s Monthly Weather Review. They underscore the importance of considering a storm’s entire structure, something that has been difficult to do with traditional hail forecasting techniques.

“One of the things we got out of this project is, the shape of the storm is really important as to whether it can produce large hail or small hail,” Gagne said on Wednesday.

“Within the class of supercells, ones with a more west-to-east orientation tend to be more large hail producers than ones with a more north-to-south” orientation, he said.

“They both can (produce large hail), but one is more likely to. And that kind of information can help forecasters issue a more strongly worded warning. If you look at the computer simulations for a given day, you will see more (potential from) these kinds of storms, and forecasters will issue a higher probability of hail for a given day. We can also reduce false alarms, and not be warning people who don’t need to be warned on a given day.”

AI’s ‘engine’

For the new study, Gagne employed a type of machine learning model designed to analyze visual images. According to a news release, he trained the model using images of simulated storms, along with information about temperature, pressure, wind speed, and direction as inputs and simulations of hail resulting from those conditions as outputs. The weather simulations were created using the NCAR-based Weather Research and Forecasting model.
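At the heart of a convolutional neural network is the convolution operation: a small filter slid across a gridded field to detect spatial patterns, such as a storm's orientation. The sketch below is purely illustrative, using made-up values rather than NCAR data or Gagne's actual model, but it shows how a single filter responds most strongly where a "storm" band aligns with the pattern the filter encodes:

```python
# Illustrative toy only: one 2D convolution pass, the core operation a
# convolutional neural network applies to gridded storm fields.
# The 5x5 "radar image" and the diagonal filter are invented values.
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "storm" grid: high reflectivity along a southwest-to-northeast band.
storm = np.array([
    [0, 0, 0, 1, 2],
    [0, 0, 1, 2, 1],
    [0, 1, 2, 1, 0],
    [1, 2, 1, 0, 0],
    [2, 1, 0, 0, 0],
], dtype=float)

# A filter that responds to that diagonal orientation.
diag_filter = np.array([
    [0, 0, 1],
    [0, 1, 0],
    [1, 0, 0],
], dtype=float)

response = convolve2d(storm, diag_filter)
print(response.shape)  # (3, 3)
print(response.max())  # largest where the band aligns with the filter
```

A real model stacks many such filters in successive layers, learning their values from data; here the filter is hand-written so the orientation sensitivity Gagne describes is visible directly.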

Then, the machine learning model determined which features of a storm are correlated with whether or not it hails and with the size of hailstones. Once the model was trained and had demonstrated that it could make successful predictions, Gagne examined which aspects of the storm the model’s neural network considered most important.

He used a technique that essentially ran the model backward to pinpoint the combination of storm characteristics that would need to come together to give the highest probability of severe hail.
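In spirit, "running the model backward" resembles gradient ascent on the model's input: the trained weights are held fixed while a candidate input is nudged, step by step, toward whatever pattern maximizes the predicted hail probability. The toy below stands in for the real network with a two-feature logistic model, and the feature names are invented for illustration:

```python
# Sketch of input optimization ("running the model backward").
# The two-weight logistic model is a stand-in, not the study's network.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Pretend these weights were learned: the model "likes" feature 0
# (say, updraft strength) and "dislikes" feature 1.
w = np.array([2.0, -1.0])
b = -0.5

def hail_probability(x):
    return sigmoid(w @ x + b)

# Start from a neutral input and climb the probability gradient,
# adjusting the input itself rather than the (frozen) weights.
x = np.zeros(2)
lr = 0.5
for _ in range(200):
    p = hail_probability(x)
    grad = p * (1 - p) * w       # d sigmoid(w·x + b) / dx
    x += lr * grad
    x = np.clip(x, -3, 3)        # keep the synthetic input plausible

print(np.round(x, 2))            # pushed toward high x[0], low x[1]
print(hail_probability(x) > 0.9)  # True
```

The optimized input reveals what the model has learned to treat as a hail-favoring storm, which is the kind of insight Gagne extracted from the full neural network.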

Gagne, who is working with several fellow researchers both at NCAR and the University of Oklahoma, offered an analogy to explain the concept of machine learning.

“AI is like a whole car, the whole system,” he said. “Machine learning is sort of the engine that drives the car, and the data are the fuel that you put into the engine, that the engine turns into whatever decision you need to make, or outcome you want — which is pushing the car forward.”

The research, building on Gagne’s doctoral work several years ago at Oklahoma, was supported by the National Science Foundation, which is NCAR’s sponsor.

“Hail — particularly large hail — can have significant economic impacts on agriculture and property,” Nick Anderson, an NSF program officer, said in a statement. “Using these deep learning tools in unique ways will provide additional insight into the conditions that favor large hail, improving model predictions. This is a creative, and very useful, merger of scientific disciplines.”

Billions in damage

Better forecasting techniques for severe hail events could be of great interest to homeowners, business owners, insurers and more up and down Colorado’s Front Range and beyond, as damaging storms have caused billions of dollars in damage in just the past few years.

According to the Rocky Mountain Insurance Information Association, a single storm in the Denver metro area on May 8, 2017, caused $1.4 billion in damage.

The Front Range is in the heart of what the association calls “Hail Alley,” with the highest frequency of large hail in North America and most of the world, where residents can count on three or four catastrophic hailstorms — defined as causing at least $25 million in insured damage — every year. In the past 10 years, according to the association, hailstorms have caused more than $5 billion in insured damage in Colorado.

Gagne said the high dollar figures associated with Front Range storms are likely more a factor of there being more valuable property in a storm’s path, be it larger and more expensive homes or the mushrooming number of cars on the highways, than a case of the storms themselves being more severe.

And a bad storm, he noted, might not be the most damaging if it strikes an unpopulated area. For example, Colorado on Aug. 13 reported its largest hailstone on record, a specimen at least 4.83 inches in diameter (meaning it was likely larger when it hit the ground), just west of Burlington in tiny Bethune, which, with a population of about 230, would not report the same level of damage that would likely occur if the same storm hit downtown Boulder or Longmont.

“In terms of climate change impacts on hail, there are studies that have been looking at that problem, and it’s a hard signal to tease out from other factors,” Gagne said. “From studies looking at what will happen over the next 100 years, the best we can see is that we’ll probably have fewer hail storms, but the ones we do have will be more severe.”

Gagne said one byproduct of his research is that NCAR has secured a $731,000 grant from the National Weather Service to develop a machine learning storm-type detector, to help forecasters better determine where a supercell storm, a likely generator of hail, may develop, versus a squall-line storm, which is less likely to bring hail.

“I’m also working with other colleagues on applying similar kinds of techniques to hurricane predictions, tornado prediction, and a better understanding of thunderstorms at a time scale from now-casting, to seasonal and sub-seasonal prediction,” Gagne said.
