How Satellites and Big Data Are Predicting the Behavior of Hurricanes and Other Natural Disasters
On Friday afternoons, Caitlin Kontgis and some of the other scientists at Descartes Labs convene in their Los Alamos, New Mexico, office and get down to work on a grassroots project that’s not part of their jobs: watching hurricanes from above, and seeing if they can figure out what the storms will do.
They acquire data from GOES, the Geostationary Operational Environmental Satellite operated by NOAA and NASA, which records images of the Western Hemisphere every five minutes. That’s about how long it takes the team to run each image through a deep learning algorithm that detects the eye of a hurricane and centers the image over it. Then they incorporate synthetic aperture radar data, which uses long-wave radar to see through clouds and can discern the water beneath based on reflectivity. That, in turn, can show near-real-time flooding, tracked over days, of cities in the path of hurricanes.
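The centering step described above can be sketched as a simple crop around a predicted eye location. This is purely illustrative, not Descartes’ code: in practice the eye coordinates would come from the trained detector, while here they are hard-coded.

```python
import numpy as np

def center_on_eye(image, eye_row, eye_col, half_size=64):
    """Crop a window centered on the detected hurricane eye,
    clamping so the window stays fully inside the image."""
    rows, cols = image.shape
    size = 2 * half_size
    top = min(max(eye_row - half_size, 0), rows - size)
    left = min(max(eye_col - half_size, 0), cols - size)
    return image[top:top + size, left:left + size]

# Stand-in for a single-band GOES frame; a real pipeline would
# get the eye location from the deep learning model.
frame = np.zeros((500, 500))
crop = center_on_eye(frame, eye_row=250, eye_col=400)
print(crop.shape)  # (128, 128)
```

Clamping at the edges matters because a storm near the image border would otherwise produce a smaller, off-center crop that the downstream model can’t use.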
“The goal of these projects … is really to get data into the hands of first responders and people who are making decisions and can help,” says Kontgis, lead applied scientist at Descartes.
Hurricane Harvey, for example, unexpectedly flooded large parts of Houston despite abating wind speeds. That storm inspired Descartes scientists to build the program they now use, though they were too late to apply that data to recovery efforts. While Descartes Labs has been in touch with FEMA and other organizations, there’s no official use for the data they’re collating.
The work with hurricanes is not part of Descartes’ main business, which consists of using similar machine learning to assess food supply chains, real estate and more. For example, Descartes can look at satellite data of agriculture in Brazil, Argentina, and China, and make predictions on global corn yields and prices. Or it can assess construction rates and estimate land value. But the group can leverage the same technology to examine hurricanes and other natural disasters, and plans to incorporate additional information to the algorithm in the future, like hurricane size, wind speed, and even land elevation to better predict flooding.
Descartes is just one of numerous agencies, companies and research groups trying to leverage big data and machine learning for hurricane prediction, safety and awareness. Success could mean diminished damages — economic and human — in the face of worsening climate-induced storms, or at least increased options to mitigate those damages.
Predicting where a hurricane will go is a well-established practice, says Amy McGovern, a professor of computer science at the University of Oklahoma. McGovern studies the use of AI in decision making about thunderstorms and tornadoes, but not hurricanes, for that reason. But she says there are still a lot of factors in hurricanes that are difficult to predict. Where they’ll land may be predictable, but what will happen once they get there is another story; hurricanes are well known for fizzling out or ramping up just prior to landfall.
Even with neural networks, large-scale models all make use of certain assumptions, thanks to a finite amount of data they can incorporate and a nearly infinite number of potential types of input. “This makes it all a challenge for AI,” says McGovern. “The models are definitely not perfect. The models are all at different scales. They’re available at different time resolutions. They all have different biases. Another challenge is just the sheer overwhelming amount of data.”
That’s one of the reasons so many scientists are looking to AI to help understand all that data. Even NOAA is getting on board. They’re the ones who operate the GOES satellites, so they’re inundated with data too.
So far, NOAA scientists are using deep learning as a way to understand what data they can obtain from their images, especially now that the new GOES-16 can sense 16 different spectral bands, each providing a different glimpse into weather patterns, resulting in an order of magnitude more data than the previous satellite. “The processing of the satellite data can be significantly faster when you apply deep learning to it,” says Jebb Stewart, informatics and visualization chief at NOAA. “It allows us to look at it. There’s a fire hose of information… when the model is creating these forecasts, we have a different type of information problem, being able to process that to make sense of it for forecasts.”
NOAA is training its computers to pick out hurricanes from its satellite imagery, and eventually will combine that with other layers of data to improve probabilistic forecasts, which will help the Navy, commercial shipping companies, oil rigs and many other industries make better decisions about their operations.
NASA, too, is using deep learning to estimate the real-time intensity of tropical storms, developing algorithmic rules that recognize patterns in the visible and infrared spectrums. The agency’s web-based tool lets users see images and wind speed predictions for live and historic hurricanes based on GOES data.
Once we can expect computers to reliably spot hurricanes, we need a way to translate that to something people can understand. There’s a lot more information available than just wind speed, and making sense of it can help us understand all the other ways hurricanes affect communities. Hussam Mahmoud, associate professor of civil and environmental engineering at Colorado State University, has looked extensively at the factors that make some hurricanes more disastrous than others. Primary among them, he says, are where those storms make landfall, and what, or who, is waiting for them when they get there. It’s not surprising to suggest that a hurricane that strikes a city will do more damage than one that hits an unoccupied coast, but one that hits an area prepared with sea walls and other mitigating factors will have a diminished impact as well.
Once you know what sort of damage to expect, you can be better prepared for the challenges to cities, like crowding in hospitals and school shutdowns, and you can be more certain whether evacuation is necessary. But then there’s the problem of communication: Currently, hurricanes are described by their wind speed, placed in categories from 1 through 5. But wind speed is only one predictor of damage. Mahmoud and his collaborators published a study last year in Frontiers in Built Environment about an assessment called the Hurricane Impact Level.
“We wanted to do something where we can communicate the risk in a better way, that includes the different possibilities that this hazard might bring,” says Mahmoud. “The storm surge would be very important, how much precipitation you have is very important, and how much wind speed.”
The project incorporates data from recent storms — wind speed, storm surge and precipitation, but also location and population — and applies a neural network to them. Then it can train itself, estimating, for example, if a hurricane should make landfall in X location, with wind speed Y, storm surge Z, etc., the damage would probably be of a particular level, expressed in economic cost. It compares inputs from NOAA records, census data and other sources from real storms, and gives a damage level that is similar to what occurred in those storms. Mahmoud’s team tried it for real, and over the last two years, the model has given accurate estimates for hurricanes that made landfall.
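To illustrate only the input-to-output shape of such a model — hazard and exposure variables in, a damage level out — here is a toy one-neuron forward pass. Every weight and normalization constant below is invented for illustration; none of these numbers come from Mahmoud’s published study, which uses a trained neural network.

```python
import math

def toy_damage_estimate(wind_mps, surge_m, precip_mm, population_k):
    """Toy 'model': a weighted sum of roughly normalized hazard
    inputs, squashed by a sigmoid into a 0-1 damage level.
    Weights are made up; a real model learns them from storm records."""
    x = [wind_mps / 80.0, surge_m / 6.0, precip_mm / 500.0, population_k / 1000.0]
    w = [1.2, 1.5, 0.8, 1.0]  # invented weights for illustration
    b = -1.5                  # invented bias
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))  # damage level in (0, 1)

mild = toy_damage_estimate(wind_mps=30, surge_m=0.5, precip_mm=50, population_k=100)
severe = toy_damage_estimate(wind_mps=70, surge_m=4.0, precip_mm=400, population_k=900)
print(mild < severe)  # True
```

The point of the sketch is the design Mahmoud describes: surge, precipitation and population enter alongside wind speed, so two storms with identical winds can receive very different damage levels.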
“If we can do that, maybe then we can, first of all, understand the magnitude of the damage that we’re about to experience because of a hurricane, and … use it to issue evacuation orders, which have been one of the main issues with hurricane mitigation and response,” says Mahmoud.
Mahmoud’s proposed system hasn’t been rolled out yet, but he’s in talks with The Weather Channel, which he calls early stage, but promising.
The Weather Company (The Weather Channel’s parent company) is already using its subsidiary IBM’s PAIRS Geoscope big data platform to forecast power outages and thus prepare better disaster response in the wake of hurricanes. The inputs for the system come not just from weather satellites, but from utility network models and power outage history. These predictions, too, will benefit from adding more and more sources of data, including soil moisture, which can help predict tree falls.
The amount of data available is growing extremely fast, and so is our ability to process it, an arms race pointing to a future of expanding accuracy and probabilistic hurricane forecasting that will help storm preparedness around the world.
#Alder, Mountaineer, and Moses Fires; lat, lon: 36.220, -118.620 #EdenFire; lat, lon: 36.410, -118.740; 1718 acres #CAfire pic.twitter.com/B2ZwfmxJiv
— Wildfire Signal (@wildfiresignal) November 27, 2018
Descartes Labs has another project in the works, too, unrelated to hurricanes except that it leverages similar technology on another natural disaster — wildfires. When California’s Camp Fire broke out in early November, a Twitter bot called @wildfiresignal sprang to life. Built by the same team from Descartes, @wildfiresignal scans GOES-16 data every six hours for smoke plumes and tweets side-by-side optical and infrared images of the fire. Infrared information can show the heat of the fire, which can help visualize its location just as the blaze is beginning, or at night when smoke is hard to see. This could help firefighters or residents plan escape routes as the fire approaches them, but, as with the hurricane project, collaborations with firefighters or national forests are preliminary.
“If we could have an alert system globally where you knew when a fire started within ten minutes after it started, that would be spectacular,” says Descartes CEO Mark Johnson. “We’re still probably a ways away from that, but that’s the ultimate goal.”