To find out more about the podcast, see the episode "How scientists predict big winter storms."
Below is a short summary and detailed review of this podcast written by FutureFactual:
How Weather Models Forecast Winter Storms and Why Data Quality Matters
Shortwave dissects how modern weather models forecast a massive winter storm and why the data behind forecasts is as crucial as the models themselves. The show explains how computer simulations use a mix of models, weights, and probabilities to predict snow and wind days in advance, and why lead time has grown thanks to global observations of the Earth system. It also explores where the data comes from—weather stations, weather balloons, ships, satellites, and radar—and why continuous, decades-long data records are essential to understanding extremes. The episode warns that proposed budget and staff cuts to NOAA, NASA, and other agencies could undermine the data backbone of forecasting, potentially reducing future lead times and forecast accuracy.
Overview: The winter storm and forecast lead time
In this episode of Shortwave, Regina Barber and climate reporter Rebecca Hersher explore a winter storm that stretches across a large portion of the United States and the days-long lead time that modern forecasts provided. The hosts highlight how people were warned days in advance, a stark contrast to past decades when such warnings weren't as feasible. The discussion centers on the idea that the ability to forecast ahead is not simply a matter of better software, but of the data that feeds the models and the infrastructure that stores and distributes it. The episode also introduces the tension between scientific capability and policy, noting that data and forecasting depend on sustained public funding for agencies like NOAA and NASA. The result is a narrative about trust, preparation, and the evolving science of forecasting.
"The fact that we're talking about an event in New York City where I am, right, that's happening in a few days from now, you know, that wasn't something we could do. 50 years ago." - Regina Barber
How Weather Models Predict the Weather
The program breaks down the core idea behind weather forecasting: multiple computer models simulate the atmosphere, each with strengths and weaknesses across different scales and weather phenomena. The ensemble approach—averaging or weighting across models—helps meteorologists estimate likely outcomes and their probabilities. The more accurate the models and the data feeding them, the more reliable the forecasts. Hersher explains that advances in computer modeling, guided by vast data streams from around the world, enable predictions days in advance that simply weren't possible half a century ago. The segment also touches on what makes a model effective: sufficiently fine spatial resolution, accurate physics, and robust data assimilation that reconciles model outputs with real-world observations.
"The better the models are, the better the weather forecast is going to be." - Rebecca Hersher
Data Requirements: Plentiful, Global, Continuous
A central point of the discussion is the aphorism garbage in, garbage out. The reliability of forecasts hinges on high-quality data: plentiful measurements from many sources, coverage across space and time, and continuous records that extend over decades to reveal patterns in extreme weather. The atmosphere’s complexity demands data from weather stations, balloons, aircraft, ships, satellites, and radar. The episode traces how Earth-observing satellites, beginning in the late 1970s and continuing today, have built the backbone of modern forecasting. The data needs are not just about quantity; they require global geographic reach and temporal continuity to detect trends and extreme events. Hersher notes that the quality of inputs directly shapes the credibility of the outputs—forecast confidence grows when the data foundation is solid.
"garbage in, garbage out, important for our health. Also true, I think of many fields of science, especially things where you have a large number of observations." - Rebecca Hersher
Data Infrastructure and Policy Headwinds
The episode then pivots to policy and funding, describing how much of this critical data lives in publicly funded systems managed by government agencies. The narrative addresses budget and staffing pressures that could threaten data collection, balloon launches, NOAA's data centers, NASA's satellite programs, and even the National Center for Atmospheric Research. It explains that a reduction in funding or staff could disrupt the data streams feeding weather models, which in turn affects forecasts and public preparedness. The discussion implies that data availability is not simply a technical concern but a matter of national resilience, especially as the climate becomes more volatile and extreme weather more frequent.
"as the weather gets more and more extreme, it will be difficult to keep up this level of like Accurate early forecast if scientists and data are stymied in the ways that they could be if all of these cuts were to go through." - Regina Barber
Implications for the Future
In closing, the speakers warn that without sustained investment in data collection and model development, future forecasts could lose both lead time and accuracy. The conversation points to a broader question: how to balance budget realities with the need for rigorous, continuous Earth observation that supports weather prediction and climate research. The episode ends with an invitation to regulators, scientists, and the public to maintain a commitment to science-informed decision making, especially as communities plan for ever more variable weather.
"I would say that as the weather gets more and more extreme, it will be difficult to keep up this level of like Accurate early forecast if scientists and data are stymied in the ways that they could be if all of these cuts were to go through." - Regina Barber

