How Wind-Chill Is Actually Calculated Will Make You Question Everything This Winter

This week, the Midwestern United States is being hit by extreme cold, with meteorologists reporting that some cities could see wind-chills of minus 50°F. Amidst so much chatter about this extreme weather, you may be wondering what exactly the difference is between wind-chill and temperature, since both measures are frequently cited in weather forecasts. As it turns out, the difference between the two is probably not what you'd expect.

As Vox reported, meteorologists commonly use wind-chill to describe what it feels like outside when the wind is blowing: forecasters state the actual temperature and then share a revised figure, usually a lower one, that accounts for the wind-chill. However, as Vox pointed out, this practice is somewhat misleading and doesn't reflect what the term "wind-chill" actually means.

The outlet reported that wind-chill was a metric originally developed in the 1940s to assess how quickly the wind causes objects to lose heat. The modern wind-chill index, which has been revised but is based on this initial principle, is actually focused on someone's risk of developing frostbite, not on how the temperature feels outside, Vox noted. "It [wind-chill] was developed solely to assess the risk of frostbite on unclothed parts of the body," Krzysztof Blazejczyk, a Polish researcher who studies human thermodynamics, told Vox.
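For the technically curious, the math behind the modern index is public: the National Weather Service's 2001 revision computes wind-chill from the air temperature in °F and the wind speed in mph. Below is a minimal Python sketch of that published formula; the function name and the input check are our own additions, not part of the NWS specification.

```python
def wind_chill_f(temp_f: float, wind_mph: float) -> float:
    """Wind-chill (°F) per the 2001 NWS-revised formula.

    The formula is only defined for air temperatures at or below 50°F
    and wind speeds of at least 3 mph.
    """
    if temp_f > 50 or wind_mph < 3:
        raise ValueError("defined only for T <= 50°F and wind >= 3 mph")
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v
```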

For example, if it's 31°F outside and the wind-chill is 26°F, this simply means that unclothed skin loses heat as rapidly as it would if it were actually 26°F outside and there was no wind blowing. In other words, the wind-chill tells you how fast your skin will reach the air temperature, but your skin will never actually get colder than the air temperature, as Slate noted. Crucially, this means that, in order to actually get frostbite, the actual air temperature, not the wind-chill, has to be 32°F (freezing) or below, Slate pointed out.
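The example's numbers are consistent with the formula sketched above: a wind of roughly 5 mph at 31°F yields a wind-chill near 26°F. (The 5 mph figure is our own back-calculation chosen to make the example line up; it doesn't come from the reporting.)

```python
# Air temperature 31°F with ~5 mph of wind gives a wind-chill near 26°F,
# matching the example above (the wind speed is a hypothetical input).
print(round(wind_chill_f(31, 5)))  # -> 26
```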

As Mental Floss noted, critics of the wind-chill metric emphasize that it causes confusion because it's widely used yet often misinterpreted, both by meteorologists and by members of the public. Moreover, as the outlet noted, some experts believe that the wind-chill measure rests on unrealistic assumptions that make it less accurate. As Ethan Trex of Mental Floss described, calculations for the modern wind-chill index assume that "your exposed face is roughly five feet off the ground, it's night, and you're walking directly into the wind in an open field at a clip of about 3 mph," which is a pretty uncommon scenario.

Vox noted that, because of these questionable assumptions and the confusion it creates, some experts have suggested doing away with wind-chill in favor of measures that actually reflect what the weather feels like outside, such as the Universal Thermal Climate Index (UTCI). However, as Mental Floss pointed out, the fundamental notion behind wind-chill, that strong wind causes skin to cool off faster and hence become prone to frostbite more quickly, is sound. Wind-chill can therefore still be a useful metric for assessing your frostbite risk when the outside temperature is below freezing.

For those enduring very cold temperatures this week, hopefully taking a closer look at the difference between wind-chill and actual temperature will offer further insight into meteorologists' reports about this extreme cold spell.