Among the things that the weather nitwits on television (but I repeat myself) like to highlight are how cold or how hot it "feels like" based on heat indices and wind chills, rather than the actual air temperature.
Now, of course wind feels colder than still air, and humid heat feels hotter than dry air. The idea that they can be quantified, let alone to the exact, er, degree claimed by the Meteorological Muppets on the air, though, is not entirely justified. The original experiment to measure how much colder it is when the wind blows was done in Antarctica in the early 1940s (and published in 1945). It consisted of measuring how much faster a container of water froze when it was exposed to wind than when it was not.
Things cool not by gaining cold but by losing heat. That heat often "hovers" near the object that has recently lost it and slows the rate at which surrounding cold air continues to absorb it. But wind moves the warmed air along and exposes the object to the cold more directly. The original experiments were not the best measures of the actual rate of cooling, though, so others were done, and from them we get all of the "feels like" language, which is a silly standard by which to judge. Some people "feel like" a room is "freezing" when it's 65 degrees. Plus, the difference between, say, zero degrees and minus 15 degrees is technically known as "too frickin' cold either way so who cares," so who cares?
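For the curious, the modern "feels like" number for cold comes from the wind chill formula the National Weather Service adopted in 2001, which estimates heat loss from exposed skin as a function of air temperature and wind speed. A minimal sketch of that calculation, in Python, with a function name and the example numbers of my own choosing, looks like this:

# A minimal sketch of the NWS (2001) wind chill formula.
# Inputs: temperature in degrees Fahrenheit, wind speed in mph.
# The index is only defined for temperatures at or below 50 F
# and wind speeds of at least 3 mph.

def wind_chill(temp_f, wind_mph):
    """Approximate 'feels like' temperature per the NWS wind chill index."""
    if temp_f > 50 or wind_mph < 3:
        raise ValueError("Wind chill is only defined for T <= 50 F and wind >= 3 mph")
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

# At zero degrees with a 15 mph wind, it "feels like" roughly -19 F,
# which, per the above, is too frickin' cold either way.
print(round(wind_chill(0, 15)))  # -19

Whether that extra layer of precision changes anyone's decision to put on a coat is, of course, another matter.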
But it gives the channel chatterers another reason to say that you should remember to bring a coat with you if you leave the house. As if the single-digit temperature and your mom weren't already telling you that same thing.