A new look at radiational cooling

If long-wave radiation can easily escape the area, we could see net cooling even if the sun is shining

I came across a weather article last week that was a purely academic discussion of radiational cooling. At first glance it seemed to be a discussion only for true weather geeks, meteorologists and university professors, with little if anything to do with an everyday understanding of the weather, at least for us lay people.

I read the article, tried my best to understand it, and then moved on, filing the information in the back of my mind. But as the week went on I had an “aha” moment, when some of the information in the article clicked with something I have noticed happening in our weather.

I’ll attempt to summarize what the article was discussing, hopefully without getting too boring or technical. Then I’ll go into my “aha” moment and see if you too have noticed this, and I’ll leave it to you to decide if this is a possible explanation.

The focus of the article was that there are apparently two definitions of just what radiational cooling is and, of course, the question of which one is correct. The first definition, the most common one, is more than likely the one most people would use: radiational cooling occurs when an object’s temperature decreases. To me, and to most people, this definition makes perfect sense. If an object is cooling and its temperature is decreasing, it is giving off, or radiating, its heat into the surrounding environment, hence the term radiational cooling.

So now you’re probably thinking: If the definition of radiational cooling is so simple and intuitive, then how can there be a second definition, and why argue about it? The second definition, I believe, has merit, but is a little more technical and therefore difficult to understand. Before I try to explain this second definition I have to first discuss the differences between shortwave radiation and long-wave radiation. When the sun shines on an object, the energy coming from the sun is referred to as solar radiation or shortwave radiation. The length of the wavelength has to do with the amount of energy available: the shorter the wavelength, the higher the energy level. When objects on Earth give off heat, they give it off in the form of long-wave radiation. The second definition of radiational cooling ties directly into this. It states that if the amount of long-wave radiation entering a region is less than the amount of long-wave radiation moving out of a region, then the net result is radiational cooling.
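To make that second definition concrete, here is a minimal sketch in Python. The flux values are assumed, round illustrative numbers, not measurements; the point is only that the sign of the long-wave balance, incoming minus outgoing, tells you whether a region is radiationally warming or cooling.

```python
# Minimal sketch of the second definition of radiational cooling.
# The flux values below are assumed, illustrative numbers
# (watts per square metre), not measurements.

def net_longwave(incoming, outgoing):
    """Net long-wave balance for a region: positive means radiational
    warming, negative means radiational cooling."""
    return incoming - outgoing

# A region receiving less long-wave radiation than it sends out:
balance = net_longwave(incoming=220.0, outgoing=315.0)
print(balance)  # negative, so by the second definition the region is cooling
```

Notice that nothing in this balance mentions the region's temperature; that is exactly why the two definitions can disagree.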

In the first definition, an object is warmed up by incoming shortwave radiation, and as long as its temperature is increasing there is no radiational cooling. In the second definition, an object or region can be warming up yet still be cooling radiationally, which sounds a little strange, doesn’t it? With this definition it sounds as though there might never be radiational warming, only cooling. What we have to remember is that warm air flowing into a region will warm that region up; however, that warming is not coming from shortwave radiation but rather from the heat given off by the warmer air, which is long-wave radiation. So there can be times when there is radiational warming and not just cooling.

Re-radiating

Now where does my “aha” moment fit into this? Have you ever noticed during the spring melt that on some days the temperature will be above 0 C and the snow will be melting everywhere, even in the shade, while on other days, even though the temperature is the same, melting seems to occur only in the sunny areas and not in the shade? I know a number of factors can influence this, such as overnight temperatures, wind speed and so on, but I have noticed this happening even when those factors have not really come into play. The cause, I believe, is radiational cooling.

An interesting thing about snow and ice is that they are very good at absorbing long-wave radiation. That is why you may have heard the phrase “Fog eats snow.” It’s not that the fog actually does something to the snow; rather, the fog is absorbing, then re-radiating, long-wave radiation back to the ground — and this long-wave radiation is helping to melt the snow. If atmospheric conditions are such that long-wave radiation can easily escape or leave an area, that area will experience radiational cooling even if the sun is shining and the temperatures are warming up. We can see this in the snow melt, usually when the temperatures are in the +1 to +7 C range. Even though the air is warm enough to melt the snow, in the shade very little if any will often melt because that snow is cooling radiationally — something to think about.
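The “fog eats snow” idea can be sketched with the Stefan-Boltzmann law, which gives the long-wave energy a body emits at a given temperature. This is only a rough illustration under assumed values: the emissivities and the clear-sky downwelling figure are assumed round numbers, but they show why snow loses far less long-wave energy under fog than under a clear sky.

```python
# Rough Stefan-Boltzmann sketch of "fog eats snow".
# Emissivities and the clear-sky flux are assumed values for illustration.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_flux(temp_c, emissivity=1.0):
    """Long-wave flux emitted by a body at temp_c degrees Celsius."""
    return emissivity * SIGMA * (temp_c + 273.15) ** 4

snow_out = emitted_flux(0.0, 0.99)  # snow at the melting point, roughly 313 W/m^2
fog_in = emitted_flux(0.0, 0.98)    # fog near 0 C re-radiates almost like a blackbody
clear_in = 220.0                    # assumed clear-sky downwelling flux, W/m^2

print(fog_in - snow_out)    # near zero: little net long-wave loss under fog
print(clear_in - snow_out)  # strongly negative: radiational cooling under clear sky
```

Under fog the balance is nearly even, so the snow keeps the warmth the air gives it; under a clear sky the snow radiates away far more than it receives, which is why shaded snow can refuse to melt on a mild day.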

About the author

Co-operator contributor

Daniel Bezte

Daniel Bezte is a teacher by profession with a BA (Hon.) in geography, specializing in climatology, from the U of W. He operates a computerized weather station near Birds Hill Park.
