Wednesday, September 7, 2022

Evaporative Cooling Vs. Compressor-Driven A/C

In this post I'm going to explain more about what I've learned concerning these two types of air conditioning.  In case new readers are wondering why I'm interested in evaporative cooling, it's because the technology is pretty easy to build yourself -- but there are definite limitations that come along with it.

The psychrometric chart shown below has been marked to illustrate the two different kinds of cooling.  I'll then discuss some interesting differences between them.

[Psychrometric chart marked with the two cooling paths]

I've drawn a solid line and a dotted line.  They both start at the same point, 32.2C (approximately 90F) and 36% relative humidity (RH).  That was the outside afternoon temperature at our house a week or two ago.   The solid line is drawn along a constant-enthalpy line, which just means that the total energy of the system remains constant.  Note that the relative humidity increases and so does the humidity ratio (basically, the amount of water in the air, shown on the right side of the chart).  This shows what's going on when evaporative cooling is taking place.  You might think that the air can't be cooled without a change in energy:  but the sensible heat removed from the air is balanced by the energy carried away by the water as it changes from a liquid to a gas.

To distinguish the two "forms" of heat, the energy contained by the air (oxygen, nitrogen and a small amount of carbon dioxide) is called "sensible heat".  Possibly because it can be "sensed" by a thermometer?  I haven't investigated the origins of the name, so that is just a guess.  And the energy contained by the water vapor is called "latent heat", because it only plays a role when the water evaporates or condenses.  Latent heat is a big deal in the A/C world because in humid climates it can be a substantial contributor to the energy (as in, coming out of a wall socket) needed to cool and condition air.

The humidity ratio increases during evaporative cooling because evaporating water is what cools the air, so the amount of water in the air goes up.  The latent heat therefore increases, balancing the sensible heat drawn out of the air, so the overall energy (enthalpy) remains constant.
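
To make this concrete, here's a minimal Python sketch of the quantities on the chart.  It uses the Magnus approximation for saturation vapor pressure and assumes sea-level pressure; the constants are rounded, so treat the results as approximate.  It computes the humidity ratio and enthalpy at our starting point of 32.2C and 36% RH.

import math

P_ATM = 101325.0  # total air pressure in Pa (sea level assumed)

def sat_vapor_pressure(t_c):
    # Saturation vapor pressure in Pa (Magnus approximation)
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(t_c, rh):
    # kg of water vapor per kg of dry air, given temperature (C) and RH (0-1)
    p_v = rh * sat_vapor_pressure(t_c)
    return 0.622 * p_v / (P_ATM - p_v)

def enthalpy(t_c, w):
    # Moist-air enthalpy in kJ per kg of dry air: sensible term plus latent term
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

w0 = humidity_ratio(32.2, 0.36)
print(f"humidity ratio: {w0:.4f} kg/kg")            # ~0.0108
print(f"enthalpy: {enthalpy(32.2, w0):.1f} kJ/kg")  # ~60 kJ/kg

Evaporative cooling moves us along the line where enthalpy(t, w) stays at that ~60 kJ/kg while w rises.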

The problem with so-called "swamp coolers" is that they are not very effective in humid climates, for two reasons.  First, as the humidity increases, the wet-bulb temperature increases, so the chiller can't deliver air that's much colder than what entered it.  Second, the chiller increases the relative humidity of the air that exits it.  This reduces our body's ability to cool itself via evaporative cooling, so our perception of comfort is reduced.
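
Here's a sketch, reusing the helper functions above, that walks down the constant-enthalpy line until the air saturates.  That saturation point is roughly the wet-bulb temperature, and therefore the coldest air a swamp cooler can deliver.  The simple step-down search is just my own quick approach, not a standard psychrometric routine.

def wet_bulb(t_c, rh, step=0.01):
    # Walk down the constant-enthalpy line until the air saturates
    h0 = enthalpy(t_c, humidity_ratio(t_c, rh))
    t = t_c
    while t > -20.0:
        # humidity ratio needed at temperature t to keep enthalpy at h0
        w = (h0 - 1.006 * t) / (2501.0 + 1.86 * t)
        if w >= humidity_ratio(t, 1.0):  # air can't hold more water: saturated
            return t
        t -= step
    return t

print(wet_bulb(32.2, 0.36))  # ~20.7C (69F): a useful amount of cooling
print(wet_bulb(32.2, 0.70))  # ~27.4C (81F): not much cooling at all

At our 36% RH the cooler could in principle deliver air around 21C, but at 70% RH the floor rises to about 27C -- barely below the starting temperature.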

Now let's move on to the dotted horizontal line, which shows what's going on when conventional compressor-driven cooling occurs.  The line follows a constant-humidity line because the amount of water in the air doesn't change.  Since there is no phase change, at least down to the dew point, the total energy in the air decreases:  the enthalpy decreases.  However, closer examination of the line shows that the relative humidity increases.  This is because cooler air has a reduced capacity to hold water vapor.  When the temperature reaches the dew point (at 15C/59F), the relative humidity reaches 100% and water starts to condense.   Condensing water releases a LOT of latent heat that the A/C must remove, so once that happens it suddenly takes a bigger unit to get the temperature to decrease.  The other factor that comes into play is our perception of comfort when humid air is cooled.  59F is pretty chilly, so let's say we just cool the air down to 68F (20C).  Our chart indicates that the air's relative humidity is now about 75%.  That's pretty humid, so we don't feel all that comfortable:  _our_ evaporative cooling -- sweating -- doesn't work well in humid air.  At the dew point our body can't cool itself at all via sweating, so 59F would actually feel very uncomfortable.  The other downside to high humidity is that it promotes the growth of mold and mildew, rusts steel parts, and so on.
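
We can check those numbers with the helper functions from the first sketch.  Sensible cooling keeps the vapor pressure (and humidity ratio) fixed, so the dew point and the RH at 20C fall right out:

def dew_point(p_v):
    # Invert the Magnus formula: the temperature where p_v would saturate
    x = math.log(p_v / 610.94)
    return 243.04 * x / (17.625 - x)

p_v = 0.36 * sat_vapor_pressure(32.2)   # vapor pressure doesn't change
print(f"dew point: {dew_point(p_v):.1f} C")                 # ~15.2C (59F)
print(f"RH at 20 C: {p_v / sat_vapor_pressure(20.0):.0%}")  # ~74%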

To improve the comfort level, most A/C systems deliberately cool the air below the dew point in order to force water to condense out.  The cool air exiting the A/C unit has a lower humidity ratio due to the condensation.  But now we have the reverse problem -- the air feels TOO cold for comfort.  How many of us have had the misfortune to be seated at a restaurant directly below an air conditioner vent?  Feels pretty cold, huh.  Well, it actually could be worse, because commercial systems use extra energy to deliberately WARM that cold air back up some.  More sophisticated A/C systems can recycle the heat they extract from the incoming air via a heat exchanger so the energy cost is lower:  but the cost of such an A/C system is higher.
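
To put a rough number on that dehumidification step, here's a sketch (same helpers as above) comparing the sensible and latent heat removed when our saturated air at the 15.2C dew point is cooled a further 3C.  The extra 3C is an illustrative choice of mine; values are per kg of dry air.

w_dew = humidity_ratio(15.25, 1.0)   # ~0.0108 kg/kg, saturated at the dew point
w_cold = humidity_ratio(12.0, 1.0)   # ~0.0087 kg/kg, saturated at 12C
sensible = 1.006 * (15.25 - 12.0)    # ~3.3 kJ/kg to cool the air itself
latent = (w_dew - w_cold) * 2501.0   # ~5.2 kJ/kg to condense the water
print(f"sensible: {sensible:.1f} kJ/kg, latent: {latent:.1f} kJ/kg")

Even for a modest 3C drop below the dew point, removing the latent heat costs more than the sensible cooling itself -- which is why condensation suddenly demands a bigger unit.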

Here's a factoid.  An A/C system that returns all the heat energy back to the interior space it's in might seem ridiculous because it doesn't cool the room -- but it DOES reduce the relative humidity.  This type of system is called a dehumidifier.
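
A quick sketch of that idea (same helpers as above, with starting conditions of 25C/60% RH and a 10C cooling coil that I picked for illustration):  cool the air enough to condense water out, then put the heat back, and the air ends up at the same temperature but noticeably drier.

p_v_start = 0.60 * sat_vapor_pressure(25.0)  # ~1897 Pa of water vapor going in
# the 10C coil is below the dew point, so water condenses and the
# air leaves the coil saturated at 10C
p_v_coil = sat_vapor_pressure(10.0)          # ~1226 Pa leaving the coil
assert p_v_coil < p_v_start                  # confirms condensation occurred
print(f"RH after reheating to 25 C: {p_v_coil / sat_vapor_pressure(25.0):.0%}")  # ~39%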

So on the one hand we have evaporative cooling systems that work well in very dry climates but become less and less effective as humidity increases.  Unfortunately, in many parts of the world high temperatures are accompanied by high humidity, so evaporative coolers aren't nearly as prevalent as compressor-driven A/C systems.

In contrast, compressor-based A/C can dry the air too much if it's used in dry parts of the world; and in humid areas, a large percentage of the energy it consumes goes just to pulling water out of the air.  In hot, humid locations, A/C can account for a large share of a household's total energy consumption.
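
Here's a sketch of how big that share can get, using the same helpers.  I've picked illustrative conditions:  outdoor air at 32.2C and 80% RH, conditioned down to a typical indoor target of 24C and 50% RH (per kg of dry air).

w_out = humidity_ratio(32.2, 0.80)
w_in = humidity_ratio(24.0, 0.50)
total = enthalpy(32.2, w_out) - enthalpy(24.0, w_in)  # ~47 kJ/kg removed in all
latent = (w_out - w_in) * 2501.0                      # ~38 kJ/kg of that is water removal
print(f"latent share of the load: {latent / total:.0%}")  # roughly 80%

Under those (admittedly muggy) conditions, something like 80% of the cooling load on fresh outdoor air is just pulling water out of it.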

Both systems have their advantages and disadvantages, so it's no surprise that there is still considerable research and development going on to mitigate those disadvantages.  I'll go over some of those efforts in future blog posts on the subject.
