Thursday, September 15, 2022

Not your usual dehumidifier, an Addendum

I ended my previous post discussing two different approaches that combine an M-Cycle(-ish) style chiller with a liquid desiccant dehumidification system.  One thing I failed to mention regarding the "bootstrap" approach, where the input air to the M-Cycle-like chiller (hereinafter called the MCL?) is dehumidified using LD, is that, if it works, it should output water chilled below the ambient-air dewpoint -- simply because the water content of the input air is lower.  It remains to be seen whether the end result justifies the added complexity of such a system.

The extra-cold water coming out of such a chiller might extract water from interior air to help dehumidify it -- but only if the inside heat exchanger is allowed to cool below the dewpoint.  Since we're running warm interior air through the HX I wouldn't count on it.  But since, up to this posting, I haven't done anything other than make and characterize a plain-vanilla "swamp cooler" style chiller, who knows for sure?  I don't.

I sort of want it to get cold enough, but at the same time I don't, because if it DOES get cold enough to condense water I will need to add a way to deal with that water, rather than let it drip on our expensive wood floors!

Monday, September 12, 2022

Not Your Usual Dehumidifier

Early in my quest for a DIY A/C system that might actually work in our (often) humid summers I came across a couple of YouTube videos produced by Tech Ingredients that led me down an interesting path.

The first one, link here, introduced me to the idea of liquid desiccants.  It used liquid desiccant (LD for short) to pre-dry air that is then cooled by flowing through an evaporative cooler.  It was fairly complex, using a second evaporative cooler to cool down the hot, regenerated liquid desiccant (more on this later in this post).  The second one, link here, is a system they built that was (hopefully) sized for a real-world application but didn't work all that well, possibly due to poor efficiency of their chilling tower and desiccant-solution tower.  I think their spray-head scheme didn't work very well -- it's likely that most of the spray quickly wound up flowing down the inner walls of the tube.  The laminar flow of the counter-flowing air then formed a "dead layer" that prevented good contact between the bulk of the air and the water or desiccant.  There are devices called "turbulators" that break up laminar flow into more-turbulent flow and might improve the performance of those towers.

So, what is liquid desiccant (LD) and why is it particularly useful for drying air for A/C purposes?

Folks should be familiar with one-shot desiccants like the silica gel packets found in prepackaged food, vitamins and other food supplements, or products like "Dry-Z-Air", used to capture moisture in locations like RVs, closets etc.  The latter actually uses the same chemical that is often used in LD applications -- calcium chloride.  I should add that all these desiccants can be regenerated by getting them hot enough to release the water they have absorbed.  I have purchased silica gel beads that have an indicator in them to show when they are exhausted and need to be baked before they can be re-used.  And I've seen at least one blog post where someone did something similar with calcium chloride, but it was a pretty dangerous process -- it's necessary to get CaCl2 pretty hot, and at that temperature it is very corrosive.

There are other solid desiccants like zeolites, some types of clay, molecular sieves etc.  They HAVE been used to perform continuous dehumidification by putting them in a rotating wheel or drum configuration.  One side of the wheel is heated and air is passed through it; the high temperature plus the air flow pull the water out of the desiccant.  The wheel then rotates out of the hot zone into a cool zone, where the desiccant can again absorb moisture.  There, inside air is passed through the wheel and dried.

Systems like this have been used in industrial applications where other process machinery generates high temperatures, so the heat is re-used.  Since the desiccant wheel would need to be heated anyway, this saves money.  They aren't used for private houses because houses typically don't have that kind of high-quality waste heat available; and the wheels are pretty large, sized so there's enough capacity in the system to significantly dry the air.

In contrast, LD solutions -- typically lithium chloride, calcium chloride, potassium formate or potassium acetate -- don't require really high temperatures to be regenerated.  In fact, they can be regenerated with systems that are very similar to (good) solar hot water heaters.  This is very attractive because typical cooling demand coincides with periods of abundant sunlight.  Once your solar LD heater is built, the energy is "free".  Not quite, because the LD still has to be pumped through the rest of the apparatus, but that doesn't take much energy to accomplish.

Most research in the field has found that lithium chloride is the most efficient LD.  It also is the most expensive, so it's automatically eliminated from my consideration.  Among the rest, calcium chloride probably is the most efficient, but it has some problems.  The first is that the working solution, which is about 35-40% CaCl2, is very corrosive, so the pipes, pumps and heat exchangers used to heat and cool it have to be either plastic, stainless steel or ceramic.  This jacks up the price, at least for heat exchangers and pumps.  Its corrosive nature is worse at elevated temperatures, so a good design approach is to place our expensive pumps in the loop where the LD is at its lowest temperature.  That would be right in front of the regenerator, which heats the LD up in order to shed the water it absorbed.  Another problem is that concentrated CaCl2 solutions have a very high freezing point, 40F and higher, so it's necessary to keep the solution warm enough that it doesn't freeze and stop the system from working.

The last problem also is related to CaCl2's corrosive nature: "carryover".  Since the dehumidifier designs have to put interior air and CaCl2 solution in intimate contact, there is the possibility of solution droplets being carried into the interior space, where they can corrode metal and degrade fibers -- rugs, furniture, clothing -- so the design of the absorber portion of the system is very important.  This, by the way, is another problem with the Tech Ingredients approach, because they deliberately atomize their LD solution and are depending on some kind of post-absorber filtration setup to keep it out of the air stream.  Absorbers that move air at relatively high speeds are particularly susceptible to this problem.

Other LD solutions like potassium formate and potassium acetate are more benign in this regard, but they (1) aren't as efficient, (2) are more expensive; and (3) in the case of potassium acetate, its solution is reported to be very viscous, making it hard to pump through the dehumidifier system.

It appears that the best way to prevent carryover is to use either packed-bed absorbers or so-called falling-film absorbers.  Unfortunately, the best media for packed beds is pretty expensive -- I calculated that a 1 cubic-meter absorber would require over $2,000 worth of media (basically specially-designed plastic whiffle balls).  So some kind of falling-film scheme looks best.

For developing different types of absorbers I'm planning on sampling the exit air with a high-voltage arc to excite any calcium that is present, to be analyzed with a (naturally, home-made) visible-light spectrometer.  Calcium's emission lines will quickly reveal whether the design has any carryover.

Tech Ingredients' second design is meant to use the same LD solution to simultaneously cool and dehumidify the air, in contrast to their first design, which just dehumidifies the air entering an evaporative chiller.  However, their second design depends on an unassisted evaporative chiller to cool the LD solution -- not viable for a region with high humidity, since the ability to cool the LD solution is limited.  The problem with their first design is related, because it too uses an unassisted evaporative cooler to chill the LD solution.  There are two alternatives that could improve the situation.  First, build an oversized chiller using an air pre-cooler to sorta-kinda replicate a Maisotsenko-cycle system, and use the chilled water both to cool the house and to operate an LD dehumidifier's absorber in a separate system that controls the house's interior humidity level.  The second is a kind of bootstrap system where the chiller is fed by an outside "feed" air flow that has been dehumidified by an LD system -- which in turn uses the same chiller water.  It's bootstrapped because, as the chiller operates the dehumidifier front end, the dehumidifier becomes more and more effective: reducing the feed air's RH lowers its wet-bulb temperature, so the chiller water temperature goes down and further reduces the RH of the input air.  And so on.  I haven't found any papers that describe a system like this, so at this point it is a wild guess whether it's a real improvement.

Friday, September 9, 2022

Cycles: The Mysterious Maisotsenko Cycle

The M-cycle is touted as a new thermodynamic cycle that will solve the world's air-conditioning problems (natch, according to the companies selling them).  But is it really that good, and just how does it work?  When I look at drawings of air conditioners that use the M-cycle it seems pretty confusing, with all the different pieces and "wet channel" and "dry channel" stuff.  Not too easy to figure out, perhaps deliberately so.  But by looking more closely at our trusty psychrometric chart, things start to become much clearer.

If you have looked at my previous blog posts on DIY A/C you have already seen this:


It shows the different "paths" taken by evaporative coolers (solid line) and the more common compressor-based A/C systems (dotted line).

Suppose we sort of combine them.  Let's add a special type of heat exchanger, very similar to what's called an HRV, a Heat Recovery Ventilator.  It is an air-to-air heat exchanger used to replace stale air inside a house with fresh exterior air, while recovering the heat contained in the exhaust air.  They typically are cross-flow devices that use stacked corrugated plastic sheets -- the interior air flows across the outside surfaces of the sheets and the exterior air flows at right angles through the channels formed by the corrugations.  Or vice versa, makes no difference.  This is a simplification because the air paths have to be kept separated so they only exchange heat -- they can't mix.  I have seen a number of DIY versions, so making your own HRV is definitely feasible.  The biggest problem is that the corrugated plastic sheets are somewhat expensive, but I think I can make a similar kind of device using corrugated metal roofing with insulated panels on each side to force the air to flow down the corrugations.  It's much less expensive but (probably) more bulky.  Since it would be a counterflow system rather than the conventional cross-flow of other HRVs, it might be pretty efficient.  The corrugated-roofing approach will likely be the subject of another blog post.  For now, I just need to point out that making your own air-to-air HRV is not much of a stretch for an intrepid DIYer.

So, let's place our home-made HRV inline with our home-made evaporative chiller.  The chiller's input air comes from the output of one of the HRV channels, and the chiller's output air is routed back through the HRV's other channel.  In this way the input air to the chiller is cooled before it enters it.
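As a sanity check on that arrangement, here's a tiny Python sketch of how an HRV's sensible effectiveness relates its inlet and outlet temperatures.  The 90% effectiveness figure and the example temperatures are just illustrative assumptions:

```python
def hrv_supply_out(t_fresh, t_exhaust, effectiveness=0.9):
    """Fresh-air temperature leaving the HRV core: the fresh stream is
    pulled `effectiveness` of the way toward the exhaust temperature."""
    return t_fresh - effectiveness * (t_fresh - t_exhaust)

# Example: 32.2C outside air pre-cooled against 22.3C chiller exhaust
print(round(hrv_supply_out(32.2, 22.3), 1))  # 23.3
```

Note that with 0% effectiveness the fresh air passes through unchanged, and with 100% it would leave at the exhaust temperature -- the best any heat exchanger could do.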

This might seem like a waste of a perfectly good HRV because we know that the RH of the cooled air increases, which decreases the effectiveness of our chiller.  And so it does, but that is more than offset by the attendant decrease in the resultant wet-bulb temperature.  I can show this by modelling our new system in a stepwise manner, like this:

Step 1:  We turn our chiller on.  The air entering it is at ambient temperature.  The air passing through the chiller follows the solid-line path on the psychrometric chart, and exits at a temperature close to the wet-bulb temperature (about 21.2C here).  It won't quite reach the wet-bulb temperature because chillers aren't 100% efficient at transferring the full temperature drop of the water to the air.  Let's say that the chiller is 90% effective at that, so the air exits at 22.3C.  From there, it passes through the HRV, cooling the air entering the chiller.  Let's say that the HRV also is 90% efficient.  That translates to the chiller getting air that's been pre-cooled to 23.3C.

Step 2:  The chiller further cools the 23.3C air.  Looking at our psychrometric chart, we follow the dotted line over to where it intersects 23.3C on the temperature axis and see that the wet-bulb temperature now is 18.5C.  This is almost 5 degrees Fahrenheit lower than the wet-bulb temperature we started with in step 1.

Let's do one more step, just to see what happens.

Step 3:  Given the same efficiencies of our chiller and HRV, the air entering the chiller now is at 20.3C, giving us a wet-bulb temperature of 17.5C.  This is a further reduction of 1 degree Celsius, for an overall wet-bulb improvement of 6.7F.  Assuming the same efficiencies as before, the ambient air at 90F has been cooled to about 64F.  For comparison, a single-pass chiller would output air at about 72F.
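The stepwise bookkeeping above can be sketched in Python.  This is a rough model, not a chart: it assumes sea-level pressure, the Magnus formula for saturation vapor pressure, a constant-enthalpy evaporative process, and the 90% effectiveness figures from the steps above, so its numbers will land within a few tenths of a degree of my chart readings rather than matching them exactly:

```python
import math

P = 101325.0  # sea-level air pressure, Pa

def p_sat(t_c):
    """Saturation vapor pressure (Pa), Magnus formula over water."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def hum_ratio(t_c, rh):
    """Humidity ratio (kg water per kg dry air) at temperature t_c, RH %."""
    e = rh / 100.0 * p_sat(t_c)
    return 0.622 * e / (P - e)

def enthalpy(t_c, w):
    """Moist-air specific enthalpy, kJ per kg of dry air."""
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

def rel_hum(t_c, w):
    """RH (%) of air with humidity ratio w at temperature t_c."""
    e = w * P / (0.622 + w)
    return 100.0 * e / p_sat(t_c)

def wet_bulb(t_c, rh):
    """Thermodynamic wet bulb by bisection: the saturated state
    with the same enthalpy as the given air."""
    h = enthalpy(t_c, hum_ratio(t_c, rh))
    lo, hi = -20.0, t_c
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if enthalpy(mid, hum_ratio(mid, 100.0)) < h:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def chiller_pass(t_in, rh_in, eff=0.9):
    """One pass through the evaporative chiller: cool 90% of the way
    to the wet bulb along a constant-enthalpy line."""
    t_out = t_in - eff * (t_in - wet_bulb(t_in, rh_in))
    h = enthalpy(t_in, hum_ratio(t_in, rh_in))
    w_out = (h - 1.006 * t_out) / (2501.0 + 1.86 * t_out)  # same enthalpy
    return t_out, rel_hum(t_out, w_out)

t_amb, rh_amb = 32.2, 36.0
w_amb = hum_ratio(t_amb, rh_amb)

t_exit, _ = chiller_pass(t_amb, rh_amb)  # step 1: plain swamp cooler
print(f"step 1: exit {t_exit:.1f}C")
for step in range(2, 7):
    # The HRV pre-cools fresh ambient air against the chiller's exhaust
    # (sensible only -- the humidity ratio stays at the ambient value)
    t_in = t_amb - 0.9 * (t_amb - t_exit)
    t_exit, _ = chiller_pass(t_in, rel_hum(t_in, w_amb))
    print(f"step {step}: chiller inlet {t_in:.1f}C, exit {t_exit:.1f}C")
```

Run it and the exit temperature ratchets down each step, leveling off a degree or two above the ambient dew point rather than at the single-pass wet bulb.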

If we model our system in a continuous rather than stepwise manner we will find that the chiller's exit air asymptotically approaches the dew point, which is about 15C.  It will never get there because we have to evaporate SOME water to get any kind of cooling at all.  And in a real-world A/C system using this approach there will be significant heat input from the house we are trying to keep cool.
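For reference, the dew point the exit air is approaching can be computed directly from the ambient conditions.  A quick check in Python, using the Magnus approximation (my choice of formula, not anything from the chart):

```python
import math

def dew_point(t_c, rh):
    """Dew point (C) via the Magnus approximation; good to roughly
    0.1C at ordinary indoor/outdoor conditions."""
    gamma = math.log(rh / 100.0) + 17.625 * t_c / (t_c + 243.04)
    return 243.04 * gamma / (17.625 - gamma)

print(round(dew_point(32.2, 36.0), 1))  # about 15.2 -- the ~15C read off the chart
```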

I think this is the basis of M-cycle air conditioning.  One additional wrinkle is that the M-cycle plays with the relative volumes of air in the wet and dry channels, so the cooled air delivered to the living space isn't as humid as it would be in my example above.  However, since I'm going to run the chilled water through a water-to-air heat exchanger (HX for short) placed inside the house, I don't need to worry about the RH of the air exiting my DIY M-like A/C system -- just about water drips, perhaps from condensation on the HX.

A system like this, unlike a compressor-based system, does little to nothing to address the increased RH that comes with the temperature drop.  However, there are ways to address this, also in a DIY manner, that I will describe in yet another blog post.  It uses calcium chloride, but not in a one-shot "Dry-Z-Air" fashion.  That's all I will say for now on that subject; it gets complicated when we throw in dehumidification.

To summarize, we can noticeably improve the effectiveness of an evaporative cooler by adding a relatively simple air-to-air heat exchanger to the air flows entering and exiting the chiller.  

A do-able DIY system would likely be an indirect-cooled one, where the cold water in the chiller would be pumped through a water-air HX inside the house.  The HRV could be made from either a stack of metal sheets ($$$), corrugated plastic sheets ($$) or -- perhaps -- corrugated metal roofing panels ($).  In addition to cost, those options are approximately ranked in order of their physical size.  I'm guessing about the use of the corrugated metal but I think it's likely to take the most room.  However, it will be outside the house so that will be less of an issue.  If need be I think it's possible to stack the metal panels so we still get decent HRV efficiency in a smaller space.  The HRV design will be more complicated but, again, feasible for a good DIYer to make.

The chiller design also will be more complicated because the supply air has to come from our HRV and its exit air has to be routed back into the HRV.  My original open-sided design would have to be put in a sealed box that (1) provides for relatively unrestricted air flow and (2) keeps the input and output air flows well separated.  The four-sided tower scheme might have to go away.  A chiller using a single evaporation pad would be very easy to make (just a box with the chiller in the center), but it would have to be pretty big to have the same surface area as the tower.  Maybe a set of pads placed in a "W" pattern?  How do I get water to them without introducing air leaks?  And just how much surface area do I need for the pad(s), anyway?  Does the enclosure need to be insulated?  Time to do some thinking and sketching...


Wednesday, September 7, 2022

Evaporative Cooling Vs. Compressor-Driven A/C

In this post I'm going to explain more about what I've learned concerning these two types of air conditioning.  In case new readers are wondering why I'm interested in evaporative cooling, it's because the technology is pretty easy to build yourself -- but there are definite limitations that come along with it.

The psychrometric chart shown below has been marked to illustrate the two different kinds of cooling.  I'll then discuss some interesting differences between them.



I've drawn a solid line and a dotted line.  They both start at the same point: 32.2C (approximately 90F) and 36% relative humidity (RH).  That was the outside afternoon temperature at our house a week or two ago.  The solid line is drawn along a constant-enthalpy line, which just means that the total energy of the system remains constant.  Note that the relative humidity increases and so does the humidity ratio (basically, the amount of water in the air, shown on the right side of the chart).  This shows what's going on when evaporative cooling is taking place.  You might think that this mechanism can't occur without a change in energy because the air is being cooled, but that is balanced by the energy carried away by the water as it changes from a liquid to a gas.  To distinguish the two "forms" of heat, the energy contained by the air itself (oxygen, nitrogen and a small amount of carbon dioxide) is called "sensible heat", presumably because it can be "sensed" by a thermometer.  And the energy contained by the water vapor is called "latent heat", because it only plays a role when the water either evaporates or condenses.  Latent heat is a big deal in the A/C world because in humid climates it can be a substantial contributor to the energy (as in, coming out of a wall socket) needed to cool and condition air.

The humidity ratio for evaporative cooling increases because evaporating water is being used to cool the air, so the amount of water in the air increases.  The latent heat increases, balancing the sensible heat drawn out of the air, so the overall energy (enthalpy) remains constant.

The problem with so-called "swamp coolers" is that they are not very effective in humid climates, for two reasons.  As the humidity increases, the wet-bulb temperature increases, so the chiller can't deliver air that's much colder than what entered it.  The second problem is that the chiller increases the relative humidity of the air that exits it.  This reduces our body's ability to cool itself via evaporative cooling, so our perception of comfort is reduced.

Now let's move on to the dotted horizontal line.  That is what is going on when conventional compressor-driven cooling occurs.  The line follows a constant-humidity-ratio line because the amount of water in the air doesn't change.  Since there is no phase change, at least down to the dew point, the total energy in the air decreases:  the enthalpy decreases.  However, closer examination of the line shows that the relative humidity increases.  This is because cooler air has a reduced capacity to hold water vapor.  When the temperature reaches the dew point (at 15C/59F), the relative humidity reaches 100% and water starts to condense.  It takes a LOT of energy to condense water, so once that happens it suddenly takes a bigger A/C unit to get the temperature to decrease further.  The other factor that comes into play is our perception of comfort when humid air is cooled.  59F is pretty chilly, so let's say we just cool the air down to 68F (20C).  Our chart indicates that the air's relative humidity now is about 75%.  This is pretty humid, so we don't feel all that comfortable -- our own evaporative cooling (sweating) is less effective.  At 100% RH our body can't cool itself at all via sweating, so a damp 59F would actually feel very uncomfortable.  The other downside to high humidity is that it promotes the growth of mold and mildew, rusts steel parts, and so on.
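Those chart readings are easy to sanity-check numerically.  Here's a short Python sketch that follows the constant-humidity cooling line, assuming sea-level pressure and the Magnus formula for saturation vapor pressure (both my assumptions, not anything taken from the chart):

```python
import math

def p_sat(t_c):
    """Saturation vapor pressure (Pa), Magnus formula over water."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

# Ambient air: 32.2C at 36% RH.  Its actual vapor pressure stays fixed
# while the air is cooled sensibly (no water is added or removed).
e = 0.36 * p_sat(32.2)

for t in (32.2, 25.0, 20.0, 15.3):
    print(f"{t:5.1f}C -> RH {100.0 * e / p_sat(t):5.1f}%")
```

At 20C the RH comes out near 74%, matching the chart's roughly 75%, and by about 15.3C it is essentially 100%: the dew point.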

To improve the comfort level, most A/C systems deliberately cool the air below the dew point in order to force water to condense.  The cool air exiting the A/C unit has a lower relative humidity thanks to the condensation.  But now we have the reverse problem -- the air feels TOO cold for comfort.  How many of us have had the misfortune to be seated at a restaurant directly below an air-conditioner vent?  Feels pretty cold, huh.  Well, it actually could be worse, because commercial systems use extra energy to deliberately WARM that cold air back up some.  More sophisticated A/C systems can recycle the heat they extracted from the incoming air via a heat exchanger, so the energy cost is lower -- but the cost of such an A/C system is higher.

Here's a factoid.  An A/C system that returns all the heat energy back to the interior space it's in might seem ridiculous because it doesn't cool the room -- but it DOES reduce the relative humidity.  This type of system is called a dehumidifier.

So on the one hand we have evaporative cooling systems that work well in very dry climates but become less and less effective as humidity increases.  Unfortunately, in many parts of the world high temperatures are accompanied by high humidity so they aren't nearly as prevalent as compressor-driven A/C systems.  

In contrast, compressor-based A/C can dry the air too much if it's used in dry areas of the world; and in humid areas a large percentage of the energy it consumes goes just to pulling water out of the air.  In hot, humid locations, the energy consumed by A/C can be a large percentage of the total energy consumption of a household.

Both systems have their advantages and disadvantages, so it's no surprise that there still is considerable research and development going on to mitigate the disadvantages.  I'll go over some of those efforts in future blog posts on the subject.