When comparing heat pumps and looking at the manufacturers' performance charts, there is one chart for heating and one for cooling.
For the "cooling" chart, what are the standard temperatures that units are compared at?
The choices are:
1. Down the left side (outdoor dry-bulb temps): 75, 80, 85, 90, 95, 100, 105, 110, 115
2. Across the top (indoor wet-bulb temps): 71, 67, 63
What is the standard? 95 and 67?
Replies
In most cases, 95 deg F (DB) is what you want to use to compare.
This temperature relates to the outdoor design temperature that your condensing unit (compressor and condenser coil) sees during the summertime peak. If you live in a climate where it is considerably hotter than 95 degF for most of the season (like the middle of the desert), then you may want to use the rating for 105 degF. You will see a substantial reduction in capacity at the higher ambient temperatures.
The other temps you asked about are, I believe, related to the indoor design conditions, in this case most likely the wet-bulb (WB) temperature, which reflects the indoor humidity level.
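Just to put rough numbers on that derating, here is a quick sketch; the BTU/h figures are made up for illustration, not taken from any particular chart:

```python
# Rough illustration of how capacity falls off at higher outdoor temps.
# The BTU/h figures below are hypothetical, not from any real chart.
cap_at_95 = 36000   # total cooling capacity at 95 degF outdoor DB (BTU/h)
cap_at_105 = 32500  # same unit's capacity at 105 degF outdoor DB (BTU/h)

loss_pct = (cap_at_95 - cap_at_105) / cap_at_95 * 100
print(f"Capacity loss going from 95 to 105 degF outdoors: {loss_pct:.1f}%")
```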
Hope this helps.
Thanks
And which indoor WB is the standard?
63, 67, or 71?
The standard conditions under which most mechanical cooling equipment is rated are the ARI standard conditions: 95 degF outdoor dry bulb, 80 degF indoor dry bulb, and 67 degF indoor wet bulb.
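For what it's worth, here's a tiny sketch of reading that rating point off a chart like the one you described. The chart values are hypothetical, just to show which cell the ARI conditions point at:

```python
# Hypothetical cooling performance chart, keyed by
# (outdoor dry-bulb degF, indoor wet-bulb degF) -> total capacity in BTU/h.
# All capacity numbers are invented for illustration only.
cooling_chart = {
    (85, 67): 38500,
    (95, 63): 34800,
    (95, 67): 36000,   # ARI rating point: 95 outdoor DB / 80 indoor DB / 67 indoor WB
    (95, 71): 37400,
    (105, 67): 32500,
}

ari_point = (95, 67)
print(f"Rated capacity at ARI conditions {ari_point}: {cooling_chart[ari_point]} BTU/h")
```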
Thanks.
My state's heat pump council says:
In SC's climate, it is recommended that at least 25% of the total cooling load be dedicated to moisture removal. This means the Sensible Heat Ratio should be 75% or less. The lower the number in the Sensible Heat Ratio, the more of your load is used to remove moisture. There are high SEER units with high Sensible Heat Ratios, meaning they are more efficient but little of the load is dedicated to removing moisture. And there are lower SEER units with lower Sensible Heat Ratios, meaning they are less efficient but more of the load is used for moisture removal.
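In other words, the Sensible Heat Ratio is just sensible capacity divided by total capacity, and whatever is left over is the latent (moisture-removal) portion. A quick sketch with made-up capacity numbers:

```python
# Sensible Heat Ratio (SHR) sketch with hypothetical capacity numbers.
# SHR = sensible capacity / total capacity; the remainder is latent
# (moisture-removal) capacity.
total_capacity = 36000     # BTU/h, hypothetical total cooling at rating conditions
sensible_capacity = 26000  # BTU/h, hypothetical sensible portion

shr = sensible_capacity / total_capacity
latent_capacity = total_capacity - sensible_capacity

print(f"SHR = {shr:.2f}")                                   # about 0.72
print(f"Latent (moisture) capacity = {latent_capacity} BTU/h")
print("Meets the <= 0.75 guideline" if shr <= 0.75 else "Too little moisture removal")
```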
That's why I asked.
thanks
Wain,
SEER vs. S/T (sensible-to-total ratio) is only an apparent correlation. It is possible to have a unit with a high SEER and a decent S/T, like 0.75. The reasons are not simple or quick to explain, but, IMHO, SEER should be secondary in determining which unit or component to use, especially if humidity control is important.
Manufacturers learned that they could manipulate the way systems operate to get a better SEER (seasonal energy efficiency ratio) without actually producing a more efficient unit. ARI, the Air-Conditioning and Refrigeration Institute, has established a base set of conditions, but those conditions for testing and rating HVAC equipment are very simplistic, for practical reasons. An SHR of 0.75 in Las Vegas vs. Columbia would make no sense.

What the manufacturers do is raise the temperature of the refrigerant in the cycle. This raises the temperature of the evaporator coil (less moisture condenses, i.e., less latent cooling) and makes the compressor work less for the same total cooling. It is a game at which engineers are skilled. A consumer who is not very familiar with refrigeration sees a higher SEER (and is willing to pay more for it), thinks more efficient is better (which is the intent of the SEER), but really doesn't get what they pay for.
Thanks. I am in Charleston, SC, and I see the need for humidity removal.
Some nights you go outside and it's 80 degrees and 90% RH or whatever, and it feels terrible;
then you go inside, the Radio Shack meter says it's 78 degrees and 50% RH, and it feels great.
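To put rough numbers on that comfort difference, here's a dew-point sketch using the Magnus approximation (with the usual constants a = 17.27, b = 237.7 degC); the higher the dew point, the muggier it feels:

```python
import math

def dew_point_f(temp_f: float, rh_pct: float) -> float:
    """Approximate dew point (degF) from dry-bulb temperature (degF) and
    relative humidity (%) using the Magnus formula."""
    a, b = 17.27, 237.7
    t_c = (temp_f - 32) * 5 / 9
    gamma = (a * t_c) / (b + t_c) + math.log(rh_pct / 100.0)
    dp_c = (b * gamma) / (a - gamma)
    return dp_c * 9 / 5 + 32

print(f"Outside 80 F / 90% RH -> dew point ~{dew_point_f(80, 90):.0f} F")  # roughly mid-70s, oppressive
print(f"Inside  78 F / 50% RH -> dew point ~{dew_point_f(78, 50):.0f} F")  # upper 50s, comfortable
```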