There are two adiabatic lapse rates to consider for aviation.
The first and most important to pilots is the DRY adiabatic lapse rate, about 3 degrees Celsius per 1,000 feet of altitude (roughly 10 degrees Celsius per 1,000 meters, or about 5.4 degrees Fahrenheit per 1,000 feet). That is the rate you need to be aware of for most of your flying. It is the rate at which an unsaturated parcel of air cools as it rises; strictly speaking it applies to air that is not saturated (below 100% relative humidity), and meteorologists use it to express the standard temperature drop with altitude for cloud-free air.
The MOIST adiabatic lapse rate describes the temperature change with altitude for saturated air, i.e. inside clouds, where the relative humidity is 100%. Because condensing water vapor releases latent heat, the moist rate is always significantly less than the dry adiabatic lapse rate, roughly 1.5 degrees Celsius per 1,000 feet in warm air, approaching the dry rate in very cold air.
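The difference between the two rates can be made concrete with a short sketch. This is an illustrative calculation using the typical textbook values above, not a meteorological model; the moist rate in particular varies with temperature, so the constant here is only representative.

```python
# Typical textbook values for the two adiabatic lapse rates.
DRY_LAPSE_C_PER_1000FT = 3.0    # dry adiabatic: ~3.0 degrees C per 1,000 ft
MOIST_LAPSE_C_PER_1000FT = 1.5  # moist adiabatic: roughly 1.5 degrees C per
                                # 1,000 ft in warm air (varies with temperature)

def parcel_temp_c(surface_temp_c, altitude_ft, saturated=False):
    """Temperature of a parcel lifted adiabatically to altitude_ft AGL."""
    rate = MOIST_LAPSE_C_PER_1000FT if saturated else DRY_LAPSE_C_PER_1000FT
    return surface_temp_c - rate * altitude_ft / 1000.0
```

For example, a dry parcel starting at 15 degrees C cools to 0 degrees C by 5,000 ft, while a saturated parcel lifted the same distance only cools to 7.5 degrees C.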
Happy landings, Dave!
I preface my comments with the statement that I am not an expert on aviation weather by any stretch of the imagination. However, I do recall some of these concepts from my PPL training days ... The phenomenon you are describing (temperature decrease with an increase in altitude) is known as the lapse rate. There are several different kinds of lapse rates to consider, which is where some of your differing numbers may be coming from ...

The [B]environmental lapse rate[/B] is the change in temperature with altitude for the stationary atmosphere (i.e. - the temperature gradient). The average value for the environmental lapse rate is 2 degrees C (3.5 degrees F) per 1,000 ft. However, this assumes a standard atmosphere with stable air where the temperature falls steadily with altitude. It is important to understand that this steadiness is often absent from the real atmosphere. The level of moisture in the atmosphere, as well as other phenomena, can create temperature inversion layers where the temperature of the air actually increases with altitude at times. The important point to remember is that the standard value is always considered a rough order estimate.

A cool experiment I used in analyzing my weather data before a flight was to record the temperature, dew point, and lowest cloud layer level. The point where clouds begin to form is the point where the air is completely saturated (temperature equals dew point). A rising parcel cools at the dry adiabatic rate (about 3 degrees C per 1,000 ft) while its dew point falls only about 0.5 degrees C per 1,000 ft, so the temperature/dew point spread closes at roughly 2.5 degrees C per 1,000 ft. If you take the difference between the temperature and the dew point (in degrees C), divide it by 2.5 degrees C, and then multiply by 1,000 ft, this will give you the estimated level of the clouds AGL. Comparing this calculated value to the information in your METAR report will give you a good idea about the stability of the atmosphere (whether the standard lapse rates are in effect). More often than not, the calculated cloud layer is pretty close ...
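The cloud-base estimate above can be sketched in a few lines. The commonly taught rule of thumb closes the temperature/dew-point spread at about 2.5 degrees C per 1,000 ft; some presentations round this to 2, so the closure rate is left as a parameter here. This is a rough pre-flight estimate, not a substitute for the METAR.

```python
def estimated_cloud_base_agl_ft(temp_c, dew_point_c, spread_closure_c=2.5):
    """Estimate cloud base AGL in feet from the temperature/dew-point spread.

    A lifted parcel cools at ~3 degrees C per 1,000 ft while its dew point
    falls ~0.5 degrees C per 1,000 ft, so the spread closes at about
    2.5 degrees C per 1,000 ft (some rules of thumb round this to 2).
    """
    spread_c = temp_c - dew_point_c
    return spread_c / spread_closure_c * 1000.0
```

For example, with a surface temperature of 20 degrees C and a dew point of 10 degrees C, the spread is 10 degrees, giving an estimated cloud base of 4,000 ft AGL.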
it's cool how science and math work sometimes :-) By the way, as an aside, the other two defined lapse rates are the dry adiabatic lapse rate and the moist adiabatic lapse rate, which refer to the temperature of an air mass as it moves upward. As you might expect, the dry rate refers to unsaturated air and the moist rate refers to saturated air (the moist rate can vary drastically depending upon the moisture content and temperature of the air). Hope this helps answer some of your questions and not confuse you even more :-) Feel free to ask for clarification if anything is unclear. Clear Skies, Dave
The air temperature decreases by an average of 6.5°C per km (3.6°F per 1,000 ft) above the Earth's surface, and temperatures as low as -60°C (-76°F) occur at the top of the troposphere.
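The 6.5°C-per-km average quoted above is the tropospheric lapse rate of the International Standard Atmosphere, which starts from a standard sea-level temperature of 15°C. A minimal sketch under those standard assumptions:

```python
def isa_temp_c(altitude_km):
    """Approximate ISA temperature in the troposphere (0 to 11 km).

    Standard sea-level temperature 15 C, average lapse rate 6.5 C per km;
    above ~11 km the ISA temperature is held constant, so we clamp there.
    """
    return 15.0 - 6.5 * min(altitude_km, 11.0)
```

At the top of the troposphere (about 11 km) this gives 15 - 6.5 * 11 = -56.5°C, consistent with the "as low as -60°C" figure above.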