Physiologists have found that the temperature and humidity of the environment directly affect the body's thermoregulation and heat conduction. This influences physical comfort and adaptation, which in turn affects mental agility and state of mind, ultimately impacting learning and work efficiency. Experimental analysis has shown that the most suitable room temperature for the human body is 18°C, with humidity between 40% and 60%. In daily life and work, many different places and environments have specific requirements for temperature and humidity, so proper temperature and humidity control is essential.
Temperature and humidity sensor
Since temperature and humidity are closely related both as physical quantities and in people's daily lives, integrated temperature and humidity sensors have been developed.
A temperature and humidity sensor is a device that converts temperature and humidity values into electrical signals that are easily measured and processed. Most temperature and humidity sensors on the market measure temperature and relative humidity.
Temperature and humidity
Let's first review the physical quantities related to temperature and humidity:
Temperature
The temperature of an object is a physical quantity used to measure its hotness or coldness, and it is one of the seven base quantities in the International System of Units (SI). Many physical phenomena and chemical processes occur at specific temperatures in production and scientific research, and they are also closely related to people's lives.
Humidity
Humidity has long been closely related to daily life, but it is difficult to express quantitatively.
The physical quantity most commonly used to express humidity in daily life is relative humidity, written as %RH. Relative humidity is closely tied to temperature: for a sealed volume of gas with a fixed amount of water vapor, the higher the temperature, the lower the relative humidity; conversely, the lower the temperature, the higher the relative humidity. The underlying relationship involves some nontrivial thermodynamics.
Some definitions related to humidity:
Relative humidity
In metrology, humidity is defined as "a quantity of the state of matter." In everyday life, humidity refers to relative humidity, expressed as %RH. In short, it is the ratio, as a percentage, of the actual water vapor content (water vapor pressure) of a gas (usually air) to the saturated water vapor content (saturation water vapor pressure) of that gas under the same conditions.
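As a minimal sketch of this definition, the conversion from vapor pressures to relative humidity is a single ratio (the pressure values below are illustrative):

```python
# Relative humidity: actual water vapor pressure divided by the saturation
# vapor pressure at the same temperature, expressed as a percentage.
def relative_humidity(vapor_pressure_hpa: float, saturation_pressure_hpa: float) -> float:
    return 100.0 * vapor_pressure_hpa / saturation_pressure_hpa

# Example: 14 hPa of water vapor in air whose saturation pressure is about
# 23.3 hPa (air at roughly 20 °C) gives about 60 %RH.
print(round(relative_humidity(14.0, 23.3), 1))  # ~60.1
```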
Absolute humidity
Absolute humidity is the actual amount of water vapor contained in a unit volume of air, usually expressed in grams per cubic meter. Temperature directly affects absolute humidity: generally, the higher the temperature, the more water evaporates into the air and the higher the absolute humidity; conversely, the lower the temperature, the lower the absolute humidity.
Saturation humidity
Saturation humidity is the maximum amount of water vapor that a unit volume of air can hold at a given temperature; if this limit is exceeded, the excess water vapor condenses into water droplets. Saturation humidity is not constant but varies with temperature: the higher the temperature, the more water vapor a unit volume of air can hold, and the higher the saturation humidity.
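The temperature dependence can be made concrete with the Magnus approximation for saturation vapor pressure, a common empirical formula (the coefficients below are one standard parameterization, not taken from this article):

```python
import math

def saturation_vapor_pressure_hpa(t_celsius: float) -> float:
    """Magnus approximation, roughly valid from -45 to 60 °C."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# Saturation pressure rises steeply with temperature, which is why warm air
# can hold much more water vapor than cold air.
for t in (0, 10, 20, 30):
    print(t, round(saturation_vapor_pressure_hpa(t), 1))
# -> 0: 6.1, 10: 12.3, 20: 23.3, 30: 42.3 (hPa)
```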
Dew point
Dew forms when air containing a certain amount of water vapor (its absolute humidity) cools to the point where that vapor saturates the air (reaches saturation humidity) and begins to condense into water. The temperature at which this condensation begins is called the "dew-point temperature," or simply the "dew point." If the temperature continues to drop below the dew point, the supersaturated water vapor condenses into droplets on the surfaces of objects. Wind is also closely related to air temperature and humidity and is one of the important factors driving their changes.
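Continuing with the same Magnus approximation, the dew point can be estimated from air temperature and relative humidity by inverting the formula; this is a sketch under the same assumed coefficients:

```python
import math

A, B = 17.62, 243.12  # Magnus coefficients (one common parameterization)

def dew_point_celsius(t_celsius: float, rh_percent: float) -> float:
    """Temperature to which the air must cool for its vapor to saturate."""
    gamma = math.log(rh_percent / 100.0) + A * t_celsius / (B + t_celsius)
    return B * gamma / (A - gamma)

# Air at 20 °C and 60 %RH starts to condense at about 12 °C.
print(round(dew_point_celsius(20.0, 60.0), 1))  # ~12.0
```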
Types of temperature sensors
Temperature sensors can be classified into contact and non-contact types based on their detection methods.
Contact temperature sensor
A contact temperature sensor, also known as a thermometer, has its sensing part in good contact with the object being measured.
Thermometers achieve thermal equilibrium through conduction or convection, allowing the thermometer reading to directly represent the temperature of the object being measured.
Generally, thermometers offer high measurement accuracy. Within a certain temperature range, they can also measure the internal temperature distribution of an object. However, they can produce significant measurement errors for moving objects, small targets, or objects with very small heat capacity. Commonly used thermometers include bimetallic thermometers, liquid glass thermometers, pressure thermometers, resistance thermometers, thermistors, and thermocouples.
They are widely used in industry, agriculture, commerce, and other sectors. People also frequently use these thermometers in their daily lives.
Non-contact temperature sensor
A non-contact temperature sensor's sensing element does not touch the object being measured, so it is also known as a non-contact temperature measuring instrument. Such instruments can measure the surface temperature of moving objects, small targets, and objects with small heat capacity or rapid (transient) temperature changes, as well as the temperature distribution of a temperature field.
The most commonly used non-contact temperature measuring instruments are based on the fundamental law of blackbody radiation and are called radiation temperature measuring instruments.
Radiation thermometry includes the luminance method (see optical pyrometer), the radiation method (see radiation pyrometer), and the colorimetric method (see colorimetric thermometer).
Each radiation thermometry method measures only the corresponding luminance temperature, radiation temperature, or colorimetric temperature; only for a blackbody (an object that absorbs all incident radiation and reflects none) does the measured value equal the true temperature. To determine the true temperature of a real object, a correction must be made for the surface emissivity of its material. However, surface emissivity depends not only on temperature and wavelength but also on surface condition, coatings, and microstructure, making it difficult to measure accurately.
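A simple illustration for the radiation method: if a total-radiation pyrometer is modeled with the Stefan-Boltzmann law, a gray body of emissivity ε emits ε·σ·T⁴, so the indicated radiation temperature understates the true temperature. The emissivity value below is assumed for illustration:

```python
# sigma * T_r**4 = eps * sigma * T**4  =>  T = T_r * eps**(-1/4),
# where T_r is the radiation temperature the pyrometer indicates.
def true_temperature_k(radiation_temp_k: float, emissivity: float) -> float:
    return radiation_temp_k * emissivity ** -0.25

# A surface with an assumed emissivity of 0.8 that reads 900 K on the
# pyrometer is actually around 952 K.
print(round(true_temperature_k(900.0, 0.8)))  # ~952
```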
In automated production, radiation thermometry is often used to measure or control the surface temperature of certain objects, such as the rolling temperature of steel strips, the temperature of rolls, the temperature of forgings, and the temperature of various molten metals in smelting furnaces or crucibles in metallurgy.
Advantages of non-contact temperature measurement: the upper measurement limit is not constrained by the temperature tolerance of the sensing element, so in principle there is no upper bound on the measurable temperature. For temperatures above 1800°C, non-contact methods are the main option. With the development of infrared technology, radiation thermometry has gradually expanded from visible light into the infrared, and its range has extended downward from 700°C to room temperature, with very high resolution.
Temperature sensors can be categorized into the following types based on their working principle:
Sensors designed based on the principle of metal expansion
Metals expand or contract as the ambient temperature changes, and sensors can convert this response into signals in different ways.
Bimetallic strip sensor
A bimetallic strip consists of two metals with different coefficients of thermal expansion bonded together. As the temperature changes, one metal (material A) expands more than the other (material B), causing the strip to bend. The curvature of the bend can be converted into an output signal.
Bimetallic rod and metal tube sensor
As the temperature rises, the metal tube (material A) lengthens while the low-expansion steel rod (material B) inside it does not. The relative displacement between the two therefore transmits the tube's linear expansion, and this displacement can be converted into an output signal.
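A rough sketch of the displacement such a design has to work with, using handbook expansion coefficients and illustrative dimensions:

```python
# Linear thermal expansion: delta_L = alpha * L0 * delta_T.
# The usable output of a rod-and-tube sensor is the difference between the
# tube's expansion and the (nearly zero) expansion of the low-expansion rod.
def expansion_m(alpha_per_c: float, length_m: float, delta_t_c: float) -> float:
    return alpha_per_c * length_m * delta_t_c

ALPHA_BRASS = 19e-6   # 1/°C, a typical handbook value for the tube
ALPHA_INVAR = 1.2e-6  # 1/°C, a low-expansion alloy often used for the rod

tube = expansion_m(ALPHA_BRASS, 0.5, 100.0)  # 0.5 m tube, 100 °C rise
rod = expansion_m(ALPHA_INVAR, 0.5, 100.0)
print(f"usable displacement: {(tube - rod) * 1000:.2f} mm")  # ~0.89 mm
```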
Sensors designed based on the deformation curves of liquids and gases
When the temperature changes, both liquids and gases change in volume accordingly.
Various mechanical structures can convert this volume change into a position change, producing a position-change output (via a potentiometer, a displacement-sensing element, a baffle, etc.).
Thermistor temperature sensor
Thermistors are made of semiconductor materials, most of which have a negative temperature coefficient, meaning their resistance decreases as temperature increases.
Temperature changes cause large resistance changes, making the thermistor the most sensitive type of temperature sensor. However, thermistors have very poor linearity and are highly dependent on the manufacturing process, and there are no standardized thermistor curves.
Thermistors are extremely small and respond quickly to temperature changes. However, they require a current source, and their small size makes them extremely sensitive to self-heating errors.
Thermistors measure absolute temperature with just two wires and offer good accuracy, but they are more expensive than thermocouples and have a narrower measurable temperature range. They are well suited to current-control applications that require fast, sensitive temperature measurement. Their small size is advantageous where space is constrained, but self-heating errors must be carefully guarded against.
Thermistors also have their own measurement requirements. Their small size is an advantage: they stabilize quickly and impose no significant thermal load on the measured object. It also makes them fragile, however. Because a thermistor is a resistive device, any excitation current dissipates power in it and heats it, so a small measuring current must be used; exposing a thermistor to excessive heat causes permanent damage.
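A common way to linearize an NTC thermistor in software is the Beta-model approximation; the nominal resistance and Beta value below are typical datasheet figures, assumed here for illustration:

```python
import math

def ntc_temperature_c(r_ohms: float, r0_ohms: float = 10_000.0,
                      t0_c: float = 25.0, beta: float = 3950.0) -> float:
    """Beta-model approximation for an NTC thermistor:
    1/T = 1/T0 + (1/beta) * ln(R/R0), with temperatures in kelvin.
    R0, T0 and beta are typical datasheet values, not standardized ones.
    """
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohms / r0_ohms) / beta
    return 1.0 / inv_t - 273.15

# Resistance falls as temperature rises (negative temperature coefficient):
print(round(ntc_temperature_c(10_000.0), 1))  # 25.0 at the nominal point
print(round(ntc_temperature_c(5_000.0), 1))   # ~41.5 °C
# Keep the excitation current small: power dissipated in the element
# (P = I**2 * R) self-heats it and biases the reading.
```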
Thermocouple temperature sensor
A thermocouple consists of two wires of different metals joined together at one end. When that end is heated, a potential difference appears across the thermocouple circuit, and the temperature can be calculated by measuring it. Because it requires two conductors of different materials, it is called a thermocouple. Thermocouples made of different materials serve different temperature ranges, and their sensitivities vary as well. The sensitivity of a thermocouple is the change in output potential difference when the temperature at the heated junction changes by 1°C. For most thermocouples made of metallic materials, this value is roughly 5 to 40 microvolts per degree Celsius (µV/°C).
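As a rough sketch of this sensitivity figure: with an assumed sensitivity (approximately a type-K value near room temperature) and a known reference-junction temperature, the measured EMF converts to temperature linearly. Real instruments use standardized per-type polynomial tables rather than this linear shortcut:

```python
# delta_T = V / S, where S is the sensitivity in µV/°C.
SENSITIVITY_UV_PER_C = 41.0  # roughly a type-K value, assumed for illustration

def hot_junction_c(emf_uv: float, reference_c: float) -> float:
    return reference_c + emf_uv / SENSITIVITY_UV_PER_C

# 4.1 mV measured against a 25 °C reference junction -> about 125 °C.
print(round(hot_junction_c(4100.0, 25.0)))  # ~125
```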
Because the sensitivity of thermocouple temperature sensors is independent of the material's thickness, temperature sensors can be made from very fine materials. Furthermore, due to the excellent ductility of the metal materials used to make thermocouples, these tiny temperature-sensing elements have extremely high response speeds, enabling them to measure rapidly changing processes.
Thermocouples are the most commonly used sensors in temperature measurement. Their main advantages are a wide temperature range and tolerance of various atmospheric environments; they are also robust, require no power supply, and are the cheapest option. While thermocouples are the simplest and most versatile temperature sensors, they are not suitable for high-accuracy measurement applications.
Types of humidity sensors
Humidity sensors have two types of humidity-sensitive elements: resistive and capacitive.
A humidity-sensitive resistor consists of a film of moisture-sensitive material coated on a substrate. When water vapor in the air is adsorbed onto the film, the element's resistivity and resistance change, and this characteristic can be used to measure humidity.
Humidity-sensitive capacitors are generally polymer thin-film capacitors; common polymer materials include polystyrene, polyimide, and cellulose acetate butyrate. When the ambient humidity changes, the dielectric constant of the capacitor changes, causing its capacitance to change as well; the change in capacitance is directly proportional to the relative humidity.
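If the capacitance-to-humidity relationship is treated as linear, two calibration points suffice to convert a reading; the capacitance values below are hypothetical, for illustration only:

```python
C_AT_0RH_PF = 160.0    # capacitance at 0 %RH (hypothetical)
C_AT_100RH_PF = 200.0  # capacitance at 100 %RH (hypothetical)

def rh_from_capacitance(c_pf: float) -> float:
    # Linear interpolation between the two calibration points.
    return 100.0 * (c_pf - C_AT_0RH_PF) / (C_AT_100RH_PF - C_AT_0RH_PF)

print(round(rh_from_capacitance(184.0), 1))  # 60.0 %RH
```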
Common humidity measurement methods include: dynamic methods (dual-pressure method, dual-temperature method, split-flow method), static methods (saturated salt method, sulfuric acid method), the dew point method, the wet-bulb and dry-bulb method, and various electronic sensor methods.
Dynamic measurement
The dual-pressure and dual-temperature methods are based on the thermodynamic P, V, T equilibrium principle and require a relatively long equilibration time. The split-flow method is based on precisely mixing absolutely moist and absolutely dry air. With modern measurement and control techniques, these devices can be made quite precise, but because the equipment is complex, expensive, and time-consuming to operate, they are mainly used for standard metrology, with a measurement accuracy of ±2%RH to ±1.5%RH.
Static measurement
The saturated salt method, a static method, is the most common and simplest approach to humidity measurement. However, it places very strict requirements on the liquid-gas phase equilibrium and on ambient temperature stability, and it takes a long time to reach equilibrium, even longer at low humidity points. In particular, when there is a large difference between the room humidity and the humidity inside the container, reaching equilibrium takes 6-8 hours each time the container is opened.
Dew point measurement
The dew point method measures the temperature at which moist air reaches saturation. It is a direct measurement grounded in thermodynamics, offering high accuracy and a wide measurement range. Precision dew point meters used for metrology can achieve an accuracy of ±0.2°C or better. However, modern chilled-mirror dew point meters based on photoelectric detection are expensive and are often used together with standard humidity generators.
Wet-bulb and dry-bulb measurement
The wet-bulb and dry-bulb method, dating from the 18th century, is a long-established and widely used way to measure humidity. It is an indirect method: humidity is calculated from the wet-and-dry-bulb equation, which holds only under certain conditions, notably that the wind speed past the wet bulb be at least 2.5 m/s. Common wet-and-dry-bulb thermometers relax this condition, so their accuracy is only 5-7%RH, significantly lower than that of electronic humidity sensors. Clearly, the wet-bulb and dry-bulb method is not a static method; one should not assume that improving the accuracy of the two thermometers alone improves the accuracy of the hygrometer.
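A sketch of the psychrometric calculation, combining the wet-and-dry-bulb equation with the Magnus approximation used earlier; the psychrometer coefficient below is a commonly quoted value for a ventilated (≥ 2.5 m/s) wet bulb and is an assumption here:

```python
import math

A_PSYCHROMETER = 6.62e-4  # 1/°C, a typical ventilated-psychrometer value

def saturation_vapor_pressure_hpa(t_celsius: float) -> float:
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def rh_from_psychrometer(t_dry_c: float, t_wet_c: float,
                         pressure_hpa: float = 1013.25) -> float:
    # Actual vapor pressure from the wet-and-dry-bulb equation:
    # e = e_s(T_wet) - A * P * (T_dry - T_wet)
    e = (saturation_vapor_pressure_hpa(t_wet_c)
         - A_PSYCHROMETER * pressure_hpa * (t_dry_c - t_wet_c))
    return 100.0 * e / saturation_vapor_pressure_hpa(t_dry_c)

# Dry bulb 20 °C, wet bulb 15 °C at standard pressure -> about 59 %RH.
print(round(rh_from_psychrometer(20.0, 15.0)))  # ~59
```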
Electronic humidity sensor
Electronic humidity sensors can achieve an accuracy of 2-3%RH, which is higher than that of wet-bulb and dry-bulb humidity measurements.
Electronic humidity sensors, however, have poorer linearity and contamination resistance. When measuring ambient humidity, the sensing element is exposed to the environment for long periods and is easily contaminated, which affects measurement accuracy and long-term stability. In this respect, the wet-bulb and dry-bulb method is superior.
Selection of temperature and humidity sensors
When selecting a temperature and humidity sensor, the following points should be noted:
① Select the measurement range
Similar to measuring weight and temperature, the first step in selecting a humidity sensor is to determine its measurement range. Aside from meteorological and research departments, those involved in temperature and humidity monitoring and control generally do not require full humidity range (0-100%RH) measurements.
② Select measurement accuracy
Measurement accuracy is the most important indicator of a humidity sensor; each percentage point of improvement is a significant step up, often into a different product grade. Because manufacturing cost and selling price rise sharply with the required accuracy, users should choose a sensor that suits their actual needs rather than blindly pursuing the highest specification. For example, when a humidity sensor is used at varying temperatures, the effect of temperature drift must be considered.
As is well known, relative humidity is a function of temperature, and temperature strongly affects the relative humidity within a given space: a temperature change of 0.1°C produces a humidity change (error) of roughly 0.5%RH. If it is difficult to keep the temperature constant in the application environment, demanding excessively high humidity measurement accuracy makes little sense. In most cases, if precise temperature control is unavailable or the measured space is not sealed, an accuracy of ±5%RH is sufficient.
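That sensitivity can be checked numerically: holding the actual water vapor content of the air fixed and shifting only the temperature reproduces a change of roughly 0.5%RH per 0.1°C (using the Magnus approximation from earlier; the coefficients are assumptions):

```python
import math

def saturation_vapor_pressure_hpa(t_celsius: float) -> float:
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# Hold the actual vapor pressure fixed (the air itself is unchanged) and
# nudge the temperature: the relative humidity reading shifts noticeably.
t0, rh0 = 20.0, 80.0
e = rh0 / 100.0 * saturation_vapor_pressure_hpa(t0)
rh_after = 100.0 * e / saturation_vapor_pressure_hpa(t0 + 0.1)
print(round(rh0 - rh_after, 2))  # ~0.49 %RH per 0.1 °C step
```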
For applications requiring precise control of temperature and humidity in a confined space, or real-time tracking of humidity changes, select a sensor with an accuracy of ±3%RH or better. Accuracy beyond ±2%RH is difficult even for the standard humidity generators used to calibrate sensors, let alone for the sensors themselves; achieving ±2%RH with a relative humidity instrument, even at 20-25°C, remains very challenging. Note that the characteristics given in product documentation are typically measured at room temperature (20°C ± 10°C) in clean air.
③ Consider time drift and temperature drift
In practical use, dust, oil, and harmful gases cause electronic humidity sensors to age, and their accuracy declines over time. The annual drift of electronic humidity sensors is generally around ±2%RH, sometimes more. Manufacturers typically state that a calibration is valid for one or two years, after which recalibration is required.
④ Other precautions
Humidity sensors are not hermetically sealed. To protect measurement accuracy and stability, avoid using them in atmospheres containing acids, alkalis, or organic solvents, and avoid dusty environments.
To accurately reflect the humidity of the space being measured, sensors should not be placed too close to walls or in poorly ventilated corners. If the room being measured is too large, multiple sensors should be placed.
Some humidity sensors have strict power supply requirements; an unsuitable supply can degrade measurement accuracy, cause sensors to interfere with one another, or even prevent them from working. When using such sensors, provide a power supply that meets the accuracy requirements stated in the technical specifications.
When sensors need to transmit signals over long distances, attention should be paid to signal attenuation. When the transmission distance exceeds 200m, it is recommended to use a humidity sensor with a current output signal.
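A sketch of reading such a current output, assuming a typical 4-20 mA loop spanning 0-100%RH (the span depends on the transmitter's configuration):

```python
# A typical current-loop transmitter maps its range linearly onto 4-20 mA;
# the received current converts back to the measured quantity as follows.
def rh_from_loop_current(i_ma: float) -> float:
    if not 4.0 <= i_ma <= 20.0:
        raise ValueError("reading outside 4-20 mA loop range")
    return (i_ma - 4.0) / 16.0 * 100.0

print(rh_from_loop_current(12.0))  # 50.0 %RH at mid-scale
```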
The future market for temperature and humidity sensors, especially in consumer electronics and the Internet of Things (IoT), holds great promise. Small-sized, low-power, low-cost, and highly integrated IC semiconductor temperature and humidity sensors will see wider adoption.