💧 Fluid Mechanics Unit 15 Review

15.2 Pressure and Temperature Measurements

Written by the Fiveable Content Team • Last updated August 2025

Pressure Measurement Devices

Pressure measurement is one of the most fundamental tasks in fluid mechanics and chemical engineering. Accurate pressure data drives everything from flow calculations to equipment safety, so understanding how each device works (and when to use it) matters a great deal.

Principles of pressure measurement devices

Manometers balance fluid pressure against a liquid column. The height difference in the liquid directly indicates the pressure difference, governed by the hydrostatic relation ΔP = ρgh.

  • U-tube manometer – A U-shaped tube filled with a manometric fluid (typically mercury or water). You apply pressure to one side, and the height difference h between the two legs gives you the pressure difference. Simple, no moving parts, and very reliable for moderate pressure ranges.
  • Inclined manometer – Same principle as the U-tube, but one leg is tilted at an angle. This spreads a small vertical height change over a longer readable length, giving you much better resolution for small pressure differences (common in HVAC and low-velocity airflow work).
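The hydrostatic relation and the inclined-scale idea can be sketched numerically. This is a minimal illustration; the mercury density and the 10° tilt below are assumed example values, not from the text:

```python
import math

G = 9.81                 # gravitational acceleration, m/s^2
RHO_MERCURY = 13_560.0   # approximate density of mercury, kg/m^3

def utube_dp(rho: float, h: float) -> float:
    """U-tube manometer: dP = rho * g * h, with h the leg height difference (m)."""
    return rho * G * h

def inclined_reading(h: float, angle_deg: float) -> float:
    """Inclined manometer: a vertical rise h appears as a scale length
    L = h / sin(theta), stretching small readings over a longer tube."""
    return h / math.sin(math.radians(angle_deg))

dp = utube_dp(RHO_MERCURY, 0.100)       # 100 mm of mercury -> about 13.3 kPa
scale = inclined_reading(0.002, 10.0)   # a 2 mm rise reads as ~11.5 mm at 10 degrees
```

The `1/sin(theta)` factor is the whole advantage of the inclined design: at 10° the readable length is nearly six times the vertical rise.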

Pressure gauges measure pressure through mechanical deformation of a sensing element, then translate that deformation into a dial reading.

  • Bourdon tube gauge – Contains a curved, hollow metal tube sealed at one end. When internal pressure increases, the tube tends to straighten. That motion is linked through a gear mechanism to a pointer on a calibrated dial. These are the most common industrial pressure gauges.
  • Diaphragm gauge – Uses a thin, flexible membrane that deflects under applied pressure. The deflection is measured mechanically or optically and converted to a pressure reading. Diaphragm gauges handle low pressures well and are a good choice when the process fluid is corrosive, since the diaphragm can act as an isolation barrier.

Pressure transducers convert pressure into an electrical signal, making them essential for automated process control and data logging.

  • Strain gauge transducer – A diaphragm deforms under pressure, and bonded strain gauges on its surface change electrical resistance proportionally. The resistance change is measured (usually via a Wheatstone bridge) and converted to a pressure value.
  • Capacitive transducer – A diaphragm sits between two fixed plates, forming a capacitor. Pressure deflects the diaphragm, changing the gap and therefore the capacitance. These offer excellent sensitivity and are common in precision applications.
  • Piezoelectric transducer – A piezoelectric crystal (e.g., quartz) generates an electrical charge when mechanically stressed by pressure. Because the response is very fast, these are best suited for dynamic pressure measurements like pressure pulsations, transient events, or combustion studies. They are not ideal for static pressure measurement since the charge dissipates over time.
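The strain-gauge principle can be sketched with the standard quarter-bridge formula. The 10 V excitation and gauge factor of 2.0 below are typical illustrative values, not specifics from the text:

```python
def quarter_bridge_output(v_excitation: float, gauge_factor: float, strain: float) -> float:
    """Quarter Wheatstone bridge with one active gauge:
    Vout ≈ Vex * GF * strain / 4 (small-strain approximation)."""
    return v_excitation * gauge_factor * strain / 4.0

# 500 microstrain on a GF = 2.0 gauge with 10 V excitation -> 2.5 mV
v_out = quarter_bridge_output(10.0, 2.0, 500e-6)
```

The millivolt-level output is why transducers need amplification and signal conditioning before the reading is usable by a control system.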

Temperature Measurement Techniques

Temperature measurement in fluid systems tells you about energy content, reaction conditions, and thermal behavior. The three most common approaches each have distinct strengths.

Thermocouples are the workhorses of industrial temperature measurement. They consist of two wires made of dissimilar metals, joined at one end (the hot junction).

  • A temperature difference between the hot junction (in the process) and the reference junction (cold junction, at a known temperature) generates a small voltage. This is the Seebeck effect.
  • That voltage is measured and converted to temperature using standard thermocouple reference tables or polynomial calibration equations.
  • Different thermocouple types cover different ranges and environments:
    • Type K (chromel-alumel): general purpose, roughly −200°C to +1250°C
    • Type J (iron-constantan): good for reducing atmospheres, up to about +760°C
    • Type T (copper-constantan): excellent accuracy at low temperatures, roughly −200°C to +350°C
  • Thermocouples are inexpensive, rugged, and cover a very wide temperature range, but their accuracy (typically ±1–2°C) is lower than that of RTDs.
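Cold-junction compensation can be sketched with a linear model. The constant Seebeck coefficient below is a rough Type K average and an assumption for illustration; real instruments use the NIST polynomial reference tables rather than a single constant:

```python
SEEBECK_K = 41e-6  # rough average Seebeck coefficient for Type K, V/°C (assumed)

def thermocouple_temp(v_measured: float, t_reference: float) -> float:
    """Add back the EMF 'missing' at the reference junction, then invert E = S*T.
    Linear approximation only; accuracy degrades away from mid-range temperatures."""
    e_total = v_measured + SEEBECK_K * t_reference
    return e_total / SEEBECK_K

t = thermocouple_temp(4.1e-3, 25.0)  # 4.1 mV with a 25 °C reference -> 125 °C
```

The key point the sketch captures: the raw voltage alone is meaningless until the reference-junction temperature is folded in.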

Resistance Temperature Detectors (RTDs) measure temperature through the change in electrical resistance of a metal element, most commonly platinum (Pt100 or Pt1000 sensors).

  • Resistance increases with temperature in a predictable, nearly linear fashion.
  • A Wheatstone bridge circuit is commonly used to measure the small resistance changes accurately. Three-wire and four-wire configurations help eliminate lead-wire resistance errors.
  • RTDs offer high accuracy (±0.1°C or better) and excellent long-term stability, making them the preferred choice when precision matters. Their main limitations are slower response time and higher cost compared to thermocouples.
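For a Pt100, the resistance-to-temperature inversion above 0 °C can be sketched with the standard Callendar-Van Dusen coefficients (IEC 60751 nominal values shown; check your sensor's datasheet for its actual tolerance class):

```python
import math

R0 = 100.0      # Pt100 resistance at 0 °C, ohms
A = 3.9083e-3   # Callendar-Van Dusen coefficient, 1/°C (IEC 60751)
B = -5.775e-7   # Callendar-Van Dusen coefficient, 1/°C^2 (IEC 60751)

def pt100_temperature(resistance: float) -> float:
    """Invert R(T) = R0 * (1 + A*T + B*T^2) via the quadratic formula.
    Valid for T >= 0 °C (below 0 °C the standard adds a cubic term)."""
    c = 1.0 - resistance / R0
    return (-A + math.sqrt(A * A - 4.0 * B * c)) / (2.0 * B)

t = pt100_temperature(138.51)  # nominal Pt100 resistance near 100 °C
```

The small negative B term is why the Pt100 curve is only "nearly" linear, as noted above: a purely linear fit drifts by a few tenths of a degree across a wide span.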

Infrared (IR) thermometers measure temperature by detecting the infrared radiation emitted by an object's surface.

  • This is a non-contact technique, which makes it ideal for measuring moving objects, electrically live components, or surfaces in hazardous environments where you can't install a probe.
  • Accurate readings require knowing the emissivity of the target surface. Emissivity is the ratio of energy radiated by the object to that radiated by a perfect blackbody at the same temperature. It varies with material, surface finish, and temperature. A polished metal surface, for example, has low emissivity and will give misleadingly low readings if you don't compensate.
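The emissivity error can be sketched with the Stefan-Boltzmann law. This is a simplified model that neglects reflected background radiation and assumes the instrument was left at a blackbody setting of ε = 1:

```python
def emissivity_corrected_temp(t_apparent_kelvin: float, emissivity: float) -> float:
    """If the instrument assumes a blackbody, eps * T_true^4 = T_apparent^4,
    so the true surface temperature is T_apparent / eps^(1/4)."""
    return t_apparent_kelvin / emissivity ** 0.25

# A low-emissivity surface (eps ~ 0.5) reading 300 K is actually ~357 K
t_true = emissivity_corrected_temp(300.0, 0.5)
```

The fourth-root dependence means even a large emissivity error produces a moderate but systematic temperature error, always biased low for shiny surfaces.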

Accuracy factors in measurements

Getting a good sensor isn't enough on its own. How you install, calibrate, and protect it determines whether your data is trustworthy.

Calibration should be performed regularly against known standards traceable to national or international references (e.g., NIST). Without periodic calibration, sensor drift can introduce systematic errors that go unnoticed.

Environmental factors can corrupt both pressure and temperature readings:

  • Temperature swings cause thermal expansion or contraction of pressure-sensing elements. Temperature-compensated transducers include internal correction circuits to minimize this effect.
  • Ambient temperature fluctuations influence temperature sensors themselves. Proper insulation and shielding of sensor leads and connections reduce this impact.

Sensor placement and installation are critical for getting measurements that actually represent the process:

  • Pressure sensors should be located away from flow disturbances like bends, valves, or partially open fittings, which create localized pressure variations that don't reflect the bulk flow.
  • Temperature sensors should be in direct contact with the process fluid. When a thermowell is used (a protective pocket welded into the pipe), adequate immersion depth is necessary to minimize stem conduction errors, where heat travels along the sensor body and biases the reading toward ambient temperature.

Harsh process conditions also matter. Extreme temperatures, high pressures, and corrosive or abrasive media degrade sensor performance over time. Selecting appropriate materials (e.g., Hastelloy or tantalum for corrosive service) and ensuring the sensor's rated operating range covers your process conditions are essential for long-term reliability.

Signal integrity is the final link in the chain. Proper shielding, grounding, and signal conditioning (amplification and filtering) prevent electrical noise from degrading your measurement. Digital communication protocols like HART or Fieldbus further improve signal reliability over long cable runs.
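As one small piece of the conditioning chain, digital filtering of a sampled signal can be sketched with a trailing moving average. This hypothetical minimal example illustrates the idea of smoothing noise after acquisition; it is not a substitute for proper analog filtering and shielding:

```python
def moving_average(samples: list[float], window: int) -> list[float]:
    """Smooth a sampled signal with a simple trailing moving average.
    Early points use a shorter window until enough history accumulates."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

noisy = [1.0, 1.2, 0.8, 1.1, 0.9, 1.0]
smooth = moving_average(noisy, 3)
```

The trade-off is the usual one in signal conditioning: a wider window rejects more noise but slows the measured response to real process changes.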

Selection of measurement devices

Choosing the right pressure or temperature device involves balancing several factors. Here's a structured approach:

  1. Assess process fluid properties

    • Chemical compatibility of wetted sensor materials with the fluid
    • Operating temperature and pressure ranges
    • Presence of particulates, bubbles, or contaminants that could foul or damage the sensor
  2. Define accuracy and precision requirements

    • Determine what measurement uncertainty is acceptable for your application
    • Consider how measurement errors propagate into process control decisions and product quality
  3. Evaluate environmental and process conditions

    • Ambient temperature and pressure fluctuations at the installation point
    • Vibration, mechanical shock, or other stresses
    • Corrosive, abrasive, or hazardous environments may require special alloys, coatings, or explosion-proof housings
  4. Check installation constraints

    • Available physical space for the sensor and any associated fittings
    • Accessibility for routine maintenance and calibration
    • Whether remote monitoring or wireless communication is needed
  5. Weigh cost against lifecycle needs

    • Initial purchase cost
    • Long-term calibration and maintenance expenses
    • Expected sensor lifespan under your specific process conditions
  6. Consult industry standards

    • ASME PTC 19.2 for pressure measurement best practices
    • ASTM E230 for thermocouple reference tables and tolerances
    • Regulatory requirements for hazardous areas (e.g., ATEX, IECEx) or food/pharmaceutical processing (e.g., 3-A sanitary standards)