Remote Sensing Fundamentals
Remote sensing captures data about Earth's surface and atmosphere without direct contact. It works by detecting the electromagnetic radiation that objects reflect or emit (absorbed radiation shows up as what is missing from the detected signal). Because different materials interact with radiation in distinctive ways, scientists can identify and analyze features from orbit or aircraft. This technique underpins everything from deforestation tracking to air quality monitoring.
Electromagnetic Radiation and Spectral Signatures
The electromagnetic spectrum spans the full range of radiation wavelengths, from short-wavelength gamma rays to long-wavelength radio waves. Remote sensing typically works with visible light, infrared, and microwave portions of this spectrum.
Every material on Earth's surface interacts with electromagnetic radiation differently. Healthy vegetation, for example, strongly reflects near-infrared light while absorbing red and blue visible light. Water absorbs most infrared radiation. Bare soil reflects moderately across visible wavelengths. These unique patterns of reflection and absorption are called spectral signatures.
Spectral signatures are what make remote sensing so powerful. By comparing the radiation detected by a sensor against known spectral signatures, scientists can identify land cover types, mineral compositions, water conditions, and more without ever visiting the site.
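As a minimal sketch of signature matching, the snippet below compares a pixel's reflectance in four broad bands against reference signatures using the spectral angle (the angle between two spectra treated as vectors). The band values and reference spectra are invented for illustration, chosen to mirror the behavior described above; they are not real library data.

```python
import numpy as np

# Hypothetical reference signatures: mean reflectance in four broad bands
# (blue, red, near-infrared, shortwave infrared). Values are illustrative.
signatures = {
    "vegetation": np.array([0.05, 0.04, 0.50, 0.25]),  # absorbs red, reflects NIR
    "water":      np.array([0.08, 0.05, 0.02, 0.01]),  # absorbs most infrared
    "bare_soil":  np.array([0.15, 0.20, 0.25, 0.30]),  # moderate throughout
}

def classify_pixel(spectrum):
    """Label a pixel with the reference class whose spectral angle
    (angle between the two spectra treated as vectors) is smallest."""
    def angle(a, b):
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))
    return min(signatures, key=lambda name: angle(spectrum, signatures[name]))

print(classify_pixel(np.array([0.06, 0.05, 0.45, 0.22])))  # vegetation-like pixel
```

The spectral angle is a common choice here because it compares the *shape* of a spectrum rather than its overall brightness, so a shadowed vegetation pixel still matches the vegetation signature.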
Passive and Active Remote Sensing Techniques
Passive sensors detect naturally occurring radiation: either sunlight reflected off Earth's surface or thermal energy emitted by it. Multispectral and hyperspectral sensors fall into this category. Because they depend on an external energy source, passive sensors that measure reflected sunlight work only in daylight, while those that measure thermal emissions can operate day and night.
Active sensors generate their own energy pulse, send it toward a target, and then measure what bounces back. Radar and lidar are the two main types. Radar sends microwave pulses and is especially useful because microwaves penetrate clouds and work in darkness. Lidar sends laser pulses and excels at measuring elevation and vegetation structure.
The key distinction: passive sensors are like your eyes (they need a light source), while active sensors are like a flashlight (they provide their own).
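The timing principle behind active sensors can be shown in a few lines: because the sensor emits the pulse itself, the range to a target is simply half the round-trip travel time multiplied by the speed of light. This is a sketch; `range_from_echo` is a hypothetical helper, not part of any real instrument API.

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_echo(round_trip_s):
    """Active sensors (radar, lidar) time their own pulse: target range
    is half the round-trip travel time multiplied by the speed of light."""
    return C * round_trip_s / 2.0

# A lidar echo returning after 1 microsecond puts the target ~150 m away.
print(range_from_echo(1e-6))
```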
Sensor Characteristics
Spatial, Temporal, and Spectral Resolutions
Three types of resolution define what a sensor can detect and how useful its data will be for a given application.
- Spatial resolution is the size of the smallest feature a sensor can distinguish, measured as the area each pixel covers on the ground. The WorldView-3 satellite achieves 0.31 m resolution, meaning each pixel represents a patch of ground roughly 31 cm across. That's enough to identify individual trees or cars. Coarser-resolution sensors like MODIS (250 m to 1 km) cover much larger areas per pixel and are better suited for regional or global studies.
- Temporal resolution is how frequently a sensor revisits the same location. Geostationary weather satellites image the same area every 10–15 minutes, which is critical for tracking storms. Landsat satellites revisit every 16 days, which works well for slower processes like seasonal vegetation change. There's often a tradeoff: satellites with very high spatial resolution tend to have lower temporal resolution, and vice versa.
- Spectral resolution describes how many wavelength bands a sensor records and how narrow those bands are. A standard multispectral sensor might capture 4–10 broad bands. The Hyperion hyperspectral sensor recorded 220 narrow bands, allowing much finer discrimination between materials that look similar in just a few bands. Higher spectral resolution helps distinguish, say, different crop types or mineral compositions that broader bands would lump together.
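To make the spatial-resolution tradeoff concrete, here is a back-of-the-envelope comparison of how many pixels each sensor needs to cover the same 100 km × 100 km study area. The area size is an arbitrary choice; the WorldView-3 and MODIS figures are quoted above, and 30 m is Landsat's standard multispectral resolution.

```python
# Pixel counts for one 100 km x 100 km study area at each sensor's
# ground sample distance (GSD).
area_side_m = 100_000

gsd_m = {"WorldView-3": 0.31, "Landsat": 30, "MODIS": 250}
pixel_counts = {name: (area_side_m / gsd) ** 2 for name, gsd in gsd_m.items()}

for name, count in pixel_counts.items():
    print(f"{name:12s} {gsd_m[name]:>7} m  ->  {count:,.0f} pixels")
```

The five-orders-of-magnitude spread in pixel counts is exactly why fine-resolution sensors image narrow swaths and revisit less often: there is simply far more data per unit area to acquire, downlink, and store.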
Remote Sensing Applications
Land Cover Mapping and Change Detection
Remote sensing is one of the primary tools for mapping what covers Earth's surface and tracking how it changes over time.
Image classification assigns each pixel in a satellite image to a land cover category based on its spectral signature. Two main approaches exist:
- Supervised classification requires the analyst to provide training samples (areas of known land cover) so the algorithm learns what each class looks like spectrally, then classifies the rest of the image.
- Unsupervised classification groups pixels into clusters based on spectral similarity without prior training data. The analyst then identifies what each cluster represents.
These techniques produce land cover maps showing the distribution of forests, urban areas, agricultural fields, water bodies, and other surface types.
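As a sketch of the unsupervised approach, the snippet below runs a minimal k-means clustering over synthetic two-band pixel spectra. The data, cluster count, and band meanings are all illustrative assumptions; a real workflow would operate on full multispectral imagery.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-band "image": 200 pixel spectra drawn from two clusters,
# one water-like (low NIR) and one vegetation-like (high NIR).
water = rng.normal(loc=[0.05, 0.02], scale=0.01, size=(100, 2))
veg = rng.normal(loc=[0.05, 0.50], scale=0.02, size=(100, 2))
pixels = np.vstack([water, veg])

def kmeans(X, k, iters=20):
    """Minimal k-means: group pixels purely by spectral similarity."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest cluster centre.
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        # Recompute centres; keep the old centre if a cluster empties out.
        centers = np.array([
            X[labels == i].mean(axis=0) if np.any(labels == i) else centers[i]
            for i in range(k)
        ])
    return labels, centers

labels, centers = kmeans(pixels, k=2)
# The analyst now inspects each cluster's mean spectrum and names it:
# the cluster whose centre has high band-2 (NIR) reflectance is vegetation.
```

Note that the algorithm never sees class names; attaching "water" and "vegetation" labels to the clusters afterward is the analyst's job, which is the defining feature of the unsupervised approach.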
Change detection compares images of the same area taken at different times to identify where and how land cover has shifted. This is used to monitor deforestation rates, track urban expansion into surrounding landscapes, and assess damage from natural disasters like wildfires and floods. For example, comparing pre- and post-fire satellite imagery lets scientists rapidly map burn extent and severity.
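The wildfire example can be sketched with a differenced Normalized Burn Ratio (dNBR), a widely used burn-severity index. The 2×2 "scenes" and band values below are invented, and the 0.27 threshold is only one commonly cited cut-off for moderate burn severity, not a universal constant.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: high over healthy vegetation (strong NIR),
    low over burned ground (weak NIR, strong shortwave infrared)."""
    return (nir - swir) / (nir + swir)

# Invented 2x2 scenes: the right-hand column burns between the two dates.
pre_nir = np.array([[0.5, 0.5], [0.5, 0.5]])
pre_swir = np.array([[0.2, 0.2], [0.2, 0.2]])
post_nir = np.array([[0.5, 0.1], [0.5, 0.1]])
post_swir = np.array([[0.2, 0.4], [0.2, 0.4]])

dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)
burned = dnbr > 0.27  # one commonly cited cut-off for moderate severity
print(burned)  # right-hand column flagged as burned
```

Differencing an index rather than raw bands helps suppress changes that affect both dates equally, such as illumination differences between acquisitions.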
Ocean Color and Atmospheric Composition Monitoring
Ocean color remote sensing uses visible and near-infrared wavelengths to measure concentrations of phytoplankton (via chlorophyll), suspended sediments, and dissolved organic matter in surface waters. Phytoplankton concentration is a key indicator of ocean productivity. This data helps scientists track harmful algal blooms, assess coral reef health, monitor coastal water quality, and study how climate change affects marine ecosystems.
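A band-ratio chlorophyll retrieval can be sketched as follows: chlorophyll absorbs blue light, so the blue/green reflectance ratio falls as concentration rises, and a polynomial in the log of that ratio maps it to a concentration. Operational algorithms (NASA's OCx family) take this general form, but the coefficients below are placeholders, not the operational values.

```python
import math

def chlorophyll_mg_m3(rrs_blue, rrs_green, coeffs=(0.3, -2.7, 1.8)):
    """Band-ratio chlorophyll estimate. The polynomial coefficients here
    are PLACEHOLDERS for illustration, not NASA's tuned values."""
    r = math.log10(rrs_blue / rrs_green)  # log blue/green reflectance ratio
    return 10 ** sum(a * r ** i for i, a in enumerate(coeffs))

# Clear water (blue > green) yields a low estimate; productive,
# phytoplankton-rich water (green > blue) yields a high one.
print(chlorophyll_mg_m3(0.008, 0.004), chlorophyll_mg_m3(0.004, 0.008))
```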
Atmospheric composition monitoring uses remote sensing to measure the concentration and distribution of gases and aerosols in the atmosphere. NASA's Aura satellite, for instance, measures concentrations of trace gases such as ozone and nitrogen dioxide across the globe. These measurements are essential for studying air quality, tracking ozone layer recovery, and quantifying greenhouse gas emissions that drive climate change.