The electromagnetic spectrum is crucial for geospatial engineering, encompassing all frequencies of electromagnetic radiation. Understanding its properties and interactions is essential for remote sensing applications, from visible light to microwaves and radio waves.

Remote sensing principles involve active and passive sensors, various resolution types, and spectral signatures of features. These concepts form the foundation for capturing, analyzing, and interpreting Earth's surface data, enabling applications in environmental monitoring, urban planning, and resource management.

Electromagnetic spectrum overview

  • The electromagnetic spectrum encompasses all frequencies of electromagnetic radiation, ranging from low-frequency radio waves to high-frequency gamma rays
  • Understanding the properties and interactions of different regions of the electromagnetic spectrum is crucial for geospatial engineering applications, such as remote sensing and satellite imaging

Wavelength, frequency and energy

  • Electromagnetic radiation can be characterized by its wavelength, frequency, and energy
  • Wavelength is the distance between two consecutive crests or troughs of a wave, typically measured in meters or nanometers
  • Frequency refers to the number of wave cycles that pass a fixed point per unit of time, usually expressed in hertz (Hz)
  • Energy is inversely proportional to wavelength; shorter wavelengths have higher energy, while longer wavelengths have lower energy
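
These quantities are tied together by c = λν and E = hc/λ. The short Python sketch below illustrates the relations for a single wavelength; the 550 nm green-light value is just an illustrative input.

```python
# A minimal sketch of the wave relations c = lambda * nu and E = h * c / lambda.
# Constants are standard physical values; the 550 nm example wavelength is illustrative.
C = 3.0e8        # speed of light, m/s
H = 6.626e-34    # Planck's constant, J*s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency from wavelength: nu = c / lambda."""
    return C / wavelength_m

def photon_energy_j(wavelength_m: float) -> float:
    """Photon energy from wavelength: E = h * c / lambda."""
    return H * C / wavelength_m

green = 550e-9   # green visible light, ~550 nm
print(frequency_hz(green))     # ~5.5e14 Hz
print(photon_energy_j(green))  # ~3.6e-19 J (shorter wavelengths carry more energy)
```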

Visible light spectrum

  • The visible light spectrum is the portion of the electromagnetic spectrum that is detectable by the human eye, ranging from approximately 380 to 700 nanometers
  • Visible light consists of colors ranging from violet (shortest wavelength) to red (longest wavelength)
  • Remote sensing applications often utilize the visible spectrum for capturing color imagery and analyzing surface features (vegetation, water bodies, urban areas)

Infrared and thermal radiation

  • Infrared radiation has longer wavelengths than visible light, ranging from about 700 nanometers to 1 millimeter
  • Infrared radiation is divided into three regions: near-infrared (NIR), mid-infrared (MIR), and far-infrared (FIR)
  • Thermal radiation is a subset of infrared radiation emitted by objects due to their temperature
  • Infrared and thermal remote sensing are used for applications such as vegetation health monitoring, heat loss detection, and wildfire mapping

Microwave and radio waves

  • Microwaves and radio waves have the longest wavelengths in the electromagnetic spectrum, ranging from about 1 millimeter to several kilometers
  • These wavelengths are used in radar remote sensing, which can penetrate clouds, vegetation, and soil to some extent
  • Applications include soil moisture estimation, ocean surface monitoring, and terrain mapping using synthetic aperture radar (SAR)

Remote sensing principles

Active vs passive sensors

  • Remote sensing systems can be classified as either active or passive sensors
  • Active sensors emit their own energy and measure the returned signal; examples include radar and lidar systems
  • Passive sensors detect naturally available energy, such as reflected sunlight or emitted thermal radiation; examples include multispectral and hyperspectral sensors

Resolution types in remote sensing

  • Spatial resolution refers to the smallest distinguishable feature in an image, determined by the sensor's instantaneous field of view (IFOV); a rough footprint calculation follows this list
  • Spectral resolution is the number and width of spectral bands a sensor can capture; higher spectral resolution enables better discrimination of surface features
  • Temporal resolution is the revisit time of a sensor over the same area, important for monitoring changes over time
  • Radiometric resolution is the sensor's ability to distinguish between small differences in energy; higher radiometric resolution provides more detailed information
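
As a rough illustration of how spatial resolution relates to sensor geometry, the sketch below approximates the ground footprint of one detector element from platform altitude and IFOV. The nadir-looking small-angle form and the example altitude and IFOV values are simplifying assumptions, not the parameters of any specific sensor.

```python
# A rough sketch relating spatial resolution to viewing geometry.
# Assumes a nadir-looking sensor and the small-angle approximation;
# the altitude and IFOV values below are illustrative only.
def ground_sample_distance(altitude_m: float, ifov_rad: float) -> float:
    """Approximate ground footprint of one detector element."""
    return altitude_m * ifov_rad

print(ground_sample_distance(705_000, 4.25e-5))  # ~30 m, roughly Landsat-like
```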

Spectral signatures of features

  • Different surface features (vegetation, water, soil, urban areas) have unique spectral patterns across the electromagnetic spectrum
  • These patterns, known as spectral signatures, allow for the identification and classification of features in remotely sensed imagery
  • Factors influencing spectral signatures include the physical and chemical properties of the feature, as well as environmental conditions (illumination, atmospheric effects)

Atmospheric effects on remote sensing

  • The atmosphere can significantly influence the energy reaching the sensor, leading to distortions in the recorded data
  • Atmospheric effects include absorption, scattering, and refraction of electromagnetic radiation
  • Atmospheric correction methods are applied to remove or minimize these effects and improve the accuracy of remote sensing data

Interaction of EMR with earth's surface

Absorption, transmission and reflection

  • When electromagnetic radiation (EMR) interacts with the Earth's surface, it can be absorbed, transmitted, or reflected
  • Absorption occurs when the energy is taken in by the surface material and converted into heat
  • Transmission happens when the EMR passes through the material without significant attenuation
  • Reflection is the process by which EMR is redirected back from the surface; the amount and direction of reflection depend on the surface properties
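
For a given wavelength, these three processes partition the incident energy, so absorptance, transmittance, and reflectance sum to one. The tiny sketch below simply checks that balance with illustrative, made-up values.

```python
# A minimal sketch of the energy balance for EMR incident on a surface:
# absorptance + transmittance + reflectance = 1 at a given wavelength.
# The example values are illustrative, not measured.
absorptance = 0.55
transmittance = 0.05
reflectance = 1.0 - absorptance - transmittance  # 0.40

assert abs(absorptance + transmittance + reflectance - 1.0) < 1e-9
print(reflectance)
```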

Factors affecting surface reflectance

  • Surface roughness: Smooth surfaces tend to have specular reflection (mirror-like), while rough surfaces exhibit diffuse reflection (scattered in various directions)
  • Moisture content: Wet surfaces generally have lower reflectance than dry surfaces, especially in the near-infrared and mid-infrared regions
  • Viewing and illumination geometry: The angle between the sun, the surface, and the sensor affects the observed reflectance
  • Wavelength: Surface features have different reflectance characteristics at different wavelengths, leading to unique spectral signatures

Spectral reflectance curves

  • Spectral reflectance curves represent the reflectance of a surface feature across different wavelengths
  • These curves are used to identify and distinguish different surface types based on their unique reflectance patterns
  • Examples of distinctive spectral reflectance curves include:
    • Vegetation: High reflectance in the near-infrared, low reflectance in the red (due to chlorophyll absorption), and a sharp increase in reflectance from red to near-infrared (known as the "red edge")
    • Water: Low reflectance in the near-infrared and beyond, with varying reflectance in the visible spectrum depending on water depth, clarity, and suspended sediments
    • Soil: Reflectance varies with soil type, moisture content, and organic matter, generally increasing with wavelength in the visible and near-infrared regions
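
One practical way these reflectance differences are exploited is a band ratio such as the Normalized Difference Vegetation Index, NDVI = (NIR - Red) / (NIR + Red), which captures the vegetation red edge. The sketch below computes it for a few illustrative reflectance values standing in for real image bands.

```python
import numpy as np

# A minimal sketch of NDVI from red and near-infrared reflectance.
# The arrays are illustrative stand-ins for real image bands.
red = np.array([0.04, 0.08, 0.20])   # healthy vegetation reflects little red light
nir = np.array([0.45, 0.40, 0.25])   # and reflects strongly in the near-infrared

ndvi = (nir - red) / (nir + red)
print(ndvi)  # values near +0.8 suggest dense vegetation; values near 0 suggest bare soil
```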

Multispectral and hyperspectral sensing

Multispectral sensor characteristics

  • Multispectral sensors capture data in several distinct spectral bands, typically ranging from 3 to 15 bands
  • These bands are strategically selected to cover key regions of the electromagnetic spectrum for differentiating surface features
  • Examples of multispectral sensors include Landsat OLI (Operational Land Imager) and Sentinel-2 MSI (MultiSpectral Instrument)
  • Multispectral data is widely used for land cover classification, vegetation monitoring, and urban planning

Hyperspectral imaging concepts

  • Hyperspectral sensors capture data in numerous narrow, contiguous spectral bands (often more than 100 bands)
  • This high spectral resolution allows for the detection of subtle differences in surface features and materials
  • Hyperspectral data is often represented as a data cube, with two spatial dimensions and one spectral dimension
  • Processing hyperspectral data involves techniques such as dimensionality reduction (PCA, MNF), spectral unmixing, and target detection
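
As a rough sketch of one of these processing steps, the snippet below applies PCA-based dimensionality reduction to a hyperspectral data cube using scikit-learn. The random cube and the choice of 10 components are illustrative placeholders for real data and a real analysis decision.

```python
import numpy as np
from sklearn.decomposition import PCA

# A minimal sketch of dimensionality reduction on a hyperspectral data cube.
# Assumes a cube of shape (rows, cols, bands); random values stand in for real data.
rows, cols, bands = 100, 100, 200
cube = np.random.rand(rows, cols, bands)

# Flatten the spatial dimensions so each pixel is one sample with `bands` features.
pixels = cube.reshape(-1, bands)

# Keep the first few principal components, which typically carry most of the variance.
pca = PCA(n_components=10)
reduced = pca.fit_transform(pixels)             # shape: (rows*cols, 10)
reduced_cube = reduced.reshape(rows, cols, 10)  # back to an image-like layout
print(pca.explained_variance_ratio_[:3])
```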

Applications of hyperspectral data

  • Mineral exploration: Identifying specific mineral compositions based on their unique spectral signatures
  • Precision agriculture: Monitoring crop health, nutrient deficiencies, and water stress at a detailed level
  • Environmental monitoring: Detecting and mapping pollutants, oil spills, and invasive species
  • Military and defense: Identifying camouflaged targets and assessing battlefield conditions

Microwave and radar remote sensing

Radar basics and functionality

  • Radar (Radio Detection and Ranging) is an active remote sensing technique that uses microwave frequencies to detect and measure target properties
  • A radar system emits a microwave pulse and records the backscattered signal from the target surface
  • The strength and time delay of the returned signal provide information about the target's distance, size, and surface characteristics
  • Radar can penetrate clouds, vegetation, and soil to some extent, making it useful for all-weather and day-night imaging
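
The distance to a target follows directly from the two-way travel time of the pulse, R = c·t/2. The sketch below applies that relation to an illustrative delay value.

```python
# A minimal sketch of radar ranging from pulse time delay: R = c * t / 2
# (the factor of 2 accounts for the round trip). The delay value is illustrative.
C = 3.0e8  # speed of light, m/s

def slant_range_m(delay_s: float) -> float:
    return C * delay_s / 2.0

print(slant_range_m(5.3e-3))  # ~795 km, a plausible slant range for a spaceborne radar
```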

Synthetic Aperture Radar (SAR)

  • SAR is a specialized radar technique that uses the motion of the sensor platform to simulate a larger antenna, resulting in higher spatial resolution
  • SAR systems emit a series of microwave pulses and record the backscattered signals along the flight path
  • The recorded signals are processed using complex algorithms to generate high-resolution imagery of the target area
  • SAR data is used for applications such as terrain mapping, surface deformation monitoring, and ocean surface imaging

Interferometric SAR (InSAR) techniques

  • InSAR involves comparing two or more SAR images of the same area taken at different times to detect surface changes or deformations
  • By analyzing the phase differences between the SAR images, InSAR can measure small-scale ground displacements (millimeter to centimeter level)
  • InSAR is used for monitoring geophysical processes such as earthquakes, volcanic activity, landslides, and subsidence
  • Advanced InSAR techniques, such as Persistent Scatterer Interferometry (PSI) and Small Baseline Subset (SBAS), can improve the accuracy and reliability of deformation measurements
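
A simplified version of the phase-to-displacement conversion is d = λ·Δφ/(4π) along the radar line of sight. The sketch below evaluates it for an assumed C-band wavelength of about 5.6 cm and an illustrative quarter-cycle phase change; real InSAR processing also involves steps such as phase unwrapping and atmospheric correction.

```python
import math

# A minimal sketch of converting an interferometric phase difference to a
# line-of-sight displacement: d = lambda * delta_phi / (4 * pi).
# The C-band wavelength and phase value are illustrative assumptions.
wavelength_m = 0.056          # ~5.6 cm, C-band
delta_phi_rad = math.pi / 2   # a quarter-cycle phase change

displacement_m = wavelength_m * delta_phi_rad / (4 * math.pi)
print(displacement_m * 1000)  # ~7 mm of line-of-sight motion
```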

Thermal remote sensing

Thermal infrared region characteristics

  • The thermal infrared region of the electromagnetic spectrum ranges from approximately 8 to 14 micrometers
  • Objects emit thermal infrared radiation based on their temperature and emissivity (the ability to emit energy relative to a perfect blackbody)
  • Thermal remote sensing captures the emitted thermal infrared radiation to determine surface temperature and thermal properties
  • Factors affecting thermal infrared measurements include atmospheric conditions, surface emissivity, and sensor characteristics

Emissivity and Kirchhoff's law

  • Emissivity is a measure of a material's ability to emit thermal infrared radiation compared to a perfect blackbody (which has an emissivity of 1)
  • Kirchhoff's law states that, for a given wavelength and temperature, the emissivity of a surface equals its absorptivity (the fraction of incident energy absorbed)
  • For a given kinetic temperature, materials with high emissivity (water, vegetation) emit more thermal radiation and appear warmer in thermal infrared imagery, while materials with low emissivity (polished metals) emit less and appear cooler than their true temperature
  • Accounting for emissivity variations is essential for accurate temperature retrieval and thermal infrared image interpretation
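
A simple way to see why emissivity matters is the Stefan-Boltzmann-based correction T_kinetic ≈ T_brightness / ε^0.25. The sketch below applies it with illustrative values.

```python
# A minimal sketch of a simple emissivity correction using the Stefan-Boltzmann
# relation: T_kinetic = T_brightness / emissivity**0.25.
# The brightness temperature and emissivity values are illustrative.
def kinetic_temperature_k(brightness_temp_k: float, emissivity: float) -> float:
    return brightness_temp_k / emissivity ** 0.25

print(kinetic_temperature_k(295.0, 0.96))  # ~298 K: ignoring emissivity understates temperature
```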

Thermal sensor types and applications

  • Thermal sensors can be classified as either uncooled or cooled, based on their detector technology
  • Uncooled thermal sensors (microbolometers) are more compact and affordable but have lower sensitivity and resolution compared to cooled sensors
  • Cooled thermal sensors (HgCdTe, InSb) offer higher sensitivity and resolution but require cryogenic cooling, making them more expensive and complex
  • Applications of thermal remote sensing include:
    • Urban heat island mapping and energy efficiency studies
    • Wildfire detection and monitoring
    • Volcanic activity and geothermal exploration
    • Agricultural water stress and irrigation management

Remote sensor platforms

Airborne and UAV-based sensors

  • Airborne remote sensing involves mounting sensors on aircraft (planes, helicopters) to collect high-resolution data over specific areas
  • Unmanned Aerial Vehicles (UAVs) or drones have become increasingly popular for remote sensing applications due to their flexibility, low cost, and high spatial resolution
  • Airborne and UAV-based sensors can include multispectral, hyperspectral, thermal, and lidar systems
  • These platforms are useful for small-scale, high-resolution mapping, emergency response, and precision agriculture

Satellite remote sensing systems

  • Satellite remote sensing involves placing sensors on orbiting satellites to collect data over large areas of the Earth's surface
  • Satellites can be classified based on their orbit type (geostationary, polar, sun-synchronous) and altitude (low, medium, high)
  • Examples of satellite remote sensing systems include:
    • Landsat series (Multispectral)
    • Sentinel series (Multispectral, SAR)
    • MODIS (Moderate Resolution Imaging Spectroradiometer)
    • WorldView series (High-resolution multispectral)
  • Satellite remote sensing is used for global monitoring, climate change studies, and large-scale mapping applications

Comparison of sensor platforms

  • Airborne and UAV-based sensors offer higher spatial resolution and flexibility compared to satellite systems but have limited coverage and higher operational costs
  • Satellite systems provide consistent, large-scale coverage and long-term data continuity but have coarser spatial resolution and less flexible revisit schedules than airborne and UAV-based sensors
  • The choice of sensor platform depends on the specific application, required spatial and temporal resolution, and available resources
  • Integrating data from multiple platforms (satellite, airborne, UAV) can provide a more comprehensive understanding of the Earth's surface and processes

Remote sensing data processing

Radiometric and atmospheric corrections

  • Radiometric corrections aim to convert the raw digital numbers (DNs) recorded by the sensor into physically meaningful units (radiance or reflectance)
  • Radiometric corrections account for sensor calibration, sun angle, and topographic effects to ensure consistent and comparable measurements across different scenes and time periods
  • Atmospheric corrections remove or minimize the effects of atmospheric absorption and scattering on the recorded signal
  • Common atmospheric correction methods include:
    • Dark Object Subtraction (DOS)
    • Fast Line-of-sight Atmospheric Analysis of Hypercubes (FLAASH)
    • Second Simulation of a Satellite Signal in the Solar Spectrum (6S)
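
A minimal sketch of the radiometric side of this workflow is shown below: scaling digital numbers to at-sensor radiance with a gain/offset model, then converting radiance to top-of-atmosphere reflectance. The gain, offset, solar irradiance, Earth-Sun distance, and sun elevation are illustrative values, not published calibration constants for any particular sensor.

```python
import math

# A minimal sketch of radiometric conversion: DN -> at-sensor radiance -> TOA reflectance.
# All coefficients below are illustrative assumptions.
def dn_to_radiance(dn: float, gain: float, offset: float) -> float:
    """Linear calibration: radiance = gain * DN + offset."""
    return gain * dn + offset

def toa_reflectance(radiance: float, esun: float, earth_sun_dist_au: float,
                    sun_elevation_deg: float) -> float:
    """Top-of-atmosphere reflectance = pi * L * d^2 / (ESUN * cos(sun zenith))."""
    sun_zenith = math.radians(90.0 - sun_elevation_deg)
    return (math.pi * radiance * earth_sun_dist_au ** 2) / (esun * math.cos(sun_zenith))

radiance = dn_to_radiance(dn=120, gain=0.05, offset=1.0)
print(toa_reflectance(radiance, esun=1550.0, earth_sun_dist_au=1.0, sun_elevation_deg=45.0))
```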

Geometric corrections and orthorectification

  • Geometric corrections address distortions in the remotely sensed imagery caused by sensor characteristics, platform motion, and Earth's curvature and rotation
  • These corrections involve transforming the image coordinates to a standard map projection and coordinate system
  • Orthorectification is a specific type of geometric correction that removes terrain-induced distortions using a digital elevation model (DEM)
  • Orthorectified imagery has a constant scale and can be used for accurate measurements and mapping applications

Image enhancement techniques

  • Image enhancement techniques improve the visual interpretation and analysis of remotely sensed data
  • Common image enhancement techniques include:
    • Contrast stretching: Adjusting the range of pixel values to improve contrast and highlight features of interest
    • Histogram equalization: Redistributing pixel values to achieve a more balanced distribution and enhance local contrast
    • Spatial filtering: Applying filters (low-pass, high-pass, edge detection) to emphasize or suppress specific spatial frequencies and features
    • Band ratios and indices: Combining spectral bands to highlight specific surface properties (vegetation indices, mineral indices, water indices)
  • Image enhancement techniques are used to prepare data for visual interpretation, feature extraction, and classification tasks
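
The sketch below illustrates a linear contrast stretch and a simple histogram equalization on a single band with NumPy; the random 8-bit image and the 2nd/98th percentile limits are illustrative choices.

```python
import numpy as np

# A minimal sketch of two common enhancement techniques on a single-band image.
# The random image stands in for a real band.
image = np.random.randint(40, 180, size=(256, 256)).astype(np.float64)

# Linear contrast stretch: map the 2nd-98th percentile range onto 0-255.
lo, hi = np.percentile(image, (2, 98))
stretched = np.clip((image - lo) / (hi - lo), 0, 1) * 255

# Histogram equalization: remap values by their cumulative distribution.
hist, bins = np.histogram(image.ravel(), bins=256, range=(0, 255))
cdf = hist.cumsum() / hist.sum()
equalized = np.interp(image.ravel(), bins[:-1], cdf * 255).reshape(image.shape)

print(stretched.min(), stretched.max(), equalized.min(), equalized.max())
```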

Remote sensing data interpretation

Elements of visual image interpretation

  • Visual image interpretation involves analyzing remotely sensed imagery based on key elements such as:
    • Tone or color: The brightness or hue of features in the image
    • Texture: The spatial arrangement and variation of tones or colors
    • Pattern: The repetitive arrangement of features or tones
    • Shape: The form or outline of individual features
    • Size: The relative dimensions of features
    • Shadow: The dark areas cast by elevated features, providing information about height and structure
    • Association: The relationship between features and their surroundings
  • Interpreters use these elements in combination with their knowledge of the study area and the application domain to extract meaningful information from the imagery

Digital image classification methods

  • Digital image classification involves assigning pixels or groups of pixels to specific land cover or land use classes based on their spectral and/or spatial properties
  • Supervised classification methods require training data (known samples of each class) to develop a classification model, which is then applied to the entire image
  • Unsupervised classification methods cluster pixels based on their inherent spectral similarities without prior knowledge of the classes
  • Object-based image analysis (OBIA) considers the spatial context and relationships between pixels, segmenting the image into homogeneous objects before classification
  • Machine learning algorithms (Random Forests, Support Vector Machines, Deep Learning) have become increasingly popular for image classification due to their ability to handle complex data and improve accuracy
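
A minimal sketch of a supervised, pixel-based workflow with a Random Forest classifier is shown below. The random spectra and labels stand in for real training samples (pixel band values paired with known land cover classes), and the band and class counts are arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# A minimal sketch of supervised image classification with a machine-learning classifier.
# Random features and labels stand in for real training data.
n_samples, n_bands, n_classes = 500, 6, 4
X = np.random.rand(n_samples, n_bands)          # pixel spectra (features)
y = np.random.randint(0, n_classes, n_samples)  # known class labels (training data)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

predicted = clf.predict(X_test)  # in practice, applied to every pixel in the image
print(clf.score(X_test, y_test))
```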

Accuracy assessment of classifications

  • Accuracy assessment evaluates the quality and reliability of the classified image by comparing it to reference data (ground truth)
  • An error matrix (confusion matrix) is used to summarize the agreement between the classified image and the reference data
  • Key accuracy metrics derived from the error matrix include:
    • Overall accuracy: The proportion of correctly classified pixels across all classes
    • Producer's accuracy: The proportion of reference pixels correctly classified for each class (a measure of omission error)
    • User's accuracy: The proportion of classified pixels that actually belong to each class (a measure of commission error)
    • Kappa coefficient: A measure of agreement between the classified image and the reference data, accounting for chance agreement
  • Accuracy assessment helps users understand the strengths and limitations of the classified image and guides improvements in the classification process
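
The sketch below derives these metrics from a small, illustrative error matrix (rows = classified classes, columns = reference classes).

```python
import numpy as np

# A minimal sketch of accuracy metrics from an error (confusion) matrix.
# The counts are illustrative; rows = classified classes, columns = reference classes.
cm = np.array([[50,  5,  2],
               [ 4, 60,  6],
               [ 1,  5, 67]], dtype=float)

total = cm.sum()
overall = np.trace(cm) / total              # overall accuracy
producers = np.diag(cm) / cm.sum(axis=0)    # producer's accuracy (per reference class)
users = np.diag(cm) / cm.sum(axis=1)        # user's accuracy (per classified class)

# Kappa: agreement beyond what chance alone would produce.
expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
kappa = (overall - expected) / (1 - expected)

print(overall, producers, users, kappa)
```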

Key Terms to Review (33)

Accuracy assessment: Accuracy assessment is the process of evaluating the correctness of spatial data and its alignment with reality. This evaluation is critical for determining how well the data represents the phenomena it aims to depict, especially in remote sensing where data derived from sensors must be validated against real-world conditions to ensure reliability.
Active Sensing: Active sensing refers to a remote sensing technique where sensors emit their own energy, such as light or radio waves, and then measure the reflected signals from the target. This process allows for the collection of detailed information about an object or surface, as the emitted energy can be specifically controlled to optimize data quality. Active sensing plays a crucial role in various applications like radar, LiDAR, and sonar, providing a means to gather precise measurements regardless of natural light conditions.
Aerial photography: Aerial photography is the process of capturing images of the Earth's surface from an elevated position, typically using aircraft, drones, or satellites. This technique allows for the collection of high-resolution images that can reveal important information about land use, topography, and environmental changes, making it essential for various applications like mapping, surveying, and remote sensing.
Atmospheric correction: Atmospheric correction is the process of removing or reducing the effects of atmospheric interference on remotely sensed data to improve the accuracy and quality of the data. This correction is crucial because atmospheric constituents like gases, aerosols, and water vapor can distort the signals captured by sensors, leading to inaccurate interpretations of the Earth's surface features. By applying atmospheric correction techniques, it becomes possible to retrieve more reliable surface reflectance values that are essential for various applications in remote sensing.
Contrast Stretching: Contrast stretching is a technique used in image processing to enhance the contrast of an image by expanding the range of intensity values. This process helps to make features in an image more distinguishable, which is especially important in remote sensing, where images can often appear dull or washed out due to atmospheric effects or sensor limitations.
Data fusion: Data fusion is the process of integrating multiple data sources to produce more accurate, comprehensive, and actionable information. This technique allows for the combination of different types of data, such as images, sensor readings, and geographic information, which enhances the understanding of a particular phenomenon or area. By leveraging diverse data types and sources, data fusion improves the reliability of analyses and decision-making across various applications.
Digital Elevation Model (DEM): A Digital Elevation Model (DEM) is a 3D representation of terrain surface created from elevation data, typically displayed in a raster format. DEMs are essential for analyzing terrain features and play a critical role in various applications such as hydrology, urban planning, and environmental monitoring, leveraging remote sensing technologies to capture the Earth's surface data.
Electromagnetic Spectrum: The electromagnetic spectrum encompasses the range of all types of electromagnetic radiation, which varies in wavelength and frequency. This spectrum includes everything from radio waves, which have the longest wavelengths, to gamma rays, which have the shortest. Understanding the electromagnetic spectrum is crucial for various applications in remote sensing, as different wavelengths can penetrate the atmosphere differently and interact with materials on the Earth's surface in unique ways.
Environmental Monitoring: Environmental monitoring refers to the systematic collection and analysis of data related to environmental conditions to assess changes, impacts, and trends over time. This process involves using various technologies, including remote sensing and spatial data infrastructure, to observe and evaluate environmental parameters such as air quality, land use, and climate change effects.
Frequency: Frequency is defined as the number of occurrences of a repeating event per unit of time, often measured in hertz (Hz). In the context of electromagnetic waves, it relates to how many wave cycles pass a given point in one second, which directly influences properties such as energy and wavelength. Understanding frequency is crucial because it determines how different types of electromagnetic radiation interact with matter, impacting remote sensing applications and the information we can gather from various surfaces and atmospheres.
Georeferencing: Georeferencing is the process of aligning spatial data (like maps or images) to a known coordinate system so that it accurately represents real-world locations. This involves assigning geographic coordinates to each pixel in a raster image or linking points in vector data to their corresponding locations on the Earth's surface, which is crucial for effective spatial analysis and integration of various geospatial datasets.
Histogram Equalization: Histogram equalization is a technique used in image processing to enhance the contrast of an image by adjusting the intensity distribution of its pixels. This method works by transforming the histogram of pixel intensities to achieve a uniform distribution, allowing for better visibility of features in an image. It is particularly useful in remote sensing applications where images may suffer from poor lighting conditions or contrast issues, as it improves the interpretability of data captured from the electromagnetic spectrum.
Hyperspectral imaging: Hyperspectral imaging is a remote sensing technique that captures and processes information from across the electromagnetic spectrum, typically using hundreds of narrow, contiguous spectral bands. This allows for the detailed analysis of materials and objects by identifying their spectral signatures, which are unique patterns of light absorption and reflection. Hyperspectral imaging plays a significant role in various applications such as environmental monitoring, agriculture, and mineral exploration, providing more precise data compared to traditional imaging methods.
Image enhancement: Image enhancement refers to a set of techniques used to improve the visual appearance of images or to convert images into a form that is more suitable for analysis. This process is crucial in remote sensing, as it allows for the clearer interpretation of data captured from the electromagnetic spectrum, making it easier to extract meaningful information and identify features within the imagery.
Infrared radiation: Infrared radiation is a type of electromagnetic radiation with wavelengths longer than visible light but shorter than microwaves, typically ranging from about 700 nanometers to 1 millimeter. This form of radiation is significant in remote sensing as it plays a crucial role in thermal imaging and the analysis of surface temperatures, helping in understanding land cover and vegetation health.
Interferometric Synthetic Aperture Radar (InSAR): InSAR is a remote sensing technique that uses radar signals to generate precise measurements of the Earth's surface, capturing changes in elevation or deformation over time. By comparing two or more radar images of the same area taken at different times, InSAR can detect shifts in the landscape due to natural events like earthquakes or human activities such as subsidence. This technique is rooted in the principles of the electromagnetic spectrum, as it utilizes microwave signals which are part of the radar wavelengths.
Land cover classification: Land cover classification is the process of categorizing different types of surface materials present on the Earth's surface, such as forests, grasslands, urban areas, and water bodies. This classification is crucial for understanding environmental changes, land use planning, and resource management. By using remote sensing data, this classification helps in identifying patterns and trends in land use and land cover over time.
Microwave: Microwave refers to a specific range of electromagnetic waves with wavelengths typically ranging from 1 millimeter to 1 meter, which corresponds to frequencies between 300 megahertz (MHz) and 300 gigahertz (GHz). These waves are significant in remote sensing due to their ability to penetrate clouds, rain, and even foliage, making them useful for various applications such as radar systems, satellite communication, and environmental monitoring.
Multispectral sensors: Multispectral sensors are devices that capture data across multiple wavelengths of the electromagnetic spectrum, allowing for the analysis of various surface properties and conditions. By measuring light reflected or emitted from the Earth's surface in different spectral bands, these sensors provide valuable information for applications such as remote sensing and precision agriculture. They enable the identification and differentiation of materials based on their spectral signatures, which can reveal insights into vegetation health, soil conditions, and land use patterns.
Passive Sensing: Passive sensing refers to the technique of detecting natural radiation emitted or reflected by objects, rather than actively transmitting signals. This method relies on the existing energy sources, primarily sunlight, to capture data about the Earth's surface and atmosphere. Passive sensors measure this radiation to gather information on various features, which is crucial for remote sensing applications and understanding environmental conditions.
Radiometric Resolution: Radiometric resolution refers to the ability of a sensor to distinguish different levels of electromagnetic energy, essentially determining how finely it can measure the intensity of radiation. This concept is crucial in remote sensing because higher radiometric resolution allows for better differentiation between similar objects or surfaces based on their spectral signatures, leading to more accurate data analysis and interpretation.
Reflectance: Reflectance is the proportion of incident electromagnetic radiation that is reflected off a surface, often expressed as a percentage. This property is crucial in understanding how different materials interact with light, affecting the way we collect and interpret remote sensing data. Reflectance values can vary based on factors such as surface texture, color, and moisture content, making it essential for analyzing land cover types and vegetation health.
Satellite Imagery: Satellite imagery refers to the photographs and data collected by satellites orbiting Earth, which capture images of the planet's surface. This technology plays a vital role in various applications, including environmental monitoring, land use planning, and disaster management, by providing detailed visual information that can be analyzed for changes over time and across different regions.
Sensor calibration: Sensor calibration is the process of adjusting the output of a sensor to ensure its accuracy and reliability when measuring environmental variables. This procedure involves comparing the sensor's readings against known reference values to determine any discrepancies, which allows for corrections to be applied. Proper calibration is essential for ensuring that remote sensing instruments can accurately interpret data related to the electromagnetic spectrum and are effectively utilized in various applications.
Spatial resolution: Spatial resolution refers to the smallest discernible detail in an image or dataset, indicating how much spatial information is captured in a specific area. This concept is crucial as it affects the clarity and detail of imagery, impacting various applications such as monitoring environmental changes and modeling landscapes.
Spectral resolution: Spectral resolution refers to the ability of a sensor to distinguish between different wavelengths of light within the electromagnetic spectrum. This capability is crucial in remote sensing as it affects how detailed and specific the captured images are, allowing for the identification of materials and features on the Earth's surface. High spectral resolution enables the detection of subtle differences in spectral signatures, which is particularly important in applications such as vegetation analysis, mineral identification, and environmental monitoring.
Spectral signature: A spectral signature is a unique pattern of reflectance or emittance of electromagnetic energy from an object or surface across different wavelengths of the electromagnetic spectrum. This distinctive signature allows for the identification and characterization of materials, making it essential in remote sensing applications and image classification processes.
Synthetic aperture radar (SAR): Synthetic aperture radar (SAR) is a form of radar that uses motion to synthesize a large aperture, allowing for high-resolution imaging of landscapes and objects. This technology operates by emitting microwave signals towards the ground and then processing the reflected signals to create detailed images. SAR is particularly powerful because it can capture data during both day and night and under various weather conditions, making it an essential tool in remote sensing applications such as land cover mapping and precision agriculture.
Temporal resolution: Temporal resolution refers to the frequency at which data is collected over time, indicating how often observations are made in a specific area. This concept is crucial for understanding changes in the environment, as higher temporal resolution allows for more detailed tracking of changes and events, while lower temporal resolution may miss important variations. It directly influences the effectiveness of monitoring phenomena such as land use changes and natural disasters.
Thermal radiation: Thermal radiation is the emission of electromagnetic waves from all matter that has a temperature above absolute zero. This type of radiation is primarily in the infrared spectrum and is a key concept in understanding how heat energy is transferred, particularly in remote sensing applications where it helps to detect and measure surface temperatures from a distance.
Transmittance: Transmittance refers to the fraction of incident electromagnetic radiation that passes through a material without being absorbed or reflected. It plays a crucial role in understanding how different materials interact with various wavelengths of light, which is fundamental for interpreting remote sensing data and the behavior of the electromagnetic spectrum.
Visible Light: Visible light is the portion of the electromagnetic spectrum that can be detected by the human eye, typically ranging from wavelengths of about 380 to 750 nanometers. This spectrum plays a crucial role in remote sensing, as it allows for the observation and analysis of Earth's surface features and phenomena through various imaging techniques.
Wavelength: Wavelength is the distance between consecutive peaks (or troughs) of a wave, commonly measured in meters. It plays a crucial role in determining the energy and frequency of electromagnetic waves, impacting how these waves interact with matter. Wavelength is essential for understanding various applications in remote sensing, as it helps to define the type of information that can be captured from reflected or emitted electromagnetic radiation.