Quality Control in Geophysical Data
Implementing Quality Control Procedures
Quality control (QC) in geophysical surveys isn't something you tack on at the end. It needs to be embedded in every stage, from data acquisition through processing, to catch problems before they compound.
During data acquisition, QC means actively monitoring instrument performance. This includes:
- Running calibration checks at regular intervals (e.g., beginning and end of each survey line, or at fixed time intervals)
- Monitoring ambient noise levels to flag periods when environmental interference is too high for reliable measurements
- Performing real-time data validation so the field crew can spot dropouts, spikes, or drift while they can still re-collect data
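The real-time validation step above can be sketched as a simple dropout-and-spike check on a stream of readings. This is an illustrative sketch, not a field-proven algorithm: the window size, the threshold, and the use of `None` for missed samples are all assumptions for the example.

```python
# Minimal sketch of a real-time QC check: flag dropouts (missing samples)
# and spikes (large deviations from a running median). The window size and
# spike threshold are illustrative placeholders, not field-proven values.

def qc_flags(samples, window=5, spike_threshold=5.0):
    """Return a list of (index, reason) flags for a stream of readings.

    `None` entries represent dropouts; a sample counts as a spike if it
    deviates from the median of its surrounding window by more than the
    threshold.
    """
    flags = []
    for i, s in enumerate(samples):
        if s is None:
            flags.append((i, "dropout"))
            continue
        # Running median over neighbouring valid samples
        lo, hi = max(0, i - window), min(len(samples), i + window + 1)
        neighbours = sorted(x for x in samples[lo:hi] if x is not None)
        median = neighbours[len(neighbours) // 2]
        if abs(s - median) > spike_threshold:
            flags.append((i, "spike"))
    return flags

# Hypothetical magnetometer readings (nT): one dropout, one spike
readings = [50100.2, 50100.4, None, 50100.3, 50180.0, 50100.5]
print(qc_flags(readings))  # dropout at index 2, spike at index 4
```

A check like this running on the acquisition laptop lets the crew re-occupy a station while they are still on the line, which is the whole point of doing validation in real time.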
During data processing, QC shifts to verifying that corrections and filters are applied correctly:
- Apply appropriate filters for known noise sources (e.g., 50/60 Hz powerline filtering in EM surveys)
- Correct for known artifacts such as instrument drift, tidal effects (in gravity), or diurnal variations (in magnetics)
- Cross-validate processed results against independent datasets, borehole logs, or other ground truth information
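As one concrete example of the powerline filtering mentioned above, a 50 Hz (or 60 Hz, depending on the local grid) component can be estimated by least-squares fitting a sine/cosine pair at the known interference frequency and subtracting it. This harmonic-fit approach is a simple stand-in for a proper notch filter; the synthetic signal below is purely illustrative.

```python
import numpy as np

# Sketch: estimate and subtract 50 Hz powerline contamination by
# least-squares fitting a sinusoid at the known interference frequency.
# A production workflow would more likely use a dedicated notch filter;
# this shows the underlying idea on synthetic data.

def remove_powerline(t, signal, freq=50.0):
    """Fit and subtract a sinusoid at `freq` (Hz) from `signal` sampled at times `t`."""
    # Design matrix: [sin(2*pi*f*t), cos(2*pi*f*t), 1]; the constant
    # column absorbs the mean so it isn't forced into the sinusoid fit
    A = np.column_stack([np.sin(2 * np.pi * freq * t),
                         np.cos(2 * np.pi * freq * t),
                         np.ones_like(t)])
    coeffs, *_ = np.linalg.lstsq(A, signal, rcond=None)
    powerline = A[:, :2] @ coeffs[:2]   # reconstructed interference only
    return signal - powerline

# Synthetic example: slow trend (stand-in for geology) plus 50 Hz noise
t = np.linspace(0, 1, 2000)
geology = 0.5 * t
noise = 0.3 * np.sin(2 * np.pi * 50 * t + 0.7)
cleaned = remove_powerline(t, geology + noise)
```

After the fit, `cleaned` retains the slow trend while the 50 Hz component is almost entirely removed.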
Standardized QC protocols and checklists help keep these steps consistent across different crews and survey days. Regular communication between field crews and processing teams is critical so that acquisition problems get flagged and resolved quickly, rather than discovered weeks later.
Benefits of Robust Quality Control
- Early error detection prevents costly rework. A sensor drift caught on day one saves you from reprocessing (or re-acquiring) an entire survey
- Higher confidence in interpretations because you can demonstrate that known error sources have been addressed systematically
- Better data integration across surveys or projects, since consistent QC documentation makes it possible to compare datasets collected at different times or by different teams
- Reduced risk of misinterpretation, which matters especially when geophysical results inform engineering decisions, resource estimates, or hazard assessments
Sources of Error in Geophysics

Instrumental and Environmental Factors
Geophysical data always contain some degree of error and artifact. Recognizing where these come from is the first step toward managing them.
Instrumental noise includes electronic interference, sensor drift over time, and outright component malfunctions. For example, a magnetometer with a slowly drifting sensor will produce a gradual trend across your survey that has nothing to do with geology. Repeat base station readings help you identify and correct for this.
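The base-station drift correction described above can be sketched as a linear interpolation between repeat readings: any change at the reoccupied base station is attributed to instrument drift and removed proportionally in time from the rover readings. The times and values below are illustrative, and real drift is not always linear.

```python
# Sketch of a linear drift correction using repeat base-station readings.
# The base station is read at the start and end of the survey; the change
# between the two readings is assumed to be pure instrument drift and is
# removed proportionally in time from every rover reading.

def drift_correct(readings, base_start, base_end):
    """readings: list of (time, value); base_*: (time, value) at the base station."""
    t0, v0 = base_start
    t1, v1 = base_end
    rate = (v1 - v0) / (t1 - t0)   # drift per unit time
    return [(t, v - rate * (t - t0)) for t, v in readings]

# Illustrative values: base station read at t=0 and t=100 min, +2 nT drift
base_start, base_end = (0.0, 50000.0), (100.0, 50002.0)
rover = [(25.0, 50110.5), (50.0, 50091.0), (75.0, 50123.5)]
print(drift_correct(rover, base_start, base_end))
```

More frequent base-station reoccupations allow a piecewise correction, which handles drift that is not linear over the whole survey day.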
Environmental factors are often the trickiest to control:
- Weather conditions (wind vibration on geophones, temperature effects on electronics)
- Surface topography creating geometric distortions in seismic or GPR data
- Cultural noise sources like power lines, buried pipelines, fences, or nearby vehicle traffic generating electromagnetic or vibrational interference
Survey design problems can also degrade data quality. Inadequate spatial sampling leads to aliasing, where features in the subsurface appear at incorrect positions or wavelengths. Poor sensor-ground coupling (e.g., a geophone not firmly planted) reduces signal fidelity. Incorrect instrument settings, like choosing the wrong gain or sampling rate, can clip signals or miss important frequencies entirely.
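The spatial-sampling point above comes down to the spatial Nyquist criterion: station spacing must be less than half the shortest wavelength you need to resolve, or that wavelength aliases. A minimal check, with illustrative numbers:

```python
# Quick sanity check on spatial sampling: to record a wavelength without
# aliasing, the station spacing must be less than half that wavelength
# (the spatial Nyquist criterion). Values below are illustrative.

def max_station_spacing(shortest_wavelength_m):
    """Largest station spacing (m) that still samples the wavelength unaliased."""
    return shortest_wavelength_m / 2.0

def is_aliased(station_spacing_m, shortest_wavelength_m):
    return station_spacing_m > max_station_spacing(shortest_wavelength_m)

print(max_station_spacing(10.0))   # 5.0 m for a 10 m target wavelength
print(is_aliased(8.0, 10.0))       # True: 8 m spacing undersamples it
```

The same criterion applies in time: the sampling rate must exceed twice the highest frequency of interest, which is why an incorrect sampling-rate setting can silently discard signal.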
Processing and Interpretation Challenges
Processing itself can introduce artifacts if you're not careful:
- Over-aggressive filtering can remove real geological signal along with the noise
- Incorrect assumptions about subsurface properties (e.g., assuming a constant velocity model in seismic processing when velocities vary laterally) produce distorted images
- Numerical instabilities in inversion algorithms can generate features in your model that don't correspond to anything real
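The inversion-instability point can be demonstrated in a few lines: with a nearly rank-deficient forward operator, ordinary least squares amplifies tiny data noise into wildly wrong model values, while a small damping term (Tikhonov regularization) stabilizes the solution. The matrix and data below are toy values chosen to make the effect obvious.

```python
import numpy as np

# Sketch of why damping matters: an ill-conditioned least-squares inversion
# amplifies noise into spurious model features; adding a small
# damping * I term (Tikhonov regularization) stabilizes it.

def invert(G, d, damping=0.0):
    """Damped least squares: solve (G^T G + damping * I) m = G^T d."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + damping * np.eye(n), G.T @ d)

# Toy forward operator with nearly parallel columns (almost rank-deficient)
G = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
d_true = G @ np.array([1.0, 1.0])          # true model is [1, 1]
d_noisy = d_true + np.array([0.0, 1e-3])   # tiny data perturbation

m_undamped = invert(G, d_noisy)            # far from [1, 1]
m_damped = invert(G, d_noisy, damping=1e-4)
print(m_undamped, m_damped)
```

The undamped model is thrown far from the true values by a perturbation of only 1e-3, while the damped model stays close. Real inversions use more sophisticated regularization, but the principle is the same: damping trades a small bias for stability.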
The key habit is to document every potential error source and artifact you've identified, along with what you did (or couldn't do) to mitigate it. This documentation is essential for anyone interpreting the final results. When you encounter complex or ambiguous artifacts, consulting experienced geophysicists can save significant time and prevent misinterpretation.
Data Management for Geophysical Surveys

Organizing and Storing Geophysical Data
A single geophysical survey can generate gigabytes of raw data, processed volumes, and derivative products. Without a clear organizational system, finding and using that data becomes increasingly difficult over time.
File naming and directory structure should follow a standardized convention established before the survey begins. A typical approach encodes the survey name, date, line number, and data type into the filename (e.g., SurveyA_20240315_Line012_rawmag.csv). Consistent naming makes automated processing scripts possible and prevents confusion when multiple people access the same dataset.
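The convention above can be generated and parsed programmatically, which is what makes automated processing scripts possible. The `Survey_Date_LineNNN_datatype` pattern below mirrors the example filename; the exact fields and ordering are an assumption and should follow whatever your project standard specifies.

```python
# Sketch of the filename convention described above. The field order
# (survey, date, line, datatype) mirrors the example in the text; adapt
# it to your project's actual standard.

def make_filename(survey, date, line, datatype, ext="csv"):
    return f"{survey}_{date}_Line{line:03d}_{datatype}.{ext}"

def parse_filename(name):
    stem, _, ext = name.rpartition(".")
    survey, date, line, datatype = stem.split("_")
    return {"survey": survey, "date": date,
            "line": int(line.removeprefix("Line")),
            "datatype": datatype, "ext": ext}

name = make_filename("SurveyA", "20240315", 12, "rawmag")
print(name)                   # SurveyA_20240315_Line012_rawmag.csv
print(parse_filename(name))
```

Zero-padding the line number (`Line012` rather than `Line12`) keeps files in numeric order when sorted lexicographically, which most file browsers and shells do by default.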
Centralized data repositories or database systems provide:
- Secure storage with access controls (so only authorized users can modify raw data)
- Version control to track what changed, when, and by whom
- Efficient search and retrieval across large datasets
Metadata documentation is just as important as the data itself. At minimum, record survey parameters (line spacing, station interval, coordinate system), instrument specifications (model, serial number, firmware version), processing steps applied, and data quality indicators. Without this metadata, even perfectly collected data loses much of its value because future users can't assess its reliability or reproduce the processing.
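One lightweight way to keep metadata with the data is a JSON sidecar file written next to each data file. The field names and values below are illustrative, not a formal schema; real projects may have a required metadata standard.

```python
import json

# Sketch of a metadata sidecar capturing the minimum fields listed above.
# Field names and values are illustrative, not a formal schema.

metadata = {
    "survey": "SurveyA",
    "coordinate_system": "EPSG:32633",   # example UTM zone, hypothetical
    "line_spacing_m": 100,
    "station_interval_m": 10,
    "instrument": {"model": "ExampleMag-1",   # hypothetical instrument
                   "serial": "SN-0042",
                   "firmware": "2.1.0"},
    "processing_steps": ["despike", "diurnal_correction", "grid_100m"],
    "quality": {"repeat_line_rms_nT": 0.8},
}

# Write the sidecar next to the data file so the metadata travels with it
with open("SurveyA_20240315_Line012_rawmag.meta.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

Because the sidecar shares the data file's name, scripts can locate the metadata for any file mechanically, and archiving the pair together keeps them from being separated.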
Data Backup and Sharing Practices
Data loss from a hard drive failure or accidental deletion can be catastrophic if no backup exists. Follow the 3-2-1 rule as a baseline: keep at least 3 copies of your data, on 2 different storage media, with 1 copy stored off-site or in the cloud.
Additional best practices:
- Run data validation checks at each stage of the data lifecycle (acquisition, transfer, processing, archiving) to confirm nothing was corrupted or lost
- Establish clear data sharing policies before the project starts, specifying who can access what data, under what conditions, and in what formats
- Use secure transfer methods (encrypted connections, checksums to verify file integrity) when sharing data with collaborators or stakeholders
- Follow any applicable data privacy or confidentiality requirements, particularly for commercial or government-funded surveys
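The checksum verification mentioned above is straightforward to implement: compute a digest before transfer and recompute it afterwards; matching digests mean the file arrived intact. A minimal sketch using SHA-256, with throwaway example files:

```python
import hashlib

# Sketch of checksum-based integrity verification: compute a SHA-256
# digest before transfer and recompute it afterwards. Chunked reading
# keeps memory use low for multi-gigabyte survey files.

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Illustrative example: write a small file, checksum it, verify a copy
with open("line012.dat", "wb") as f:
    f.write(b"raw magnetometer samples...")
original = sha256_of("line012.dat")

with open("line012_copy.dat", "wb") as f:
    f.write(b"raw magnetometer samples...")
print(sha256_of("line012_copy.dat") == original)  # True if copy is intact
```

Publishing the digest alongside the shared file lets recipients verify integrity independently, without contacting the sender.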
Data Integrity and Reproducibility in Geophysics
Comprehensive Documentation Practices
Reproducibility means that someone else, given your raw data and documentation, could follow your workflow and arrive at the same results. This is a cornerstone of credible geophysical work.
To achieve this:
- Document field procedures in detail, including instrument settings, station coordinates, environmental conditions, and any deviations from the planned survey design
- Archive both raw and processed data in standardized, open formats (e.g., SEG-Y for seismic, GeoTIFF for gridded data) alongside all associated metadata
- Use version control systems (such as Git) for processing scripts and analysis code, so you can track changes and reproduce any specific version of your workflow
- Record all processing parameters, including filter settings, inversion parameters, and any manual edits made to the data
Transparency and Adherence to Standards
Every geophysical interpretation carries assumptions, limitations, and uncertainties. Documenting these openly is not a weakness; it's what allows others to use your results appropriately.
- State assumptions explicitly, with units (e.g., "a uniform half-space resistivity of 100 Ω·m was assumed for the initial model")
- Note known limitations, such as areas of poor data coverage or frequency bands contaminated by noise
- Adhere to community data standards (e.g., those published by SEG, IAGA, or relevant national geological surveys) to ensure your data can be integrated with other studies
- Participate in community efforts to develop and refine QC guidelines and data management best practices, since these standards evolve as instrumentation and methods advance
Proper documentation and archiving don't just protect your own work. They enable future researchers to build on your results, perform meta-analyses across multiple surveys, and ultimately advance the field's collective understanding of subsurface processes.