
The Aerial Perspective Blog

Understanding Errors and Distortion in Remote Sensing

Joe Sullivan  |  Jun 04, 2020  |  0 Comments

The old saying “garbage in, garbage out” is true of many situations, and in remote sensing, it’s spot on. The insight you gain from a data set is only as good as the data you source.

Distortion in aerial photography or metadata inconsistencies can botch results, impacting project cost, deadlines, and even safety. When estimating vegetation growth for fire prevention or tracking hillside erosion in residential areas, for example, remote sensing accuracy can become a matter of life and death.

Accurate data from remote sensing technology is fuel for process automation, 3D immersive mapping, advanced security, and much more. But how do you detect errors and distortions that can undermine data integrity?

Understanding resolution

To understand how errors in remote sensing occur, we first need to understand what factors are at play in data generation.

Photogrammetry relies on high-resolution images to quantify measurements and form 3D models of buildings, infrastructure, property, and more. This aerial photography involves a number of different forms of data resolution that expand on our general understanding of picture quality. Resolution in photogrammetric data comes in three distinct forms:

Spatial Resolution

Spatial resolution describes the area and detail of the smallest feature that can be detected by a remote sensing device. It’s usually described by a single value representing the length of one side of a square pixel. In other words, a spatial resolution of 100 m means that one pixel represents an area of 10,000 square meters (100 m × 100 m) on the ground.

The finer the spatial resolution, the more precise the data in each pixel — and the more refined your spatial analysis can be.
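Ground coverage per pixel is simply the square of the spatial resolution, which makes it easy to sanity-check the numbers in any data set. A minimal sketch (the function name here is illustrative, not from any GIS library):

```python
def ground_area_m2(spatial_resolution_m: float) -> float:
    """Area on the ground covered by one square pixel, in square meters."""
    return spatial_resolution_m ** 2

# A 100 m spatial resolution means each pixel covers
# 100 m x 100 m = 10,000 square meters on the ground.
print(ground_area_m2(100))  # 10000
```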

Spectral Resolution

Light and color can be incredibly valuable indicators of on-the-ground conditions, or infuriating sources of errors and distortions. It all depends on how accurately they’re documented.

Spectral resolution describes the capacity of a sensor to document electromagnetic wavelengths (which could include color, infrared light, and more). The finer the spectral resolution, the narrower the wavelength bands the sensor can record — and the finer the distinctions it can make between wavelengths.

Color and shadow matter for most projects, but spectral fidelity is especially important for satellite-mounted technology — the extra layers of atmosphere between the sensor and the ground are a source of distortion in remote sensing.
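To make “finer spectral resolution means narrower bands” concrete, compare the wavelength range each band covers. The band edges below are illustrative approximations of the visible spectrum, not the specification of any real sensor:

```python
# Illustrative band definitions in nanometers; real sensors vary.
coarse_bands = {"visible": (400, 700)}
fine_bands = {"blue": (450, 495), "green": (495, 570), "red": (620, 700)}

def band_width_nm(edges):
    """Width of a spectral band given its (low, high) wavelength edges."""
    lo, hi = edges
    return hi - lo

# A sensor that splits the visible range into blue/green/red bands can
# distinguish wavelengths that a single broad "visible" band lumps together.
print(band_width_nm(coarse_bands["visible"]))  # 300
print(band_width_nm(fine_bands["blue"]))       # 45
```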

Temporal Resolution

When too much time passes between two data sets, the continuity necessary to draw solid conclusions is lost. This loss of continuity is the effect of poor temporal resolution.

Similarly, older data sets may no longer meet requirements for accuracy and precision in spatial or spectral resolution, creating inconsistencies and clouding results.

Thankfully, new UAV-mounted remote sensing devices make it affordable to collect data frequently and improve the temporal resolution of data sets.
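A simple staleness check on acquisition dates illustrates the idea. The 30-day threshold below is an arbitrary example, not an industry standard — the right gap depends on how fast the surveyed area changes:

```python
from datetime import date

def is_stale(earlier: date, later: date, max_gap_days: int = 30) -> bool:
    """Flag two acquisitions whose gap exceeds the allowed temporal resolution."""
    return (later - earlier).days > max_gap_days

# Two flights 44 days apart exceed a 30-day requirement.
print(is_stale(date(2020, 4, 1), date(2020, 5, 15)))  # True
```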

Avoiding errors in remote sensing

To reduce distortion and other accuracy issues, consider the following factors:

Atmospheric conditions

Changes in the atmosphere, sun illumination, and viewing geometries during image capture can impact data accuracy, and result in distortions that can hinder automated information extraction and change detection processes. Humidity, water vapor, and light are common culprits for errors and distortion.

When atmospheric conditions change, reference points can be obscured or lost, which impacts efforts to create accurate measurements from images. Differences in light temperature can lead to color changes that distort data quality, and make for unsightly inconsistencies that ruin the magic of 3D maps.

Altitude and reflectance

Light collected at high altitude passes through a larger column of air before it reaches the sensor. The result is atmospheric scattering that alters apparent surface reflectance — a phenomenon that can diminish color quality and detail in images.

The difference in reflectance near the surface and at top-of-atmosphere creates substantial changes in color, image resolution, and perspective that may need to be accounted for in normalization. Even on a small scale, variances in altitude between data sets should raise a red flag for cross-referencing and review.
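One common first-order normalization for this effect is dark-object subtraction: assume the darkest pixel in a band should be near zero, treat its value as an additive haze offset, and subtract it everywhere. A minimal sketch on plain lists — real workflows apply this per band across full rasters:

```python
def dark_object_subtraction(band):
    """Subtract the darkest pixel value, treating it as additive haze."""
    dark = min(band)
    return [value - dark for value in band]

# Haze adds a roughly constant offset; removing it restores contrast.
print(dark_object_subtraction([12, 15, 30]))  # [0, 3, 18]
```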

Documented metadata for cross-referencing

Many data errors come from sources that are difficult to pinpoint — momentary glitches in connectivity, inconsistencies in light or other atmospheric distortions in remote sensing. And unfortunately, the sources of errors in GIS aren’t always immediately apparent.

Metadata is data that describes data—in this case, characteristics of the collected GIS data. In photogrammetry, metadata could include notes related to GPS location, focal length and resolution settings, altitude, time/date, atmospheric conditions, and more. Metadata should tell you who made the data, provide context for the data, and help determine if the data is appropriate for your project.

For researchers and engineers, metadata offers valuable insight into the conditions in which a data set was created and, in many cases, the overall value it creates for a project. Using data sets with incomplete or inconsistent metadata will likely cause erroneous results and should be avoided.

False accuracy is a problem. Good data practices involve regularly layering and cross-referencing data sets against existing data to pinpoint errors and ensure accuracy. Always check your metadata when cross-referencing.
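Even a basic completeness check can catch data sets that shouldn’t enter a cross-referencing pipeline. The required fields below are examples drawn from the list above, not a standard metadata schema:

```python
# Example required fields for a photogrammetry data set (illustrative).
REQUIRED_FIELDS = {"gps_location", "focal_length_mm", "altitude_m", "timestamp"}

def missing_metadata(metadata: dict) -> set:
    """Return which required fields a data set's metadata lacks."""
    return REQUIRED_FIELDS - metadata.keys()

flight = {"gps_location": (38.9, -77.0), "timestamp": "2020-06-04T10:00Z"}
print(sorted(missing_metadata(flight)))  # ['altitude_m', 'focal_length_mm']
```

A data set that returns an empty set here still needs its values checked for consistency, but one with missing fields can be rejected outright.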

Control over the flight path

Photogrammetry relies on the stability of several factors to produce accurate and precise results. Unfortunately, some of the biggest tools used in remote sensing are among the least reliable.

Airplanes and helicopters are traditionally used in aerial photography. However, both are susceptible to changes in weather and wind speed, not to mention human error — which makes them unreliable for generating bulletproof data sets for advanced mapping software.

Thankfully, UAV technology offers increased control over flight paths. Drones can also close temporal resolution gaps by flying frequent passes for less than the cost of a single manned flight.

Conclusion

Leveraging a data processing solution that allows for up-to-date, reliable data is vital for project success.

Aerial’s new Mapware photogrammetry software generates bigger, better 3D maps in the cloud, so you can access them from anywhere. Whether you want to map a single building, a dozen cell towers, or an entire city, this software comes backed by a team with the expertise needed to create value for your project.
