
Spatial Resolution

 

A spectral remote sensing sensor detects the radiation reflected from the Earth's surface and stores it as numbers in a raster. Accordingly, each detected area corresponds to a cell in this raster. These raster cells are called pixels. The size of the area represented by one pixel depends on the sensor's ability to detect details.

 

Different spatial resolutions in comparison

 

Raster with low and high spatial resolution.

 

Low and high spatial resolution

The ability of a remote sensing sensor to detect details is referred to as its spatial resolution, which is stated in metres. The more pixels a remote sensing image of a given area contains, the higher its spatial resolution and the more details can be observed.
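The relationship between resolution and pixel count can be sketched in a few lines of Python. The function name and the 30 km scene size are illustrative, not taken from any real sensor specification:

```python
# Illustrative sketch: how many pixels are needed to cover a fixed area
# at different spatial resolutions (names and figures are made up).

def pixel_count(extent_m: float, resolution_m: float) -> int:
    """Total pixels in a square scene of the given extent (metres)."""
    side = int(extent_m / resolution_m)   # pixels along one side
    return side ** 2                      # total pixels in the raster

# A 30 km x 30 km scene at three resolutions:
for res in (1, 30, 300):
    print(f"{res:>4} m resolution -> {pixel_count(30_000, res):,} pixels")
```

At 300 m the scene fits into 10,000 pixels, at 30 m it already takes a million, and at 1 m it takes 900 million, which illustrates why a higher resolution means far more pixels for the same area.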

The swipe below shows two satellite images of Bonn. You can clearly distinguish the higher spatial resolution of 30 metres from the lower spatial resolution of 300 metres. In the image with the lower resolution, many more different objects must be included in one pixel.

 

 

Satellite images of Bonn with a spatial resolution of 30 metres and 300 metres respectively (© USGS/NASA Landsat Program).


 

Mixed Pixels

In almost every satellite image, objects that lie close together must be included in one pixel. Such pixels are called mixed pixels. The image below shows a house and a garden captured in the same pixel. Due to the low spatial resolution, the colour components of both objects (brown and green) combine into a brown-green mixed pixel, which is very hard to analyse. The lower the spatial resolution, the more mixed pixels occur and the harder it is to tell areas apart.
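Mixed-pixel formation can be sketched as simple colour averaging: several fine-resolution pixels are averaged into one coarse pixel. The RGB values below are invented for the example and do not come from any real image:

```python
# Hypothetical sketch of mixed-pixel formation: the colours of a block of
# high-resolution pixels are averaged into one low-resolution pixel.
# All RGB values here are illustrative.

def mix_pixels(block):
    """Average a list of (R, G, B) pixels into one mixed pixel."""
    n = len(block)
    return tuple(sum(p[i] for p in block) // n for i in range(3))

house = (150, 90, 60)    # brownish roof
garden = (60, 150, 70)   # green lawn

# Two house pixels and two garden pixels collapse into one
# brown-green mixed pixel at the coarser resolution:
mixed = mix_pixels([house, house, garden, garden])
print(mixed)  # (105, 120, 65)
```

The averaged colour belongs to neither object, which is exactly why mixed pixels are hard to classify.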

 

Mixed pixel


Formation of a mixed pixel caused by different objects in the same raster cell.

 

Why do only some sensors have a high spatial resolution?

We can ask the question: Why is it that not every sensor in spectral remote sensing has a very high resolution? The answer becomes clearer when we look at the purpose of satellite-based remote sensing systems: if the same sensor is mounted on an aircraft and on a satellite, the airborne sensor will have a very high resolution of e.g. 1 m, whereas the satellite-based sensor will have a low resolution of e.g. 30 m. At the same time, the satellite-based sensor covers a much wider area in a single image and circles the Earth completely in only a few days. This is impossible for aircraft!

 

The spatial characteristics of spectral sensors are determined by the ratio of extent to resolution. If a maximised extent is required in order to depict as large an area as possible, we have to lower our expectations regarding resolution, because the sensor cannot store all these data (fig.).
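The extent/resolution trade-off can be expressed as a direct ratio: if the number of pixels a sensor can store is fixed, a larger extent forces a coarser resolution. The fixed raster width of 2,000 pixels below is an assumed figure for illustration only:

```python
# Illustrative sketch of the extent/resolution trade-off: with a fixed
# number of pixels, extent and resolution are directly linked.
# The sensor pixel count is an assumption, not a real specification.

def resolution_for(extent_m: float, pixels_per_side: int) -> float:
    """Resolution achievable when a square scene of the given extent
    must fit into a fixed number of pixels per side."""
    return extent_m / pixels_per_side

SENSOR_PIXELS = 2_000  # assumed fixed raster width in pixels

# Small extent -> fine resolution; large extent -> coarse resolution:
print(resolution_for(2_000, SENSOR_PIXELS))   # 1.0 m  (airborne-style)
print(resolution_for(60_000, SENSOR_PIXELS))  # 30.0 m (satellite-style)
```

Doubling the extent at a fixed pixel budget doubles the pixel size, which is the trade-off the paragraph above describes.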

 

Four images, four imaging systems, four different spatial resolutions


The images show four views of the same area as detected by four different sensors. The images all have the same number of pixels, but the spatial resolutions of the different sensors result in different pixel sizes. Thus, one image depicts only a single building (aerial image, 0.1 m, © ATKIS), another shows the district of Poppelsdorf (QuickBird, 1 m, © DigitalGlobe), the third covers the city of Bonn (Landsat, 30 m, © USGS/NASA Landsat Program), and the last depicts a large area from the Ruhrgebiet to Koblenz (MODIS, 300 m, © USGS/NASA MODIS Project).

 


Conclusion:

Every remote sensing sensor produces raster image data. Each raster consists of raster cells, which are also referred to as pixels. The bigger a pixel, the more objects on the Earth's surface it captures and the lower the spatial resolution of the raster image. The higher the spatial resolution, the smaller the number of hard-to-analyse mixed pixels in a raster image.