What makes optimum resolution so hard?
Optimum resolution: What exactly do we mean by this? Like most things in life, it depends.
Borrowing a definition and example from the AQA Science Glossary: Resolution is the smallest change in the quantity being measured (input) of a measuring instrument that gives a perceptible change in the reading. For example, a typical mercury thermometer will have a resolution of 1°C, but a typical digital thermometer will have a resolution of 0.1°C.
The question then becomes: how fine a resolution is appropriate? If we are trying to find the location of a stadium on a city map display, we need a sensor that resolves the area in kilometers. If we get much closer, or "finer," we can't see enough of the city for context; if we zoom out too far, the city itself becomes a dot. In contrast, if we want to read the license plate on a car in the stadium parking lot, we need a much finer resolution, on the order of single centimeters.
So when we talk about optimum resolution, we need to understand that the best data for a given situation is not always the highest-resolution data. Oversampling can be as bad as undersampling when the goal is to discern as much as we can from the observables.
Moreover, there are more factors at play. Every sensing modality has elements that need to be resolved: spatially (as described above), spectrally (e.g., more information is derived from a color picture than from a black-and-white one), and temporally.
Indeed, the temporal-resolution tradeoff for a given problem can be as complex as, and harder to determine than, the spatial or spectral decisions. Sensors are becoming more precise; they see what we tell them to, whether over microseconds or years, and a system will take samples as often as we design it to. The key is understanding the rhythm (or lack thereof) of the events of interest. Watching a hummingbird or tracking a maneuvering hypersonic vehicle requires sampling on a microsecond scale; monitoring a construction site is best done weekly, so that actual change is perceptible between samples. It might appear that oversampling, that is, looking VERY often, is a safe bet, but it comes at the cost of large volumes of nearly identical samples that take up storage space, transmit slowly, and don't really inform you of change.
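The storage-versus-information side of that tradeoff can be sketched with a toy calculation. All of the numbers below are hypothetical (a 30-day watch on a construction site, 500 MB per collection, real change roughly weekly); the point is only that faster sampling inflates volume far faster than it adds new information.

```python
# Toy model of the temporal-sampling tradeoff: how many samples we take,
# how much they cost to store, and what fraction capture genuinely new change.
# All figures are illustrative assumptions, not real sensor specifications.

def sampling_cost(duration_hours, interval_hours, change_interval_hours, sample_mb):
    """Return (total samples, storage in MB, fraction of samples showing new change)."""
    samples = duration_hours // interval_hours
    changes = duration_hours // change_interval_hours  # events of real change
    informative = min(samples, changes)                # can't capture more changes than occur
    return samples, samples * sample_mb, informative / samples

days = 30
for interval in (1, 24, 168):  # hourly, daily, weekly
    n, mb, useful = sampling_cost(days * 24, interval, change_interval_hours=168, sample_mb=500)
    print(f"every {interval:3d} h: {n:4d} samples, {mb/1000:7.1f} GB, "
          f"{useful:.0%} capture new change")
```

Under these assumptions, hourly collection produces 360 GB of imagery of which fewer than 1 in 100 samples show anything new, while weekly collection captures the same four change events in 2 GB.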
As a provider of data on a global scale, Vricon is often faced with the problem of processing speed versus level of detail. The more precisely you want to view a spot, the more data is involved. This is one version of optimum resolution that shows up across the data community and shines a light on a trade that needs to be made for all modes of sensing.
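To see how quickly detail drives data volume, consider a back-of-the-envelope sketch. The 500 km² city size is an illustrative assumption, not a Vricon figure; the scaling is the point: halving the ground sample distance quadruples the pixel count.

```python
# Back-of-the-envelope: pixels needed to image a fixed area at a given
# ground sample distance (GSD, meters per pixel). Illustrative numbers only.

def pixels_to_cover(area_km2, gsd_m):
    """Pixels required to cover `area_km2` at `gsd_m` meters per pixel."""
    pixels_per_km2 = (1000 / gsd_m) ** 2  # 1 km = 1000 m on each side
    return area_km2 * pixels_per_km2

city = 500  # km^2, roughly a mid-sized city (assumed)
for gsd in (10, 1, 0.1):
    print(f"GSD {gsd:5.1f} m -> {pixels_to_cover(city, gsd):.2e} pixels")
```

Going from 10 m to 0.1 m resolution over the same area multiplies the raw pixel count by 10,000, which is why "view it more precisely" and "process it quickly" pull in opposite directions.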
The critical thing is to begin by deciding what you want to learn. Then, choose the spatial, spectral, and temporal resolutions that best fit your needs. Luckily, computing is making it easier to collect and analyze the right data—quickly and efficiently. We’re gaining experience in this daily as we build The Globe in 3D, georegister sensors to this foundation data, and help our clients learn what is most important to them.
👉 Join Barry and USC ICT’s Kyle McCullough for their webinar on Thursday, July 9, 2020, at 1 p.m. ET! Register here for Adventures in the 3rd dimension: Advanced creation and applications of real-world data in 3D.