Replacement strategies for missing velocities

From Atomix


Quality control of raw velocities results in data loss; the missing samples usually must be replaced before computing the spectra necessary for obtaining [math]\displaystyle{ \varepsilon }[/math]. The number of missing samples that can be tolerated while still computing reliable spectra was also investigated.

Data analysis tests

Techniques considered for replacing the missing samples

  • Linear interpolation
  • Replacing gaps using the variance of the signal, a technique commonly used when computing eddy covariances
  • Unevenly spaced least-squares Fourier transform (i.e., no replacement at all)
Example velocity time series with data randomly removed in gaps of varying length. Only the 1 min gap case (480 samples at 8 Hz sampling) is illustrated. Removing chunks of 8 continuous samples (1 s) looks identical to the original time series and is not shown.
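As a minimal sketch of the first technique (not code from the benchmark analysis), linear interpolation across gaps can be done with `numpy.interp`, assuming missing samples are marked as NaN:

```python
import numpy as np

def fill_gaps_linear(t, u):
    """Linearly interpolate across NaN gaps in a velocity record.

    t : sample times (s); u : velocity with NaNs at QC'd-out samples.
    """
    u = np.asarray(u, dtype=float)
    bad = np.isnan(u)
    filled = u.copy()
    # Interpolate the missing times from the surrounding good samples
    filled[bad] = np.interp(t[bad], t[~bad], u[~bad])
    return filled

# Example: 8 Hz record with one 1 s (8-sample) gap
fs = 8.0
t = np.arange(0, 10, 1 / fs)
u = np.sin(2 * np.pi * 0.2 * t)
u[40:48] = np.nan
u_filled = fill_gaps_linear(t, u)
```

The function name and NaN convention are illustrative; any equivalent gap-filling routine serves the same purpose.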

Replacement strategy for tests

These replacement strategies were trialed on one of the cleanest benchmarks (Underice MAVS, sampling at 8 Hz) while varying the

  • Number of missing samples to identify a threshold where the segment should be completely discarded from further analysis
    • 10, 25, and 50% of the 30 min timeseries were removed
  • Data loss (gap) duration
    • 1 sample
    • 8 samples (1 s)
    • 480 samples (60 s)
    • 960 samples (120 s)


A total segment length of 30 min was chosen because it coincides with the segment length required for deriving an estimate of [math]\displaystyle{ \varepsilon }[/math].
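A test case like the ones above can be generated by knocking out random, fixed-length chunks of a 30 min record until the target fraction is missing. The sketch below (illustrative, not the benchmark code; overlapping gaps may leave the actual loss slightly below the target) marks removed samples as NaN:

```python
import numpy as np

def remove_gaps(u, frac_missing, gap_len, seed=None):
    """Set randomly placed gaps of gap_len samples to NaN until
    roughly frac_missing of the record is removed (test-case generator)."""
    rng = np.random.default_rng(seed)
    u = np.asarray(u, dtype=float).copy()
    n = u.size
    n_gaps = int(round(frac_missing * n / gap_len))
    # Distinct start indices; gaps can still overlap if starts are close
    starts = rng.choice(n - gap_len, size=n_gaps, replace=False)
    for s in starts:
        u[s:s + gap_len] = np.nan
    return u

fs = 8                                                       # Hz
u = np.random.default_rng(0).standard_normal(30 * 60 * fs)   # 30 min record
u_gappy = remove_gaps(u, 0.10, gap_len=8, seed=1)            # 10% loss, 1 s gaps
```

The same generator covers the other cases in the list, e.g. `remove_gaps(u, 0.25, gap_len=480)` for 25% loss in 60 s chunks.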

Test Results

For all tests, linear interpolation did the best job of recovering the original spectra, followed by the unevenly spaced techniques. However, when the data loss was intermittent, the unevenly spaced Fourier transforms performed similarly to (if not worse than) the variance replacement. Unsurprisingly, unevenly spaced techniques fare better when the data loss forms long continuous gaps.

Spectra estimated from the data presented above after applying the different replacement strategies: unevenly spaced least-squares Fourier transforms (uneven), the variance of the signal (var), and linear interpolation (interp). The original time series refers to the data before samples were randomly removed. For a given data loss (e.g., 10%), the original spectra were easier to recover with longer continuous gaps (d vs a)
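For the no-replacement approach, one common implementation of an unevenly spaced least-squares Fourier transform is the Lomb-Scargle periodogram; a sketch using `scipy.signal.lombscargle` (which may differ from the exact code used in the benchmark tests) looks like:

```python
import numpy as np
from scipy.signal import lombscargle

# Spectrum of a gappy record without replacement: drop the missing
# samples and evaluate the Lomb-Scargle periodogram on what remains.
fs = 8.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
u = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)
u[100:148] = np.nan                        # a 6 s gap

good = ~np.isnan(u)
freqs = np.linspace(0.05, fs / 2, 256)     # cyclic frequencies (Hz)
pgram = lombscargle(t[good], u[good] - u[good].mean(),
                    2 * np.pi * freqs)     # lombscargle expects rad/s
peak = freqs[np.argmax(pgram)]             # recovers the 0.5 Hz line
```

Because only the good samples are passed in, no assumption is made about what the velocity did inside the gap, which is why this approach degrades when the loss is intermittent rather than concentrated in long chunks.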

Recommendations

  • Use linear interpolation, and record the percent of good samples in each segment in the NetCDF Level 3 data.
  • Reject and flag [math]\displaystyle{ \varepsilon }[/math] associated with segments with more than 10% data loss.
  • Record the threshold used for flagging and report it in the NetCDF file at Level 4.
    • The rejection threshold may be relaxed if the data loss forms long continuous chunks of several seconds, provided the actual time series are tested to establish whether the spectra can tolerate more missing samples. Recording the threshold used at Level 4 permits others to exclude these data at a later time.
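The first two recommendations reduce to a simple per-segment check; a sketch (hypothetical helper name and NaN convention, with the percent-good value intended for the Level 3 NetCDF file) might look like:

```python
import numpy as np

def flag_segment(u, max_loss=0.10):
    """Return (percent_good, reject) for one 30 min velocity segment.

    percent_good is recorded in the Level 3 data; reject is True when
    more than max_loss of the segment is missing, in which case the
    associated epsilon estimate should be flagged.
    """
    frac_missing = np.isnan(u).mean()
    percent_good = 100.0 * (1.0 - frac_missing)
    return percent_good, frac_missing > max_loss

u = np.ones(30 * 60 * 8)       # 30 min at 8 Hz (14400 samples)
u[:1728] = np.nan              # 12% of the segment missing
pct, reject = flag_segment(u)  # 88% good -> segment is rejected
```

The `max_loss` value mirrors the 10% threshold above and would itself be recorded in the Level 4 file.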

Return to Preparing quality-controlled velocities