
Proceedings of the Institute of Acoustics

 

Interoperable image-based change detection

 

R. Klemm, ATLAS ELEKTRONIK GmbH, Bremen, Germany
J. Groen, ATLAS ELEKTRONIK GmbH, Bremen, Germany
H. Schmaljohann, Bundeswehr Technical Center for Ships and Naval Weapons, Maritime Technology and Research, Kiel, Germany

 

1 INTRODUCTION

 

Surveys to monitor important underwater areas have been revolutionised in terms of area coverage and resolution, as authorities increasingly gain access to autonomous underwater vehicles (AUVs) equipped with synthetic aperture sonar (SAS) sensors. One promising methodology is change detection on these high-quality sensor data. For image-based change detection, SAS images of the same scene recorded at different times are co-registered so that changes, such as newly deployed objects, can be detected. For compatible partner datasets, image-based change detection has a clear advantage over conventional detection performed on individual images. Various factors determine whether a partner dataset is compatible for processing, and thus whether successful image-based change detection is possible. These factors include, for example, differences in planning parameters such as course differences, AUV track differences, the temporal baseline, or the use of different sonar systems. In operational situations, change detection with sonar data from different nations, and thus different systems, may prove beneficial. There is a wide range of possible system variations: for example, different AUVs equipped with different SAS systems using various processing chains can be employed to acquire SAS imagery. In order to identify a compatible partner dataset, the limits of valid planning parameters must be known. For interoperable operations it is essential to evaluate whether registration of SAS imagery from different systems is possible without loss, or possible at all.

 

In the past decade, extensive research on image-based change detection for SAS imagery has demonstrated the potential of the method. First experiments with incoherent change detection for SAS were presented in [1], [2], and [3]. In [4], among others, a thorough analysis of coherent and incoherent change detection was published; in that work, incoherent change detection was shown to be more robust against environmental conditions than coherent change detection. In [5], incoherent change detection was likewise shown to be very robust to challenging environmental conditions.

 

In this paper, Section 2 briefly describes the automated processing chain and the parameters that play an important role. In Section 3, image data from different systems are registered and the results are presented. The image data used were recorded in a joint measurement campaign of WTD 71 and ATLAS ELEKTRONIK GmbH using the AUVs SeaOtter and SeaCat with the sonar sensors Vision SAS Mk1 and Vision SAS Mk2, respectively. By operating both of these systems, different frequencies, processing chains, antennas and vehicles can be analysed in the context of interoperable change detection. Exemplary results for various configurations are also presented in this section. To investigate the relationship between quality metrics, such as the achievable degree of correlation, and planning parameters, such as the course difference, an empirical study over all available partner datasets is carried out in Section 4. Based on these results, the change detection success rate as a function of the planning parameters is statistically evaluated. The aim of this paper is to demonstrate the interoperability of image-based change detection and to derive the limits of valid planning parameters for successful change detection.

 

2 IMAGE-BASED CHANGE DETECTION

 

2.1 Incoherent change detection

 

In this paper the SAS imagery for change detection is processed without phase information, i.e. incoherent change detection is considered. The algorithmic steps are implemented in an automatic processing chain with five main modules as shown in Figure 1. The five individual steps are (1) pre-processing, (2) image size adjustment, (3) registration, (4) difference image generation, and (5) object detection. The pre-processing includes a median image normaliser and an optional filter. The image size adjustment selects the valid areas of the image, resamples the image pairs to the same pixel size, and performs zero padding to ensure equal image sizes at the input of the image registration. Image registration is subdivided into global and fine registration for computational reasons. Global registration first determines a rigid rotational and translational morphing of the complete image by correlating the rotated base image with the partner image, after which fine registration determines a patch-wise morphing field based on local 2D correlation. The size of these patches is 64 × 64 pixels and the overlap is 8 pixels in each direction. The resulting morphing field is then applied to the base image, which afterwards geographically matches the partner image. From the co-registered pair a difference image is created using the difference operator (in dB); this image is typically much smoother than the individual images. The idea is that only changes appear in this difference image, as long as the registration is sufficiently accurate. Areas resulting from the prior zero padding are removed in the difference image. As a last processing step, an object detector based on template matching is applied directly to the difference image. The processing steps shown lead to the difference images and the corresponding detections as output. Further details on these steps can be found in [5].
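As a minimal illustration of steps (3) and (4), the following sketch registers a partner image to a base image via translation-only FFT cross-correlation and forms the dB difference image. This is a simplification of the processing described above (no rotation search and no patch-wise fine registration), and all function names are illustrative:

```python
import numpy as np

def global_register(base, partner):
    """Estimate the global shift (dy, dx) that aligns the partner image
    to the base image via circular FFT cross-correlation. This is a
    translation-only stand-in for the rotation + translation search."""
    corr = np.fft.ifft2(np.fft.fft2(base) * np.conj(np.fft.fft2(partner))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map shifts from [0, N) to a signed range around zero
    if dy > base.shape[0] // 2:
        dy -= base.shape[0]
    if dx > base.shape[1] // 2:
        dx -= base.shape[1]
    return int(dy), int(dx)

def difference_image_db(base, partner, eps=1e-12):
    """Difference operator in dB between two co-registered intensity images."""
    return 10.0 * np.log10((base + eps) / (partner + eps))

# toy scene: the partner image is the base image circularly shifted by (3, -2)
rng = np.random.default_rng(0)
base = rng.uniform(0.5, 1.5, (64, 64))
partner = np.roll(base, (3, -2), axis=(0, 1))

dy, dx = global_register(base, partner)
aligned = np.roll(partner, (dy, dx), axis=(0, 1))
diff = difference_image_db(base, aligned)  # ~0 dB everywhere: no changes
```

A newly placed object would survive in `diff` as a localised dB anomaly, which step (5) would then pick up with template matching.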


 

Figure 1: Block diagram for the incoherent change detection processing chain.

 

2.2 Database parameters

 

In order to select partner missions and partner legs automatically, each mission and each leg is stored in a database. Before a change detection job is started, suitable partners can be identified based on a database query. This database contains information about each leg, such as the individual start time, course, depth, altitude, ranges and survey area, as well as mission-related information such as the centre frequency and sensor type. Partner information for each partner leg that has an overlap area with another leg is stored in a separate partner table. Here, information such as time differences, altitude differences, or overlap quantities can be found. Two related features that are intuitively important for determining suitable partners are overlap and leg displacement. In order to capture these two features in clear metrics, the parameters line piece average distance and overlap percentage are introduced. The line piece average distance d̄ gives information about the distance between two partner legs. It is calculated by

 

d̄ = (d1 + d2 + d3 + d4) / 4.

The distances d1, d2, d3 and d4 are calculated as depicted in Figure 2. In addition to this parameter, the line piece minimum distance dmin is given by

 

dmin = min(d1, d2, d3, d4).

The overlap percentage o is calculated with respect to the union surface of the base and the partner leg. Additionally, two further measures of overlap are calculated: the overlap to base ob describes the overlap percentage in relation to the image area of the base leg, and the overlap to partner op describes the overlap percentage in relation to the image area of the partner leg.
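The two distance parameters and the overlap fractions can be sketched as follows. The exact construction of d1 to d4 is only given in Figure 2; the sketch assumes they are the distances from each leg endpoint to the respective other leg, and it models the image footprints as axis-aligned rectangles. All names are illustrative:

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to the line segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    denom = dx * dx + dy * dy or 1.0  # guard against zero-length legs
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / denom))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def line_piece_distances(leg_a, leg_b):
    """d1..d4 read as endpoint-to-opposite-leg distances (an assumed
    reading of Figure 2); returns (average distance, minimum distance)."""
    (a0, a1), (b0, b1) = leg_a, leg_b
    d = [point_segment_dist(a0, b0, b1), point_segment_dist(a1, b0, b1),
         point_segment_dist(b0, a0, a1), point_segment_dist(b1, a0, a1)]
    return sum(d) / 4.0, min(d)

def overlap_fractions(rect_a, rect_b):
    """Overlap o (w.r.t. union), ob (w.r.t. base area) and op (w.r.t.
    partner area) for axis-aligned footprints (x0, y0, x1, y1)."""
    ax0, ay0, ax1, ay1 = rect_a
    bx0, by0, bx1, by1 = rect_b
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))
    inter = iw * ih
    area_a = (ax1 - ax0) * (ay1 - ay0)
    area_b = (bx1 - bx0) * (by1 - by0)
    return inter / (area_a + area_b - inter), inter / area_a, inter / area_b

# two parallel 100 m legs, 10 m apart; half-overlapping footprints
d_avg, d_min = line_piece_distances(((0, 0), (100, 0)), ((0, 10), (100, 10)))
o, ob, op = overlap_fractions((0, 0, 100, 100), (0, 50, 100, 150))
```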

 

 

Figure 2: Sketch of the distances between two legs.
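The database-driven partner selection described at the start of this section might be expressed as a query like the following sketch. The schema, column names and threshold values are hypothetical, chosen only to mirror the leg and partner tables and the overlap criterion used later in this paper:

```python
import sqlite3

# Hypothetical minimal schema mirroring the leg and partner tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE legs (
    leg_id INTEGER PRIMARY KEY,
    mission_id INTEGER, start_time TEXT,
    course_deg REAL, altitude_m REAL, centre_freq_hz REAL, sensor TEXT
);
CREATE TABLE partners (
    base_leg INTEGER, partner_leg INTEGER,
    time_diff_h REAL, altitude_diff_m REAL,
    overlap_to_base REAL, overlap_to_partner REAL,
    line_piece_avg_dist_m REAL
);
""")
con.executemany("INSERT INTO legs VALUES (?,?,?,?,?,?,?)", [
    (1, 10, "2018-05-01T08:00", 90.0, 8.0, 150e3, "Mk1"),
    (2, 11, "2018-05-02T09:00", 92.0, 8.5, 300e3, "Mk2"),
    (3, 12, "2018-05-03T10:00", 170.0, 8.2, 300e3, "Mk2"),
])
con.executemany("INSERT INTO partners VALUES (?,?,?,?,?,?,?)", [
    (1, 2, 25.0, 0.5, 0.92, 0.88, 5.1),
    (1, 3, 50.0, 0.2, 0.85, 0.90, 30.0),
])

# Select compatible partners for leg 1: both overlap fractions above
# 80 % and a small track displacement (illustrative thresholds).
rows = con.execute("""
    SELECT p.partner_leg FROM partners p
    WHERE p.base_leg = 1
      AND p.overlap_to_base > 0.8 AND p.overlap_to_partner > 0.8
      AND p.line_piece_avg_dist_m < 25.0
""").fetchall()
# leg 2 qualifies; leg 3 is displaced too far
```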

 

2.3 Quality parameters

 

During the registration of two images, certain quality measures are determined. In the global registration step, a global correlation measure is determined by correlating the two partner images. The base image is rotated in order to achieve the maximum global correlation. During fine registration, a median fine correlation measure is calculated. This measure is the median of the correlation values of all registered patches. In this step a spikiness measure is also calculated by

 

 

with the shifts of each registered patch in along-track direction Δx_ij and in across-track direction Δy_ij. The matrices Δx and Δy are also called the morphing field. The index i = 1, …, N_x is the along-track pixel index and j = 1, …, N_y is the across-track pixel index. N_x and N_y describe the size of the morphing fields. The morphing error rates are calculated by

 


The mean morphing error rate for along- and across-track is given by ε = (ε_x + ε_y)/2. Among others, these quality measures are used to identify successful image registration.
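Two of these quality measures can be sketched over a patch-wise morphing field as below. The paper's exact error-rate formula is not reproduced here; the sketch assumes an error rate that counts patches whose shift exceeds a plausible search range, which is an assumption, not the definition above:

```python
import numpy as np

def morphing_stats(dx, dy, corr, max_shift=8):
    """Quality measures over a patch-wise morphing field.
    dx, dy: per-patch shifts (N_x x N_y); corr: per-patch peak correlation.
    The error-rate definition (percentage of patches with |shift| beyond
    max_shift) is an assumed stand-in for the paper's formula."""
    median_fine_corr = float(np.median(corr))
    eps_x = float(np.mean(np.abs(dx) > max_shift)) * 100.0  # in percent
    eps_y = float(np.mean(np.abs(dy) > max_shift)) * 100.0
    return median_fine_corr, (eps_x + eps_y) / 2.0

# tiny 2 x 2 morphing field: one outlier patch shift in along-track
mfc, err = morphing_stats(np.array([[0.0, 1.0], [20.0, 2.0]]),
                          np.zeros((2, 2)),
                          np.array([[0.2, 0.4], [0.6, 0.8]]))
```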

 

In addition to the registration quality metrics, the false alarm reduction ΔFA is calculated. To this end, the number of detections in the difference image is compared to the number of detections in the source image (see processing step 5). The difference in the number of detections is the false alarm reduction. If the false alarms indeed show a significant reduction, change detection was successful and achieved its ultimate goal.
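The false alarm reduction itself is a simple count difference. In the sketch below a toy threshold detector stands in for the template-matching detector of step (5); the names and the 6 dB threshold are illustrative:

```python
import numpy as np

def detect(image_db, threshold_db=6.0):
    """Toy detector: count pixels exceeding a dB threshold
    (a stand-in for the template-matching detector of step 5)."""
    return int(np.count_nonzero(image_db > threshold_db))

def false_alarm_reduction(source_db, diff_db, threshold_db=6.0):
    """Delta-FA: detections in the source image minus those remaining
    in the difference image; positive values indicate success."""
    return detect(source_db, threshold_db) - detect(diff_db, threshold_db)

# five bright pixels in the source image, one remains after differencing
source = np.zeros((10, 10)); source[0, :5] = 10.0
diff = np.zeros((10, 10)); diff[0, 0] = 10.0
delta_fa = false_alarm_reduction(source, diff)
```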

 

3 EXEMPLARY RESULTS FOR DIFFERENT SYSTEMS

 

In our study, we registered SAS imagery from different systems. Exemplary results are shown for the registered image pairs listed in Table 1. These examples contain images from different processing chains (ATLAS SAS Processing or WTD 71 SAS Processing), centre frequencies (MF, HF, and VHF), SAS systems (Vision Mk1 and Vision Mk2) and vehicles (ATLAS SeaOtter and ATLAS SeaCat). The centre frequencies for HF and VHF correspond to twice and four times the MF centre frequency, respectively. Besides the system configurations, the quality factors median fine correlation and mean error rate are also listed. The "Proc" column lists which processing chain was used to generate the SAS imagery.

 

The examples in Figure 3 to Figure 7 show that in principle incoherent change detection can be applied to SAS imagery generated with different antennas, different processing chains, different vehicles and large time differences of approximately 3.5 years. The dynamic colour range of each image is 30 dB. However, Example 5 (Figure 7) does not guarantee success in general for such a time difference. Other influencing variables are the geographical position and the associated environmental parameters such as current, weather, water depth, and bottom type. In Figure 3, the sidelobes of a strongly reflecting object can be seen in the registered image, which appear as shadows in the difference image. Other objects that are visible in both the registered base image and the partner image vanish. Examples for operational use-case scenarios are shown in Figure 4 and Figure 5: unwanted clutter disappears in the difference image while newly placed objects appear (Figure 4). Figure 6 and Figure 7 additionally show results for registered images generated with different processing chains and different frequency bands.

 

Table 1: List of configurations for the five examples, shown in Figure 3 - Figure 7.

 


 

Figure 3: Example 1: Left: registered image (Vision SAS Mk2, VHF); Middle: Static partner image (Vision SAS Mk1, HF); Right: Difference image.

 

 

Figure 4: Example 2: Left: registered image (Vision SAS Mk2, VHF); Middle: Static partner image (Vision SAS Mk1, HF); Right: Difference image.

 

 

Figure 5: Example 3: Left: registered image (Vision SAS Mk2, VHF); Middle: Static partner image (Vision SAS Mk1, HF); Right: Difference image.


 

Figure 6: Example 4: Left: registered image (Vision SAS Mk1, MF, WTD71); Middle: Static partner image (Vision SAS Mk1, HF, ATLAS); Right: Difference image.

 

 

Figure 7: Example 5: Left: registered image (Vision SAS Mk1, HF, WTD71); Middle: Static partner image (Vision SAS Mk1, HF, ATLAS); Right: Difference image.

 

As listed in Table 1, the mean error rate was below 20 % in most cases. However, in Examples 1 and 2 the error rates are at 40.06 % and 26.32 %, respectively. Such error rates can occur when environmental conditions, such as the impact of internal waves or a strong swell in shallow water (8 m to 15 m deep), cause the image quality to deteriorate beyond certain bounds. For Example 5 the valid image range was deduced from the ping-to-ping correlation, which comes as a by-product of the SAS processing.

 

4 STATISTICAL ANALYSIS ON SAS IMAGERY

 

ATLAS ELEKTRONIK GmbH and WTD 71 have conducted several measurement campaigns that included surveys with AUVs equipped with a SAS. The data acquired with the Vision SAS Mk1 and the SeaOtter AUV between 2013 and 2018 were included in the aforementioned database structure, and SAS imagery was generated for all of these data. Note that these datasets were not explicitly acquired for change detection purposes. The automated change detection processing was carried out for all datasets for which the overlap criterion (o_p, o_b) > 80 % is fulfilled. This criterion was introduced to avoid extreme processing times and unsuccessful registrations. Furthermore, the processing time was reduced by using a reduced resolution of 5 cm × 5 cm. It should be noted that leg pairs that were planned in opposite course directions are also included in the analysis. Results with a global correlation equal to zero, or a median fine correlation equal to zero, are neglected for this evaluation. In total, 3661 samples are used for the further statistical investigations. Successful change detection was identified by a false alarm reduction ΔFA ≥ 0. With this definition, thresholds for successful processing were computed by maximizing P(B)·P(C) = P(x_p ≥ ν_p)·P(x_p < ν_p), where the event B = {x_p ≥ ν_p} corresponds to successful change detection and the event C = {x_p < ν_p} corresponds to unsuccessful change detection. Here, x_p is a realization of the current parameter and ν_p is the threshold for this particular parameter. One can, for example, consider the probability of successful change detection for a course difference beyond a certain threshold. In this case change detection is more likely to fail for a higher course difference threshold, and therefore P(B) will decrease and P(C) will increase. The probability of event B is estimated by P(B) = ∫ from ν_p to ∞ of f̂_B(x) dx, where f̂_B is the estimated density function of the random variable X_p for successful change detection. The probability of event C is estimated by P(C) = ∫ from -∞ to ν_p of f̂_C(x) dx, where f̂_C is the estimated density function for unsuccessful change detection.
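An empirical version of this threshold search can be sketched directly on the two sample populations, replacing the density integrals by sample fractions. The toy data below are synthetic and only illustrate the mechanics, not the paper's dataset:

```python
import numpy as np

def optimal_threshold(x_success, x_fail):
    """Scan candidate thresholds nu and maximize P(B) * P(C), where
    P(B) is the fraction of successful runs with metric >= nu and
    P(C) the fraction of failed runs with metric < nu."""
    candidates = np.unique(np.concatenate([x_success, x_fail]))
    best_nu, best_crit = float(candidates[0]), -1.0
    for nu in candidates:
        p_b = np.mean(x_success >= nu)
        p_c = np.mean(x_fail < nu)
        if p_b * p_c > best_crit:
            best_crit, best_nu = float(p_b * p_c), float(nu)
    return best_nu, best_crit

# well-separated synthetic populations, e.g. a correlation-like metric
rng = np.random.default_rng(1)
good = rng.normal(0.6, 0.05, 500)  # metric values of successful runs
bad = rng.normal(0.1, 0.05, 500)   # metric values of unsuccessful runs
nu, crit = optimal_threshold(good, bad)
```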

 

 

Figure 8: Probability density function estimates of the median fine correlation for successful and unsuccessful change detection results.

 

The probability density function (PDF) estimates resulting directly from the measurements, shown in Figure 8, are based on 1815 samples with ΔFA ≥ 0 and 1846 samples with ΔFA < 0. The optimal threshold dividing the two PDFs for the median fine correlation is ν_mfc = 0.14. The mean morphing error rate is analysed in the same manner; here the optimal threshold is ν_mer = 49.32. With these thresholds, the dataset is divided into successful and unsuccessful runs, and PDFs for the database metrics are estimated.

 

Table 2: List of the estimated thresholds for compatible partner missions.

 


Table 2 shows the results for the analysed parameters. The criterion value max(P(B)·P(C)) for the time difference, the minimum line piece distance and the absolute altitude difference is low in comparison to the remaining values. This means that these parameters do not appear to have the discriminative ability to predict successful or unsuccessful change detection. With these parameters and the given database, no robust estimate of their limits can be made. As shown in Figure 7, incoherent change detection is also possible for datasets with a time difference of approximately 3.5 years. For a more accurate analysis of the parameter time difference, more data would be required. The criterion is at 0.59 for both the line piece average distance and the absolute course difference. Here, a better separation between compatible and incompatible datasets can be made.

 

In addition to these values, Figure 9 shows the success rate in percent for selected parameters, for datasets that also have a maximum absolute altitude difference below 2.8 m. This limit was chosen since there was no successful registration for an altitude difference above this value. Here, the success rate is visualised versus the line piece average distance and the absolute course difference, which are the two parameters that discriminate best. For an absolute course difference lower than 10° and a line piece average distance lower than 25 m, a success rate of 70.2 % can be expected. In total, this is based on the number of successful change detection results (1365) and the number of failures (579) when these criteria on the database parameters are met. The lower the required deviations, the lower the number of available samples becomes. If a line piece average distance of 6.9 m and an absolute course difference lower than 4.2° are achieved, the success rate rises to 80.1 %. Here, in total 1070 successful and 266 unsuccessful change detection results were identified.
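The success-rate evaluation underlying Figure 9 reduces to counting successes among samples whose planning parameters meet given limits. The sketch below uses synthetic samples under an assumed toy success model, purely to show the mechanics:

```python
import numpy as np

def success_rate(course_diff, dist, success, max_course, max_dist):
    """Fraction of successful runs among samples whose absolute course
    difference and line piece average distance meet both limits."""
    mask = (np.abs(course_diff) < max_course) & (dist < max_dist)
    if not mask.any():
        return float("nan"), 0
    return float(np.mean(success[mask])), int(mask.sum())

# synthetic samples with an assumed toy model: success is more likely
# for small course differences and small displacements
rng = np.random.default_rng(2)
n = 2000
course = rng.uniform(0, 20, n)          # absolute course difference, deg
dist = rng.uniform(0, 50, n)            # line piece average distance, m
success = rng.uniform(0, 1, n) < 0.9 - 0.02 * course - 0.005 * dist

rate, count = success_rate(course, dist, success, max_course=10.0, max_dist=25.0)
```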

 

 

Figure 9: Change detection success rate for all data sets with a maximum height difference below 2.8 m.

 

5 CONCLUSION

 

The results of this study demonstrate by example that incoherent change detection is suitable for processing SAS data from different signal processing chains, with different frequency bands and from different systems. For data recorded under difficult environmental conditions, and therefore with lower image quality, change detection can still be performed, albeit with degraded quality factors. In order to register image data robustly against influences such as multipath propagation or vehicle instabilities, an automatic selection of the start and end range based on image quality measures is conceivable as an extension.

 

It is also possible to adjust the registration such that image quality variations are accounted for. For example, the coarse registration stage could select relevant image areas to be used for correlation while excluding others.

 

In addition, the influence of differences in the AUV mission parameters on the probability of success of change detection was shown based on a statistical study of a large number of processed partner legs. It was shown that for a maximum line piece average distance of 6.9 m, a maximum absolute course difference of 4.23° and a maximum altitude difference of 2.8 m, a success rate of 80.1 % is achieved. In order to explore further limits of change detection, it would also be beneficial to record additional data with different sensing geometries and in waters with more rocks or clutter. Suitable, realistic mine-like targets would enable a more conclusive evaluation by means of ROC curves.

 

6 REFERENCES

 

  1. D. D. Sternlicht, J. K. Harbaugh and M. A. Nelson, 'Experiments in Coherent Change Detection for Synthetic Aperture Sonar', Proceedings of the IEEE/MTS OCEANS 2009 Conference, Mississippi, USA (2009).
  2. C. A. Matthews and D. D. Sternlicht, 'Seabed Change Detection in Challenging Environments', Proceedings of the SPIE: Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XVI, Volume 8017.
  3. Ø. Midtgaard, R. E. Hansen and T. O. Sæbø, 'Change Detection using Synthetic Aperture Sonar: Preliminary Results from the Larvik Trial', Proceedings of the IEEE/MTS OCEANS 2011 Conference, Spain (2011).
  4. V. Myers, 'Processing, interpretation and exploitation of repeat-pass Synthetic Aperture Sonar data', PhD Thesis, ENSTA Bretagne, France (2019).
  5. C. Erdmann and J. Groen, 'Image-based change detection to reduce false alarms in the Vision1200 synthetic aperture sonar', Proc. 32nd Undersea Defence Technology 2019, Stockholm, Sweden (2019).