Proceedings of the Institute of Acoustics, Vol. 45, Pt. 1

Drone-borne SAR change detection techniques

Ali Bekar, Department of EESE, University of Birmingham, UK
Christopher J. Baker, Department of EESE, University of Birmingham, UK
Michail Antoniou, Department of EESE, University of Birmingham, UK

1 INTRODUCTION

Mini consumer drones have several unique characteristics: they are cheap, easily accessible, rapidly deployable, easy to operate, and able to reach difficult areas where human access is dangerous or unsafe. In parallel, drone technology has continued to progress, with technical specifications improving in terms of longer flight times, better positioning accuracy, and higher resistance to changing weather conditions. Hence, drones have been adopted by the synthetic aperture radar (SAR) community as a new radar platform over the last decade. Previous research showed that high-frequency drone-borne SAR systems are capable of providing very fine-resolution imagery of an extended target area by employing data-driven autofocus techniques [1]. Having demonstrated high-resolution imaging, a natural next step is to investigate how such imagery can be exploited through advanced SAR imaging modes. In particular, drone-borne SAR may employ the SAR change detection technique to detect defects and changes that occur over time and to improve situational awareness.

Change detection (CD) relies on the comparison of repeat-pass, temporally separated SAR images. It is a well-known SAR technique and is widely used by the SAR community [2] – [7]. Both large-scale and subtle changes can be detected by CD methods, and the sensitivity to subtle changes in the scene improves as the operating frequency increases. However, precise SAR focusing and high coherence between repeat-pass images are preconditions for SAR CD. Fundamentally, the main challenge in CD is ensuring a high correlation between the image pair. Because the application is short-range, high-frequency, and high-resolution, the drone's deviation from the nominal trajectory results in space-variant errors in addition to space-invariant errors. These errors are unique to each pass, which makes the two-pass imaging geometries mismatched. Even if two focused images are formed, large co-registration errors and spatially invariant/variant phase errors can remain in the resulting change map.

A previous study showed that calibrated target displacements can be detected by both the incoherent ratio method and the classical coherence estimator, and that a SAR system operating at 24 GHz can potentially provide mm-level sensitivity to scene change [8]. However, only phase-ramp errors were compensated in the coherent change detection process, so the algorithm in [8] can be used only when motion errors are low. The aim of this study is to generate CD maps robustly by minimizing the effect of motion errors and to compare the performance of different change detection approaches for high-frequency, high-resolution drone-borne SAR. In this work, the algorithm proposed in [8] is modified to form CD maps more robustly. After examining the capability of drone-borne SAR to detect human activities and car tyre marks experimentally using the conventional change detection methods employed in [8], further improvement in the CD performance of the system is investigated by employing Berger's coherence estimator and by combining coherent and incoherent methods.

In summary, SAR images are formed using data-driven autofocus techniques as described in [1].
Then, geometric mismatches are minimized by employing an intensity-based co-registration algorithm. Next, an interferometric phase error compensation technique that uses a parametric optimization approach is implemented. Finally, incoherent/coherent change maps are formed using the above-mentioned methods, and the improvement in detection performance obtained when coherent and incoherent techniques are used together is investigated.

The remainder of this paper is organized as follows. Section 2 explains SAR change detection, and Section 3 describes the proposed algorithm. The experimental results are presented in Section 4. Finally, conclusions are drawn in Section 5.

2 SAR CHANGE DETECTION

Change detection is a repeat-pass, repeat-geometry SAR method. Change can be detected by incoherent and coherent methods. Incoherent methods are more sensitive to large-scale changes, such as the displacement of a sizeable object, whereas coherent methods can detect subtle changes, such as tyre marks and footprints.

In incoherent change detection (ICD), temporally separated SAR images are compared in terms of mean backscatter power. Conventionally, the intensities of multiple resolution cells are averaged to obtain the mean backscatter power, and a change map can be generated by taking the difference between the intensities. However, this method is not well suited to detecting changes in SAR imagery because the result depends on the absolute intensity levels of the images [4]. As an alternative, the ratio method, which is independent of the absolute intensity level and depends only on the relative change, can be used. The change is calculated using a sliding window:

    CD_1(x, y) = \frac{\sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} |f_1(x+m, y+n)|^2}{\sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} |f_2(x+m, y+n)|^2}    (1)

where f_1 and f_2 are the first and the second complex images obtained from the two passes, x and y represent ground-range and cross-range samples, m and n index resolution cells within the sliding window, and M and N determine the number of resolution cell samples along the ground-range and cross-range directions, respectively. The change can also be expressed in decibels as 10 log(CD_1). In this case, high-intensity pixels indicate leaving targets whereas low-intensity pixels indicate arriving targets.

Coherent change detection (CCD) relies on the comparison of repeat-pass, temporally separated SAR images at the phase level and the subsequent extraction of an image coherence map. The pixels in the CCD map take values in the range [0, 1], and low-coherence pixels indicate change. There are various coherent change estimators [2] – [7]. The classical coherence estimator is given as [1]:

    \gamma(x, y) = \frac{\left| \sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} f_1(x+m, y+n) \, f_2^*(x+m, y+n) \right|}{\sqrt{\sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} |f_1(x+m, y+n)|^2 \, \sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} |f_2(x+m, y+n)|^2}}    (2)

where [\cdot]^* denotes the complex conjugate, and the other parameters and boundary conditions are the same as in (1). As an alternative approach, Berger's coherence estimator [9] can improve the change detection capability [3]. This approach assumes that the repeat-pass, temporally separated SAR images have equal variance and is given by:

    \gamma_B(x, y) = \frac{2 \left| \sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} f_1(x+m, y+n) \, f_2^*(x+m, y+n) \right|}{\sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} |f_1(x+m, y+n)|^2 + \sum_{m=-M/2}^{M/2} \sum_{n=-N/2}^{N/2} |f_2(x+m, y+n)|^2}    (3)

This approach is modified in [10] to improve the estimation accuracy in the case of a low clutter-to-noise ratio (CNR). Also, in [7], a method is proposed for use when the equal-variance assumption is not met.
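For illustration, the three estimators above can be implemented with sliding windows as in the following Python/NumPy sketch. This is a minimal example rather than code from the paper: the function names, the square window of size win, and the use of scipy's uniform_filter for the local sums are implementation choices. Using windowed means instead of sums leaves the ratios in (1)–(3) unchanged because the same window appears in numerator and denominator.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def _local_mean_complex(a, win):
        # Windowed mean of a complex array (real and imaginary parts filtered separately).
        return (uniform_filter(a.real, win, mode="nearest")
                + 1j * uniform_filter(a.imag, win, mode="nearest"))

    def icd_ratio_db(f1, f2, win=5, eps=1e-12):
        # Incoherent ratio map (1) in decibels: local power of pass 1 over pass 2.
        p1 = uniform_filter(np.abs(f1) ** 2, win, mode="nearest")
        p2 = uniform_filter(np.abs(f2) ** 2, win, mode="nearest")
        return 10.0 * np.log10((p1 + eps) / (p2 + eps))

    def ccd_classical(f1, f2, win=5, eps=1e-12):
        # Classical sample coherence estimator (2); values lie in [0, 1].
        num = np.abs(_local_mean_complex(f1 * np.conj(f2), win))
        den = np.sqrt(uniform_filter(np.abs(f1) ** 2, win, mode="nearest")
                      * uniform_filter(np.abs(f2) ** 2, win, mode="nearest"))
        return num / (den + eps)

    def ccd_berger(f1, f2, win=5, eps=1e-12):
        # Berger's equal-variance coherence estimator (3).
        num = 2.0 * np.abs(_local_mean_complex(f1 * np.conj(f2), win))
        den = (uniform_filter(np.abs(f1) ** 2, win, mode="nearest")
               + uniform_filter(np.abs(f2) ** 2, win, mode="nearest"))
        return num / (den + eps)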
Another approach to change detection is to combine incoherent and coherent methods. In [3], a two-stage change detection algorithm is proposed. In this approach, an ICD map is first generated and the pixels that show change are identified by thresholding. A CCD map is then formed using the remaining pixels, and the pixels that show change are again determined by thresholding. Finally, both results are combined, and the final map is formed with a better change detection performance.

Alternatively, the ICD and CCD maps can be combined before thresholding. To do this, the ICD map is scaled so that all of its values fall in the range [0, 1] and the low-intensity pixels show the change, yielding the scaled map CD_B in (4), where min(·) and max(·) give the minimum and maximum values in the data, respectively. Then, the final CD map, CD_T, in (5) can be generated by multiplying the scaled ICD map and the CCD map after setting the maximum value of CD_B to 1.

3 DRONE-BORNE SAR CHANGE DETECTION ALGORITHM

The change detection algorithm includes three main modules: the image formation, incoherent change detection, and coherent change detection modules. The flow chart of the algorithm is shown in Figure 1.

In the image formation module, the two-pass radar raw data are processed to form high-resolution, focused images. In drone-borne SAR, large deviations from the nominal trajectory, which result in range walk errors and spatially variant/invariant phase errors, are common, making the use of motion compensation (MoCo) techniques necessary. We use positional data-based MoCo, local quadratic map-drift (LQMD), and the phase gradient algorithm (PGA), as described in [1]. Briefly, after positional data-based MoCo, the Doppler centroid is adjusted to zero and strong targets are selected in order to perform autofocus. The space-invariant errors are then estimated first by LQMD and then by PGA. After this step, range cell migration correction (RCMC) is performed, and the image is divided into azimuth and range blocks; overlapped or non-overlapped local scenes can be created. Residual phase errors in each local scene are then compensated by implementing PGA. In this work, azimuth compression is performed for each local scene separately because the proposed CCD algorithm employs local images to minimize both geometrical and phase errors between the primary image (master image) and the repeat-pass image (slave image). The process of forming images is identical for the master and slave images. The formed local images from the two-pass radar data are passed to the change detection modules.

The first step in generating a change map is to co-register the images. This involves geometrically modifying the slave image so that it is aligned with the master image. We use an intensity-based image registration approach, which iteratively modifies a geometrical transformation matrix to maximize the similarity of intensity patterns between the images. Mutual information is used as the similarity metric [11]. First, the similarity of the slave image and the master image is calculated based on mutual information. Then, an image transformation matrix is formed using an evolutionary optimizer [12].

Figure 1: Change detection algorithm flow chart.

The transformation matrix includes variables for different types of distortion, such as translation, rotation, scale, and shear. The similarity value is maximized by iteratively modifying the transformation matrix. The process ends, and the registered slave image is generated, when the algorithm reaches the maximum number of iterations or the desired similarity value. In this way, the geometrically aligned slave image is obtained.
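As a sketch of this registration step, the example below uses SimpleITK as a stand-in for the intensity-based, mutual-information-driven registration with an evolutionary optimizer described above. The affine transform model, the iteration count, and the choice to register on image magnitudes and then warp the complex slave image channel by channel are assumptions made for illustration, not details taken from the paper.

    import numpy as np
    import SimpleITK as sitk

    def coregister_slave(master_cplx, slave_cplx):
        # Register on image magnitudes: mutual information is the similarity metric,
        # and an evolutionary optimizer iteratively updates an affine transform
        # (translation, rotation, scale, shear).
        fixed = sitk.GetImageFromArray(np.abs(master_cplx).astype(np.float32))
        moving = sitk.GetImageFromArray(np.abs(slave_cplx).astype(np.float32))

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsOnePlusOneEvolutionary(300)   # number of iterations (assumed)
        reg.SetOptimizerScalesFromPhysicalShift()
        reg.SetInitialTransform(sitk.AffineTransform(2), inPlace=False)
        reg.SetInterpolator(sitk.sitkLinear)
        tx = reg.Execute(fixed, moving)

        # Apply the estimated transform to the complex slave image
        # (real and imaginary channels are resampled separately).
        def warp(channel):
            img = sitk.GetImageFromArray(channel.astype(np.float32))
            out = sitk.Resample(img, fixed, tx, sitk.sitkLinear, 0.0)
            return sitk.GetArrayFromImage(out)

        return warp(np.real(slave_cplx)) + 1j * warp(np.imag(slave_cplx))

Resampling the complex data in this way corrects only the geometric mismatch; residual interferometric phase errors are handled by the CCD phase error compensation described next.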
After completing the image co-registration, the incoherent change map is generated in the ICD module by comparing the backscatter powers of the master and slave images using equation (1). By repeating these steps for the other local images, the full ICD map can be formed.

The CCD module uses the master and slave complex local image data as its inputs. When motion deviations from the nominal trajectory exist, the phase difference of the SAR images may contain phase errors. Although most of the motion errors are compensated in the image formation stage, the residual errors may still degrade the change map. In most cases, the master image and the co-registered slave image contain residual constant, linear, and non-linear phase errors due to imperfect MoCo. Hence, the phase error between the images, φ_err, should be compensated after the image co-registration. φ_err can be expressed as a low-order two-dimensional polynomial:

    φ_err(x̃, ỹ) = w_0 + w_1 x̃ + w_2 ỹ + w_3 x̃^2 + w_4 ỹ^2 + w_5 x̃ ỹ    (6)

where w_i (i = 0, 1, ⋯, 5) are the coefficients to be determined, and x̃ and ỹ are ground-range and cross-range values, respectively. In this work, we employ a Genetic Algorithm (GA) based optimization technique [13] to estimate the coefficients of the CCD phase error function (6). The GA optimization works by maximizing the average coherence over the whole change map. The estimated phase error is compensated in the same way as in autofocusing methods. This technique does not rely on any specific pixel selection and can compensate for phase errors across the entire change map. After CCD phase error compensation, the change map can be generated using (2) or (3). By repeating these steps for the other local images, the full CCD map can be formed. Finally, the ICD and CCD results can be combined to improve the CD result by using the approaches mentioned in Section 2.
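The sketch below illustrates this CCD phase error compensation. It is a simplified example rather than the authors' implementation: scipy's differential evolution stands in for the GA of [13], the normalised image coordinates and the coefficient search ranges are assumptions, and ccd_classical is the coherence function from the earlier sketch.

    import numpy as np
    from scipy.optimize import differential_evolution

    def compensate_ccd_phase_error(master, slave, win=5):
        # 2-D polynomial phase error model as in (6): constant, linear and quadratic
        # terms in normalised ground-range (x) and cross-range (y) coordinates.
        ny, nx = master.shape
        x = np.broadcast_to(np.linspace(-1.0, 1.0, nx)[None, :], (ny, nx))
        y = np.broadcast_to(np.linspace(-1.0, 1.0, ny)[:, None], (ny, nx))
        basis = [np.ones((ny, nx)), x, y, x ** 2, y ** 2, x * y]

        def neg_mean_coherence(w):
            # Build the candidate phase error, remove it from the slave image, and
            # score the result by the mean coherence of the change map.
            phi = sum(wi * b for wi, b in zip(w, basis))
            return -np.mean(ccd_classical(master, slave * np.exp(-1j * phi), win))

        # Search ranges for w0..w5 (assumed, not taken from the paper).
        bounds = [(-np.pi, np.pi)] + [(-20.0 * np.pi, 20.0 * np.pi)] * 5
        result = differential_evolution(neg_mean_coherence, bounds, maxiter=60, seed=0)
        phi_hat = sum(wi * b for wi, b in zip(result.x, basis))
        return slave * np.exp(-1j * phi_hat), result.x

Because the score is the average coherence over the whole local change map, the estimate does not depend on selecting particular pixels, mirroring the property noted above.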
4 EXPERIMENTAL CAMPAIGN

The drone-borne SAR system built at the University of Birmingham, shown in Figure 2, is used in the experiments. Fundamentally, the SAR system has two sub-systems. One is the drone itself, which includes a flight controller, a remote controller, a real-time kinematic (RTK) antenna, and a battery pack. The other is the radar sensing system, which includes an onboard microcontroller (Raspberry Pi), a radar (INRAS Radarbook), a battery, and a laptop as a base station. The collected radar data are stored on the Raspberry Pi and post-processed. The GPS/IMU data are extracted from the drone's flight record log file after the flights. The operating frequency of the radar is 24 GHz with a 500 MHz bandwidth.

Figure 2: (a) Drone-borne SAR system, (b) Drone-borne SAR system diagram.

Experiments were conducted at the University of Birmingham. A photograph of the scene taken from the drone can be seen in Figure 3(a). During the experiment, one Tx and one Rx channel were used. There was grass on the ground that was around 2.5 cm long. In order to georeference the target area, a number of corner reflectors were placed throughout the area. The drone was flown at an altitude of roughly 20 m and a velocity of roughly 3 m/s, and created 40 m-long SAR apertures. Here, we focus on a single local image, shown by the purple rectangle in Figure 3(a), which has a size of 10 m by 16 m and contains four corner reflectors (three in the front and one in the back). The SAR geometry of the trial is shown in Figure 3(b). The slant range to the scene centre was 42 m, with an average look angle of 63°. The drone attempted to repeat the same trajectory twice, with a 25-minute temporal separation, and collected SAR data from both passes. During the first flight, there was a person standing on the grass, as shown in Figure 3(c). The person moved out of the target area by following the path shown in Figure 3(a) before the second pass, and a car then followed the track in Figure 3(a). The car is shown in Figure 3(d).

As explained in Section 3, the processing begins with the image formation. Before implementing the space-invariant MoCo, the Doppler centroid is set to zero. Strong target selection was based on the corner reflectors placed in the scene. Following the space-invariant MoCo, RCMC is applied, and the master and slave images are divided into 4 azimuth and 2 range blocks. After that, another PGA is applied to the local image shown in the purple rectangle in Figure 3(a); in this way, residual phase errors in the local image of interest are minimized. The summed phase errors of the space-invariant and space-variant estimations for the local image of interest are shown in Figure 4(a) and (b). The formed master and slave images are illustrated in Figure 4(c) and (d). The images have a 6 cm azimuth resolution and a 40 cm ground-range resolution. The standing person and her shadow are clearly visible in the master image.

Figure 3: Experimental scenario; (a) Scene, (b) Data collection geometry, (c) Standing person during the 1st flight, (d) Moving car before the 2nd flight.

The next crucial step for change detection is the image co-registration. A checkerboard composed of alternating rectangular regions from the master and slave images can be seen in Figure 5(a). The registration problem is clearly visible when the main lobes and side lobes of the corner reflectors are examined. After implementing the intensity-based registration algorithm described in the previous section, the slave image is aligned with the master image, as can be seen in the checkerboard in Figure 5(b).
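A checkerboard composite such as those in Figure 5 can be produced as in the brief sketch below; the number of tiles and the dB scaling are arbitrary display choices, not values from the paper.

    import numpy as np

    def checkerboard(master, slave, tiles=8):
        # Alternate rectangular regions of the two magnitude images (in dB) so that
        # residual misalignment shows up as discontinuities at the tile boundaries.
        a = 20.0 * np.log10(np.abs(master) + 1e-12)
        b = 20.0 * np.log10(np.abs(slave) + 1e-12)
        ny, nx = a.shape
        yy, xx = np.mgrid[0:ny, 0:nx]
        mask = ((yy * tiles // ny) + (xx * tiles // nx)) % 2 == 0
        return np.where(mask, a, b)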
By comparing the images using the incoherent change detection method, the map shown in Figure 6(a) is obtained. Here, the change is expressed in decibels. As mentioned before, a low-intensity area indicates targets that arrive at the scene whereas a high-intensity area indicates targets that leave the scene. In the scenario, the person is removed from the scene; hence, the person appears with a high intensity (almost 6 dB). However, the person's shadow has a low intensity (almost -6 dB), because the shadow area shown in the master image is occupied by clutter in the slave image. The car tyre marks and human footprints, however, are not visible.

Figure 4: (a) Total estimated phase error for the master image, (b) Total estimated phase error for the slave image, (c) Formed master image, (d) Formed slave image.

Figure 5: (a) Checkerboard before registration, (b) Checkerboard after registration.

Figure 6: (a) ICD map, (b) CCD map before phase error compensation, (c) CCD map formed by the classical coherence estimator after phase error compensation, (d) CCD map formed by Berger's coherence estimator, (e) The combined result of the ICD and CCD maps, (f) The CD result after applying an adaptive threshold to (e).

After image co-registration, the CCD result obtained with the classical coherence estimator is shown in Figure 6(b). As can be seen, the map is completely distorted, and no change is visible due to phase errors. In order to compensate for these errors, the GA-based optimization method is applied and the coefficients of (6) are estimated. As a result, an average coherence of 0.76 is achieved after compensating for the CCD phase errors. The resulting CCD map is shown in Figure 6(c). Removing the majority of the CCD phase errors has made the car tyre marks, and the person who left the scene prior to the second flight, apparent in this figure. The decorrelation on the right and left sides of the corner reflectors is due to the sidelobes of the reflectors seen in Figure 4(c) and (d).

In Figure 6(d), the CCD map formed using Berger's coherence estimator is shown. In this particular data set, the CNR is high and the SAR images have almost equal variance. As expected, this estimator is more sensitive to large-scale changes: although the car tyre marks remain almost unchanged, the coherence for the person reduces from 0.49 to 0.35 compared with the classical coherence estimator.

Regarding the combination of the ICD and CCD results, both the two-stage change detection method and the approach explained in Section 2 give similar results. The combined result of the ICD and CCD maps is shown in Figure 6(e). The CD map is formed by using (4) and (5), and the maximum value in the map is set to 1. The person and her shadow are more clearly visible than with the classical and Berger's coherence estimators; the coherence value of the person is 0.12. When the adaptive threshold described in [14] is applied to this change map, the binary change map shown in Figure 6(f) is obtained. As can be seen, the car tyre marks, the person, and even the person's footprints (in the dashed-line rectangle) are detected.

5 CONCLUSION

In this paper, a change detection algorithm is proposed for high-resolution, high-frequency, drone-borne SAR imagery, and different change detection approaches, namely incoherent change detection, the classical coherence estimator, Berger's coherence estimator, and the combination of ICD and CCD methods, are investigated. The experiments performed have demonstrated that the drone-borne SAR system operating at 24 GHz has the ability to detect human footprints and car tyre marks. It is also seen that Berger's alternative estimator is more sensitive to large-scale changes than the classical coherence estimator, and that large-scale changes become more visible when the ICD and CCD methods are used together.

6 REFERENCES

1. A. Bekar, M. Antoniou and C. J. Baker, "Low-Cost, High-Resolution, Drone-Borne SAR Imaging," IEEE Transactions on Geoscience and Remote Sensing, vol. 60, pp. 1-11, 2022.
2. C. V. Jakowatz, Jr., D. E. Wahl, P. H. Eichel, D. C. Ghiglia, and P. A. Thompson, Spotlight-Mode Synthetic Aperture Radar: A Signal Processing Approach. Boston, MA, USA: Kluwer, 1996.
3. M. Cha, R. D. Phillips, P. J. Wolfe, and C. D. Richmond, "Two-stage change detection for synthetic aperture radar," IEEE Transactions on Geoscience and Remote Sensing, vol. 53, no. 12, pp. 6547-6560, Dec. 2015.
4. E. J. M. Rignot and J. J. van Zyl, "Change detection techniques for ERS-1 SAR data," IEEE Transactions on Geoscience and Remote Sensing, vol. 31, no. 4, pp. 896-906, Jul. 1993.
5. M. Preiss, D. A. Gray and N. J. S. Stacy, "Detecting scene changes using synthetic aperture radar interferometry," IEEE Transactions on Geoscience and Remote Sensing, vol. 44, no. 8, pp. 2041-2054, Aug. 2006.
6. D. E. Wahl, D. A. Yocky, C. V. Jakowatz and K. M. Simonson, "A new maximum-likelihood change estimator for two-pass SAR coherent change detection," IEEE Transactions on Geoscience and Remote Sensing, vol. 54, no. 4, pp. 2460-2469, Apr. 2016.
7. M. Wang, G. Huang, J. Zhang, F. Hua and L. Lu, "A weighted coherence estimator for SAR coherent change detection," IEEE Transactions on Geoscience and Remote Sensing, vol. 60, pp. 1-12, 2022.
8. A. Bekar, M. Antoniou and C. J. Baker, "Change detection for high-resolution drone-borne SAR at high frequencies - First results," in Proc. 2023 IEEE Radar Conference (RadarConf23), San Antonio, TX, USA, 2023, pp. 1-5.
9. T. Berger, "On the correlation coefficient of a bivariate, equal variance, complex Gaussian sample," Annals of Mathematical Statistics, vol. 43, no. 6, pp. 2000-2003, 1972.
10. D. E. Wahl, D. A. Yocky, C. V. Jakowatz, and K. M. Simonson, "A new maximum-likelihood change estimator for two-pass SAR coherent change detection," IEEE Transactions on Geoscience and Remote Sensing, vol. 54, no. 4, pp. 2460-2469, Apr. 2016.
11. D. Loeckx, P. Slagmolen, F. Maes, D. Vandermeulen, and P. Suetens, "Nonrigid image registration using conditional mutual information," IEEE Transactions on Medical Imaging, vol. 29, no. 1, pp. 19-29, Jan. 2010.
12. M. Styner, C. Brechbuehler, G. Székely, and G. Gerig, "Parametric estimate of intensity inhomogeneities applied to MRI," IEEE Transactions on Medical Imaging, vol. 19, no. 3, pp. 153-165, 2000.
13. J. H. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. Cambridge, MA, USA: MIT Press, 1992.
14. D. Bradley and G. Roth, "Adaptive thresholding using the integral image," Journal of Graphics Tools, vol. 12, no. 2, pp. 13-21, 2007.