
Proceedings of the Institute of Acoustics

 

Platform trajectory estimation based on interferometry, DPCA results and consistent imaging

 

H. Schmaljohann, WTD 71, Eckernförde, Germany
B. Bonnett, Helmut Schmidt University, Hamburg, Germany
T. Fickenscher, Helmut Schmidt University, Hamburg, Germany

 

1 INTRODUCTION

 

Generating well focused high-resolution synthetic aperture sonar (SAS) images requires accurate knowledge of the system trajectory, the locations of the scatterers in the scene, and the sound speed. The image reconstruction process involves coherent summation of data from multiple pings. However, the accumulated errors in the assumed positions and sound speed will cause a loss of coherence between pings. A positional accuracy of one-tenth of a wavelength is often used as a criterion.1 As GPS signals do not propagate underwater, system navigation is typically performed using an inertial navigation system (INS) to integrate velocity or acceleration measurements from an initial position. Even a high-grade INS (positional accuracy of 0.01 % of the distance travelled, or 1 cm per 100 m) will not achieve sufficient accuracy over an operationally useful distance.
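As a rough illustration of these numbers (a minimal sketch assuming a nominal sound speed of 1500 m/s; the 150 kHz band and approximately 1.7 km leg length are taken from Section 2), the λ/10 criterion and the drift of a high-grade INS over a single leg can be compared as follows:

```python
import numpy as np

# Assumed nominal values: 1500 m/s sound speed; the 150 kHz band and ~1.7 km
# legs are taken from Section 2. Purely illustrative arithmetic.
sound_speed = 1500.0                          # m/s
frequency = 150e3                             # Hz
wavelength = sound_speed / frequency          # ~1 cm
position_tolerance = wavelength / 10.0        # lambda/10 criterion, ~1 mm

leg_length = 1700.0                           # m
ins_drift = 1e-4 * leg_length                 # 0.01 % of distance travelled, ~17 cm

print(f"lambda/10 tolerance: {position_tolerance * 1e3:.1f} mm")
print(f"high-grade INS drift over one leg: {ins_drift * 1e2:.0f} cm")
```

Even under these favourable assumptions, the INS drift over one leg exceeds the λ/10 tolerance by roughly two orders of magnitude, which is why acoustic motion compensation is required.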

 

Motion compensation or micronavigation is a widely-applied technique to use the acoustic echo data recorded by the sonar to update the trajectory and improve the ping-to-ping coherence, with the displaced phase centre antenna (DPCA) algorithm2 being the most common method. In addition to improving the image quality, a more accurate knowledge of the system position improves the localization of objects of interest detected in the SAS image. This is beneficial to locating the object again during subsequent missions, especially if an optical system or divers are to be deployed in an area of low visibility.

 

In this paper, we present two modifications to the DPCA algorithm to improve the estimated trajectories. Section 2 describes the system used to collect data on a test mission, and shows that standard motion compensation on this data yields trajectories from the different sensors that diverge from one another. In Section 3, we estimate the bathymetry of the seafloor and show that taking it into account can improve the agreement between the trajectories. As the sonar receiver arrays are displaced from the INS, the corrections from the DPCA algorithm must be transformed to the INS position. In Section 4, we assume that the system's depth sensor is accurate, i.e., that the INS has minimal vertical drift, and that the remaining accumulated vertical offset between the trajectories is due to a pitch misalignment between the sensors. We show that a pitch correction angle can be found and used to remove this offset. The locations of objects in SAS images generated from these trajectories are then compared to their positions in multi-beam echo sounder (MBES) backscatter images in Section 5. From this, a yaw misalignment angle can be estimated. We finish with a discussion and conclusions in Section 6.

 

2 SYSTEM AND TEST MISSION

 

The results presented in this paper are from data captured with an Atlas Elektronik Vision1200 SAS mounted on an Atlas Elektronik SeaOtter autonomous underwater vehicle (AUV). This SAS has a vertically separated interferometric pair of receiver arrays mounted on either side of the vehicle with the transmitters mounted between the pairs. The data used here was captured in its high frequency (150 kHz) operating band on a mission off Falshöft in the western Baltic (Figure 1(a)) in September 2016. Nine successive passes along east-west tracks were performed with each leg approximately 1.7 km long and separated by approximately 40 m from the previous leg. Figure 1(b) shows the INS trajectory recorded by the AUV on each of these passes. The data from the highlighted leg is used in all subsequent figures showing results from a single leg. The mission was repeated with the same planned trajectories in March 2023.

 

 

Figure 1: (a) Location of the test mission in the western Baltic. (b) The INS trajectory recorded by the AUV on each leg of the mission. The highlighted trajectory is singled out in later figures. Note that the axes have different scales.

 

The trajectory recorded by the INS onboard the AUV will not be sufficiently accurate to form well focused synthetic aperture images. Motion compensation uses the echo data collected by the SAS to correct the INS trajectory1,3. The DPCA algorithm2 is widely used to perform motion compensation. A phase centre is the midpoint between the transmitter and a receiver element, so an array of receivers forms a phase centre array of half the length of the real array. If the system is operated such that the forward distance travelled between pings is less than the length of the phase centre array, then some of the phase centres at the forward end of the array at ping n will overlap some of the phase centres at the aft end of the array at ping n + 1. These overlapping, or redundant, phase centres should record the same echo (ignoring uncorrelated noise). Cross-correlating the data between pings gives an estimate of the shift required to optimally align the data from the redundant phase centres. Fitting a model to the shifts allows estimation of the actual distance travelled by the system between the pings; integrating these ping-to-ping displacements yields an updated trajectory. Figure 2 compares an INS trajectory with four motion-compensated trajectories, each generated using DPCA on the data collected by a different receiver array. As well as differing from the original INS trajectory, the DPCA trajectories differ from each other. If the compensation were ideal, the same trajectory would be generated from all arrays.
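As an illustration of the core DPCA step, the following sketch cross-correlates the data recorded by redundant phase centres of consecutive pings to obtain per-channel time shifts. The function name, data layout, and sampling-rate parameter are hypothetical and not taken from the actual processing chain:

```python
import numpy as np

def estimate_ping_shifts(redundant_prev, redundant_next, fs):
    """Estimate per-channel time shifts between redundant phase centres.

    redundant_prev, redundant_next: complex echoes (channels x samples) from the
    overlapping phase centres of ping n and ping n+1. fs: sampling rate in Hz.
    Hypothetical helper; integer-lag peak pick only, no sub-sample refinement.
    """
    shifts = []
    for a, b in zip(redundant_prev, redundant_next):
        xcorr = np.correlate(b, a, mode="full")          # complex cross-correlation
        lag = int(np.argmax(np.abs(xcorr))) - (len(a) - 1)
        shifts.append(lag / fs)
    return np.asarray(shifts)

# A motion model (e.g. surge, sway and yaw across the array) would then be
# fitted to these shifts and the ping-to-ping displacements integrated.
```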

 

3 INCORPORATING BATHYMETRY

 

In the standard DPCA processing, it is assumed that the seafloor is a flat tilted plane. Obviously, this is not the case, and the height errors caused by this assumption will lead to errors in the generated trajectories. Previous work has shown the benefit of including height information during motion compensation.4,5 Our processing chain includes a sidescan interferometry step which can estimate a coarse bathymetry of the imaged scene.

 

The first step is to generate a pair of sidescan images from the two vertically separated receivers making up the interferometer. For each ping, a set of focal points on the estimated seafloor plane, extending outwards from and broadside to the receiver array, is found. To simplify the subsequent processing, these points are evenly spaced in slant range. A delay-and-sum beamformer is used to focus the energy from all elements of an array onto these points to form one line of a sidescan image. This is repeated for each ping to build up the complete sidescan image.
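A minimal sketch of the delay-and-sum focusing for one ping is given below; all names and the data layout are illustrative assumptions, and the carrier-phase correction needed for complex baseband data is omitted for brevity:

```python
import numpy as np

def sidescan_line(echoes, fs, element_positions, tx_position, focal_points,
                  sound_speed=1500.0):
    """Delay-and-sum focusing of one ping onto a line of focal points.

    echoes: matched-filtered analytic signals, shape (n_elements, n_samples).
    element_positions (n_elements x 3), tx_position (3,) and focal_points
    (n_points x 3) are in metres. Names and layout are illustrative.
    """
    t = np.arange(echoes.shape[1]) / fs
    line = np.zeros(len(focal_points), dtype=complex)
    for i, p in enumerate(focal_points):
        # Two-way travel time: transmitter -> focal point -> each receiver element.
        delays = (np.linalg.norm(p - tx_position)
                  + np.linalg.norm(element_positions - p, axis=1)) / sound_speed
        for ch, tau in enumerate(delays):
            # Interpolate each channel at its delay and sum coherently.
            line[i] += (np.interp(tau, t, echoes[ch].real)
                        + 1j * np.interp(tau, t, echoes[ch].imag))
    return line
```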

 

 

Figure 2: Comparison of the INS trajectory recorded by the vehicle and the motion-compensated trajectories determined with the DPCA algorithm using data from different antennas. Note that the vehicle was travelling west (from right to left) on this leg.

 

The same focal points are used for generating the sidescan images from the two arrays, thus providing an initial coarse coregistration between them. A one-dimensional cross-correlation between the two images is then performed in sliding windows in range. Areas with low correlation are removed, and fine coregistration of each remaining pixel in the images is achieved by fitting a Gaussian curve to the cross-correlation values and finding the location of its peak. An interferogram can now be formed by extracting the interferometric phase of each coregistered window. This can then be unwrapped6 and the position and depth of each original focal point updated to give a bathymetric dataset of points indexed by ping number and slant range. The depth estimated by this procedure for the example leg is shown in Figure 3.
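The sub-sample coregistration and the interferometric phase extraction could look roughly like the sketch below. These are illustrative helper functions; the actual fit may use more points or a different model:

```python
import numpy as np

def subsample_peak(xcorr_mags):
    """Sub-sample peak location from a Gaussian fit through three samples.

    xcorr_mags: cross-correlation magnitudes within one range window.
    """
    k = int(np.argmax(xcorr_mags))
    if k == 0 or k == len(xcorr_mags) - 1:
        return float(k)                        # peak at the edge, no refinement
    # Parabolic fit to the log magnitudes = Gaussian fit through three points.
    y0, y1, y2 = np.log(xcorr_mags[k - 1:k + 2])
    return k + 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)

def window_phase(upper_window, lower_window):
    """Interferometric phase of one coregistered window of complex pixels."""
    return np.angle(np.sum(upper_window * np.conj(lower_window)))
```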

 

This coarse broadside bathymetry can then be taken into account when aligning the data to be cross-correlated during the DPCA algorithm. Figure 4 shows the INS trajectory and the DPCA trajectories resulting from using the bathymetry during processing. Compared to the unaugmented DPCA trajectories in Figure 2, the most obvious change is that the depth components of the trajectories are more consistent. There are also some minor changes in the final lateral positions of the trajectories.

 

4 PITCH CORRECTION

 

The SeaOtter AUV is equipped with a pressure sensor measuring the depth of the system. Although the specific details of the INS processing are not known, since the depth is a directly measured quantity as opposed to an integrated quantity like the lateral position, it is expected that the depth reported by the INS will be reasonably accurate. Therefore, the significant difference in depth at the ends of the compensated trajectories is indicative of an error in the motion compensation. The DPCA algorithm estimates the ping-to-ping motion of the phase centre array in use. This must be referred back to the INS position to be applied to the original trajectory. The INS would typically be mounted near the centre of the system, while the sonar receivers will be mounted on the sides. An error in the relative position of the INS and the receiver would lead to a constant offset in the trajectories.

 

 

Figure 3: The depth of the seafloor to broadside of the vehicle at each ping as estimated by the sidescan interferometry procedure.

 

 

Figure 4: Comparison of the INS trajectory and the motion-compensated trajectories output by the DPCA algorithm taking the broadside bathymetry into account.

 

The varying depth offset observed in Figure 4(b) is, on the whole, increasing with distance travelled. It could therefore be caused by an error in the assumed pitch of the receiver relative to the INS. Assuming that this angular error is constant throughout the trajectory, it can be calculated as

\[
\theta_{\mathrm{pitch}} = \arctan\left(\frac{z_{\mathrm{end,DPCA}} - z_{\mathrm{end,INS}}}{d}\right)
\]

where d is the horizontal distance travelled by the system, and zend,DPCA and zend,INS are the depths at the end of the DPCA and INS trajectories, respectively. Figure 5 shows the pitch correction angles for the different receiver arrays calculated over all nine legs in the mission. If the seafloor is assumed to be a flat plane, the starboard pitch correction angles are reasonably consistent between the legs, while there is some variation in the port pitch correction angles; the 2023 mission shows higher variation than the 2016 mission. Using the sidescan bathymetry during DPCA processing results in more consistent values both across the legs of the mission and between the different receivers for the 2016 mission. In the 2023 mission, the correction angles differ between the port and starboard sides, although they are consistent between the two receivers on each side.
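For reference, a minimal numerical sketch of this correction follows; the variable names and example values are illustrative, not measured results:

```python
import numpy as np

def pitch_correction(z_end_dpca, z_end_ins, d):
    """Constant pitch misalignment (rad) accounting for the final depth offset."""
    return np.arctan2(z_end_dpca - z_end_ins, d)

# Illustrative numbers only: a 1 m depth offset accumulated over a 1.7 km leg
# corresponds to a correction of roughly 0.03 degrees.
theta = pitch_correction(z_end_dpca=31.0, z_end_ins=30.0, d=1700.0)
print(np.degrees(theta))
```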


 

Figure 5: Pitch correction angles estimated for the different receiver arrays over all legs of the mission. The solid markers are from the 2016 mission and the hollow markers are from the 2023 mission.

 


 

Figure 6: Comparison of the INS trajectory and the motion-compensated trajectories determined with the DPCA algorithm incorporating both the sidescan bathymetry and the pitch correction angle.

 

The trajectories obtained after applying this pitch correction are shown in Figure 6. There are only minor changes in the lateral positions of the DPCA trajectories compared to Figure 4; over all legs in the mission, the maximum deviation was 10 cm. The obvious difference is that the vertical components of the DPCA trajectories now closely match the INS depth.

 

5 COMPARISON TO MBES DATA

 

MBES data from a mission overlapping the same area of the seafloor was provided by GEOMAR. This data was obtained in 2020 (i.e., approximately four years after the first SAS mission and three years before the second) with a Teledyne SeaBat T51-R device operating at 800 kHz. The stated resolution of this data is 25 cm (approximately ten times worse than the resolution of the generated SAS images) with a positional accuracy of 5 cm throughout the mission; as the MBES system is ship-mounted, the position is regularly updated with high-precision navigation data available on the vessel.

 

SAS images were generated from the upper receiver arrays using the DPCA trajectories by matched filtering and beamforming the received echoes, and then backprojecting them onto a grid formed from the sidescan bathymetry.8 These images were then manually registered to backscatter images from the MBES in QGIS by finding a common proud object close to the start of the DPCA trajectory and shifting the SAS image until this object was aligned between the two images.
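A simplified single-channel sketch of such a backprojection onto a bathymetric grid is shown below; the function signature and data layout are assumptions and omit details such as element-level beamforming and windowing:

```python
import numpy as np

def backproject(pings, fs, f0, tx_positions, rx_positions, grid_points,
                sound_speed=1500.0):
    """Time-domain backprojection of matched-filtered pings onto a bathymetric grid.

    pings: complex baseband echoes, shape (n_pings, n_samples). tx_positions and
    rx_positions: per-ping 3D positions taken from the DPCA trajectory.
    grid_points: 3D image-grid points resampled from the sidescan bathymetry.
    Single-channel sketch; names and layout are assumptions.
    """
    t = np.arange(pings.shape[1]) / fs
    image = np.zeros(len(grid_points), dtype=complex)
    for ping, tx, rx in zip(pings, tx_positions, rx_positions):
        # Two-way delay from transmitter to each grid point and back to the receiver.
        delays = (np.linalg.norm(grid_points - tx, axis=1)
                  + np.linalg.norm(grid_points - rx, axis=1)) / sound_speed
        samples = (np.interp(delays, t, ping.real)
                   + 1j * np.interp(delays, t, ping.imag))
        # Re-apply the carrier phase removed during basebanding, then sum coherently.
        image += samples * np.exp(2j * np.pi * f0 * delays)
    return image
```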


 

Figure 7: A comparison of MBES backscatter images and SAS images generated with a DPCA trajectory at either end of the leg. The images were manually registered at the start of the trajectory using proud objects. At the end of the trajectory, they are no longer registered due to lateral errors in the DPCA trajectory.

 

Two areas of the images on the starboard side of the example leg are presented in Figure 7. Comparing Figure 7(a) and Figure 7(b), it can be seen that the reference object at the start of the trajectory is well registered. Note that, due to the elapsed time between the SAS mission and the MBES data collection, minor features in the scene such as the background texture will be different. Figure 7(c) and Figure 7(d) make a similar comparison near the end of the trajectory (approximately 1.1 km to the west of the first pair of images). Due to the remaining lateral errors in the DPCA trajectory, the SAS image is misregistered by 2.7 m in northing (across-track) and 0.3 m in easting (along-track) at this location.

 

Figure 8 shows this misregistration for all legs. The misregistration for the INS trajectory was estimated by adding the final offset between the port DPCA and INS trajectories to the port DPCA misregistrations. The along-track misregistration is largely constant over the legs although it differs between the sides. It is lower in magnitude for the 2016 mission than for the 2023 mission. The across-track DPCA misregistration is similar for the two missions, and is consistent between the sides. The INS misregistration is generally larger for the 2023 mission, but the DPCA processing is able to compensate for it to a similar accuracy as in the 2016 mission. From the misregistration, a yaw correction angle can be estimated by assuming the across-track drift is constant over the distance between the start and end reference objects. Mathematically, this is given by

\[
\theta_{\mathrm{yaw}} = \arctan\left(\frac{y_{\mathrm{SAS,end}} - y_{\mathrm{MBES,end}}}{x_{\mathrm{MBES,end}} - x_{\mathrm{MBES,start}}}\right)
\]

where ySAS,end and yMBES,end are the across-track positions of the end reference object in the SAS and MBES images, respectively, and xMBES,end and xMBES,start are the along-track positions of the two reference objects in the MBES image, respectively.
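As a numerical sketch (using the 2.7 m across-track misregistration over roughly 1.1 km quoted above purely as an illustration, not a reported correction value):

```python
import numpy as np

def yaw_correction(y_sas_end, y_mbes_end, x_mbes_end, x_mbes_start):
    """Yaw misalignment (rad) assuming a constant across-track drift rate."""
    return np.arctan2(y_sas_end - y_mbes_end, x_mbes_end - x_mbes_start)

# A 2.7 m across-track misregistration over roughly 1.1 km between the
# reference objects corresponds to a correction of about 0.14 degrees.
print(np.degrees(yaw_correction(2.7, 0.0, 1100.0, 0.0)))
```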


 

Figure 8: Shifts between the MBES and SAS images at the end of the trajectories. The solid markers are from the 2016 mission and the hollow markers are from the 2023 mission.

 

 

Figure 9: Yaw correction angles estimated from the shift between the MBES and SAS images at the end of the trajectories.

 

Figure 9(a) and Figure 9(b) show the yaw correction angle for both sides of the vehicle over all legs for the 2016 and 2023 missions, respectively. In the 2016 mission, after the first two legs the yaw correction angle is reasonably constant with a minor oscillation, presumably due to environmental effects when the vehicle reverses heading between legs. Although information about the currents during the mission is not available, due to the geography of the location the current is likely to be perpendicular to the heading of the vehicle. The yaw correction angles from the 2023 mission have a lower magnitude but there is more drift over the legs. Note that a difference in correction angles is expected as maintenance on the system in the intervening seven years is likely to have resulted in a change of alignment. The environmental conditions between the missions will also have differed.

 

6 DISCUSSION

 

Taking the bathymetry into account during DPCA processing improved the consistency of the output trajectories across the different antennas. Note that the bathymetry of the seafloor in the area of the test mission was relatively smooth. It is expected that the improvement in the DPCA results would be more pronounced for a more complex bathymetry. To ensure consistency between the trajectory and the generated images, the SAS processing should use the same bathymetry when focussing the images.

 

The pitch correction works well for the test mission used here. The depth component of the resulting trajectories closely matches the INS trajectory. Further testing with other missions is required to determine its general applicability. Changes to environmental conditions, e.g., a spatially or temporally varying sound speed, may cause the use of a single value per leg to result in a drift away from the INS depth. It may be more appropriate to calculate this correction angle in a moving window of some number of pings.

 

Calculating the misregistration between the SAS and MBES images manually is not feasible for larger datasets. An automated method, e.g., based on cross-correlation between the images, needs to be developed. As well as allowing more missions to be compared in this manner, the shifts could be calculated at multiple points along the trajectory instead of just at the end. This would allow analysis of how the misregistration changes throughout the trajectory, e.g., whether there is a constant offset or whether it increases in steps at different times. As with the pitch correction, more data is required to determine whether a single value per leg is appropriate. The effect of changes to the scene due to the time difference between the SAS and MBES missions also needs to be investigated, as does the impact of any significant difference in their data processing schemes (e.g., how the sound speed profile within the water column is accounted for).
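One possible starting point for such an automated method is phase correlation between the images, sketched below under the assumption that the SAS and MBES images have first been resampled onto a common grid:

```python
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Integer-pixel shift between two images of the same area via phase correlation.

    Assumes both images have already been resampled onto a common grid with the
    same pixel size and orientation. Illustrative starting point only.
    """
    A = np.fft.fft2(img_a)
    B = np.fft.fft2(img_b)
    cross_power = A * np.conj(B)
    cross_power /= np.abs(cross_power) + 1e-12     # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks beyond half the image size correspond to negative shifts.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))
```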

 

Models could be developed to predict the effect of different error sources on the type of analysis presented here. For example, the yaw correction angles (Figure 9) are effectively the same for both port and starboard. This suggests that the two receiver arrays are close to parallel (also indicated by Figure 6(a)), and that the angular offset being detected could be a misalignment in the INS.

 

7 REFERENCES

 

  1. Daniel A. Cook. Synthetic aperture sonar motion estimation and compensation. PhD thesis, Georgia Institute of Technology, 2007.
  2. Andrea Bellettini and Marc A. Pinto. Theoretical accuracy of synthetic aperture sonar micronavigation using a displaced phase-centre antenna. IEEE Journal of Oceanic Engineering, 27(4):780--789, October 2002.
  3. Daniel C. Brown, Isaac D. Gerg, and Thomas E. Blanford. Interpolation kernels for synthetic aperture sonar along-track motion estimation. IEEE Journal of Oceanic Engineering, 45(4):1497--1505, October 2020.
  4. Holger Schmaljohann and Johannes Groen. Motion estimation for synthetic aperture sonars. In EUSAR 2012, 9th European Conference on Synthetic Aperture Radar, pages 78--81, 2012.
  5. Benjamin Thomas. Phase preserving 3D micro-navigation for interferometric synthetic aperture sonar. PhD thesis, University of Bath, 2020.
  6. Torstein Olsmo Sæbø. Seafloor Depth Estimation by means of Interferometric Synthetic Aperture Sonar. PhD thesis, University of Tromsø, 2010.
  7. Martin Kronig. Bathymetry processing for synthetic aperture sonar systems. Master's thesis, Technische Universität Darmstadt, 2014.
  8. Blair Bonnett, Holger Schmaljohann, and Thomas Fickenscher. Resampling of bathymetric data for SAS processing. In Underwater Acoustics Conference and Exhibition (UACE2023), 2023.

 

ACKNOWLEDGEMENTS

 

WTD 71 would like to thank Prof. Greinert at the GEOMAR Helmholtz Centre for Ocean Research Kiel for providing the MBES data used in this paper.