Neural Implicit Representations for 3D Scene Modeling and Visualization Using Multipass SAR

Conference: EUSAR 2024 - 15th European Conference on Synthetic Aperture Radar
23.04.2024-26.04.2024 in Munich, Germany

Proceedings: EUSAR 2024

Pages: 6 | Language: English | Type: PDF

Authors:
Sugavanam, Nithin; Ertin, Emre; Jamora, Jan Rainer

Abstract:
Synthetic aperture radar (SAR) collects samples of the 3D spatial Fourier transform on a two-dimensional surface, corresponding to the backscatter of wideband pulses launched from different look angles along the synthetic aperture. In this paper we present a coordinate-based multi-layer perceptron (MLP) that enforces a smooth-surface prior and models the complex-valued scattering coefficients on the surface of the imaged object. The resulting implicit neural representation can predict phase history data for arbitrary apertures within the span of the training data. The 3D surface is represented by a signed distance function, while the scattering coefficients are represented by real and imaginary channels. Since estimating a smooth surface from a sparse and noisy point cloud is an ill-posed problem, we build on our previous work and regularize the surface estimate by sampling points from the implicit surface representation during training. The loss function also enforces consistency between the predicted and true complex-valued scattering coefficients. We validate the model's ability to represent target scattering using simulated data from the Civilian Vehicle Data Domes.
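To make the representation described above concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of a coordinate-based MLP whose output channels encode a signed distance value together with the real and imaginary parts of the complex scattering coefficient at a 3D query point. The network sizes, initialization, and activation are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """He-initialized weights and zero biases for a fully connected network."""
    return [(rng.normal(0.0, np.sqrt(2.0 / m), (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_forward(params, x):
    """Forward pass: ReLU hidden layers, linear output layer."""
    for W, b in params[:-1]:
        x = np.maximum(x @ W + b, 0.0)
    W, b = params[-1]
    return x @ W + b

# 3 inputs (x, y, z) -> 3 outputs: [sdf, Re(s), Im(s)]
# (hypothetical layer widths; the paper does not specify the architecture here)
params = init_mlp([3, 64, 64, 3], rng)

pts = rng.uniform(-1.0, 1.0, (5, 3))     # query points in the scene volume
out = mlp_forward(params, pts)
sdf = out[:, 0]                          # signed distance to the implicit surface
scattering = out[:, 1] + 1j * out[:, 2]  # complex-valued scattering coefficient
```

Points with `sdf ≈ 0` lie on the modeled surface; during training, such points can be resampled to regularize the surface estimate, and the predicted `scattering` values can be compared against the true coefficients in the loss.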