
Beyond Ultra-NeRF: Explainable Neural Fields for Ultrasound

MCML Authors

Abstract

Current ultrasound image synthesis techniques often fall short in semantic accuracy and physical realism, or produce images with a significant domain gap. Ultra-NeRF addresses these issues by learning a neural field of acoustic properties from pose-annotated B-mode images and shows that it can be used for novel view synthesis of B-mode images. While Ultra-NeRF generates plausible results, it lacks explainability in the acoustic parameter space. In this paper, we revisit neural fields for ultrasound and introduce the Sonographic Neural Reflection Field (SuRF), which adheres to the physical properties of acoustic ultrasound. By redesigning Ultra-NeRF’s differentiable forward synthesis model and incorporating physics-inspired regularizations, we ensure the interpretability of the learned acoustic parameters. Evaluated on the Ultra-NeRF in-silico dataset and a new multi-view ex-vivo 3D ultrasound dataset, our method demonstrates improved reconstruction and interpretation across various tissue types, including fat, muscle, and bone.

inproceedings


NeuralBCC @ECCV 2024

1st Workshop on Neural Fields Beyond Conventional Cameras at the 18th European Conference on Computer Vision. Milano, Italy, Sep 29-Oct 04, 2024.

Authors

M. Wysocki, M. F. Azampour, F. Tristram, B. Busam, N. Navab

Links

URL

Research Areas

 B1 | Computer Vision

 C1 | Medicine

BibTeX Key: WAT+24
