DART: Doppler-Aided Radar Tomography

Implementation of DART: Implicit Doppler Tomography for Radar Novel View Synthesis

DART method overview.

Setup

  1. Ensure that you have Python (>=3.8), CUDA (>=11.8), and cuDNN installed.

  2. Install jax. Note that you will need to manually install the GPU build of jax to match your CUDA version:

    pip install --upgrade "jax[cuda11_local]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
    

    for CUDA 11.x.

    NOTE: jax is not included in requirements.txt since its installation depends on your CUDA version and must be done manually; a quick verification sketch follows these setup steps.

  3. Install libhdf5:

    sudo apt-get -y install libhdf5-dev
    
  4. Install python dependencies:

    pip install -r requirements.txt
    
    • Use Python 3.11, CUDA 11.8, jax 0.4.10, and pip install -r requirements-pinned.txt to get the exact dependency versions that we used.
  5. Prepare datasets.

    • If downloading our datasets, you will need to create a formatted dataset (data.h5) for training:
      python manage.py dataset -p data/path/to/dataset
      
    • The dataset format is also documented on the datasets page.
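
Once these steps are complete, a quick sanity check can confirm that jax sees the GPU and that the formatted data.h5 was written. The snippet below is only an illustrative sketch (it is not part of the repository); it lists whatever arrays the file contains rather than assuming specific names, since the exact layout is documented on the datasets page. Adjust the path to match your dataset.

    # sanity_check.py -- illustrative sketch, not a repository script
    import jax
    import h5py

    # The GPU build of jax should report at least one CUDA device here.
    print("jax devices:", jax.devices())

    # Walk the formatted dataset and print every array with its shape and dtype.
    # Use the same path you passed to `python manage.py dataset -p ...`.
    with h5py.File("data/path/to/dataset/data.h5", "r") as f:
        f.visititems(
            lambda name, obj: print(name, obj.shape, obj.dtype)
            if isinstance(obj, h5py.Dataset) else None)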

Usage

TL;DR:

TARGET=output DATASET=lab-1 make experiment

With arguments:

TARGET=output METHOD=ngp DATASET=lab-1 FLAGS="--epochs 5" make experiment

This creates the following files in results/output:

results/
    output/
        metadata.json       # Model/dataset/training metadata
        model.chkpt         # Model weights checkpoint
        pred.h5             # Predicted range-doppler images
        cam.h5              # Virtual camera renderings for the trajectory
        map.h5              # Map of the scene sampled at 25 units/meter
        output.video.mp4    # Output camera + radar video
        output.map.mp4      # Video where each frame is a horizontal slice
    ...
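
The HDF5 outputs can be explored directly. The sketch below is illustrative only: the dataset names inside map.h5 depend on DART's export format, so it simply grabs the first 3-D array it finds and assumes the last axis is height. It renders one horizontal slice of the reconstructed map, similar to a single frame of output.map.mp4.

    # map_slice.py -- illustrative sketch, not a repository script
    import h5py
    import numpy as np
    import matplotlib.pyplot as plt

    with h5py.File("results/output/map.h5", "r") as f:
        # Collect the names of all 3-D datasets; the actual names depend on the export format.
        names = []
        f.visititems(
            lambda k, v: names.append(k)
            if isinstance(v, h5py.Dataset) and v.ndim == 3 else None)
        if not names:
            raise SystemExit("no 3-D datasets found in map.h5")
        vol = np.asarray(f[names[0]])  # take the first volume found

    mid = vol.shape[-1] // 2           # assume the last axis is height; adjust if needed
    plt.imshow(vol[..., mid], cmap="viridis")
    plt.title(f"map.h5:{names[0]}, slice {mid}")
    plt.savefig("results/output/map_slice.png", dpi=150)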

Multiple models on the same trajectory can also be combined into a single output video:

python manage.py video -p results/output results/output2 ... -f 30 -s 512 -o results/video.mp4

Available Commands

Run each command/subcommand with -h for more details.

train.py: train model; each subcommand is a different model class.

manage.py: evaluation and visualization tools: