<div align="center"> <img src="artwork/trackastra_logo.png" alt="Optimus Prime" style="width:25%;"/> </div>

# Trackastra - Tracking by Association with Transformers

Trackastra is a cell tracking approach that links already segmented cells in a microscopy timelapse by predicting associations with a transformer model that was trained on a diverse set of microscopy videos.

## Overview

If you are using this code in your research, please cite our paper:

Benjamin Gallusser and Martin Weigert<br>Trackastra - Transformer-based cell tracking for live-cell microscopy<br> European Conference on Computer Vision, 2024

## Examples

| Nuclei tracking | Bacteria tracking |
| :-: | :-: |
| <video src='https://github.com/weigertlab/trackastra/assets/8866751/807a8545-2f65-4697-a175-89b90dfdc435' width=180></video> | <video src='https://github.com/weigertlab/trackastra/assets/8866751/e7426d34-4407-4acb-ad79-fae3bc7ee6f9' width=180></video> |

## Installation

This repository contains the Python implementation of Trackastra.

Please first set up a Python environment (with Python version 3.10 or higher), preferably via conda or mamba.

Trackastra can then be installed from PyPI using pip:

```shell
pip install trackastra
```

For tracking with an integer linear program (ILP), which is optional, install instead via:

```shell
conda create --name trackastra python=3.10 --no-default-packages
conda activate trackastra
conda install -c conda-forge -c gurobi -c funkelab ilpy
pip install "trackastra[ilp]"
```

## Usage

The input to Trackastra is a sequence of images and their corresponding cell (instance) segmentations.
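For illustration, such an image/segmentation pair can be sketched with toy numpy arrays. The shapes, dtypes, and label values below are assumptions made for this sketch (a small 2D timelapse), not requirements stated by Trackastra:

```python
import numpy as np

# Toy 2D timelapse: 5 frames of 64x64 intensity images (illustrative shapes only)
imgs = np.random.default_rng(0).random((5, 64, 64), dtype=np.float32)

# Matching instance segmentations: integer label images of the same shape,
# where 0 is background and each positive integer labels one cell instance
masks = np.zeros((5, 64, 64), dtype=np.int32)
masks[:, 10:20, 10:20] = 1  # cell 1, present in every frame
masks[:, 30:40, 30:42] = 2  # cell 2, present in every frame

# Images and masks must line up frame by frame
assert imgs.shape == masks.shape
```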

### Napari plugin

For a quick try of Trackastra on your data, please use our napari plugin, which comes with pretrained models included.


### Tracking with a pretrained model

The available pretrained models are described in detail here.

Consider the following Python example script for tracking already segmented cells. All you need are the following two numpy arrays:

- `imgs`: the raw timelapse images, and
- `masks`: the corresponding instance segmentations, with one integer label per cell.

The predicted associations can then be used for linking with several modes:

- `greedy`: greedy linking, including cell divisions,
- `greedy_nodiv`: greedy linking without divisions,
- `ilp`: linking with an integer linear program (requires the optional `ilp` install described above).

Apart from that, there are no hyperparameters to choose :)
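To build intuition for the greedy mode, here is a minimal sketch of greedy one-to-one linking on a toy frame-to-frame association matrix. The matrix, threshold, and helper function are illustrative assumptions for this sketch, not Trackastra's internals:

```python
import numpy as np

def greedy_link(assoc, threshold=0.5):
    """Greedily pick the highest-scoring (row, col) pairs above a threshold,
    using each detection at most once. Illustrative only."""
    assoc = assoc.copy()
    links = []
    while True:
        i, j = np.unravel_index(np.argmax(assoc), assoc.shape)
        if assoc[i, j] < threshold:
            break
        links.append((i, j))
        assoc[i, :] = -np.inf  # detection i in frame t is now used
        assoc[:, j] = -np.inf  # detection j in frame t+1 is now used
    return links

# Association scores between 3 cells in frame t (rows)
# and 3 cells in frame t+1 (columns)
assoc = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.8, 0.3],
    [0.0, 0.4, 0.7],
])
print(sorted(greedy_link(assoc)))  # [(0, 0), (1, 1), (2, 2)]
```

An ILP instead optimizes all links jointly under the same one-to-one constraints, which can recover better solutions when greedy choices conflict.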

```python
import torch
from trackastra.model import Trackastra
from trackastra.tracking import graph_to_ctc, graph_to_napari_tracks
from trackastra.data import example_data_bacteria

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load some test data images and masks
imgs, masks = example_data_bacteria()

# Load a pretrained model
model = Trackastra.from_pretrained("general_2d", device=device)

# or from a local folder
# model = Trackastra.from_folder('path/my_model_folder/', device=device)

# Track the cells
track_graph = model.track(imgs, masks, mode="greedy")  # or mode="ilp", or "greedy_nodiv"

# Write to cell tracking challenge format
ctc_tracks, masks_tracked = graph_to_ctc(
    track_graph,
    masks,
    outdir="tracked",
)
```

You can then visualize the tracks with napari:

```python
import napari

# Visualise in napari
napari_tracks, napari_tracks_graph, _ = graph_to_napari_tracks(track_graph)

v = napari.Viewer()
v.add_image(imgs)
v.add_labels(masks_tracked)
v.add_tracks(data=napari_tracks, graph=napari_tracks_graph)
```
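As background, napari's tracks layer expects `data` as rows of `(track_id, t, (z,) y, x)` and `graph` as a dict mapping a track ID to a list of its parent track IDs. A minimal hand-built example (the coordinates and IDs here are made up for illustration):

```python
import numpy as np

# Two daughter tracks (2 and 3) splitting off parent track 1 at t=2
napari_tracks = np.array([
    # track_id, t,    y,    x
    [1, 0, 10.0, 10.0],
    [1, 1, 11.0, 10.5],
    [2, 2, 12.0,  9.0],
    [3, 2, 12.0, 12.0],
])
napari_tracks_graph = {2: [1], 3: [1]}  # daughters point to their parent track

assert napari_tracks.shape[1] == 4  # 2D data: track_id, t, y, x
```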

## Training a model on your own data

To run an example training, run:

```shell
python train.py --config example_config.yaml
```

Generally, training data needs to be provided in the Cell Tracking Challenge (CTC) format, i.e., annotations are located in a folder containing one or several subfolders named `TRA`, with masks and tracklet information.
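For reference, the tracklet information in the CTC format is a plain text file (e.g. `man_track.txt` inside `TRA`) with one line per tracklet: `label begin end parent`, where `parent` is 0 for tracklets without a parent. A small parser sketch (the helper name and example values are ours):

```python
def parse_ctc_tracks(text):
    """Parse CTC tracklet lines of the form 'label begin end parent'."""
    tracks = {}
    for line in text.strip().splitlines():
        label, begin, end, parent = map(int, line.split())
        tracks[label] = {"begin": begin, "end": end, "parent": parent}
    return tracks

# Tracklet 1 spans frames 0-3, then divides into tracklets 2 and 3
example = """\
1 0 3 0
2 4 6 1
3 4 6 1
"""
tracks = parse_ctc_tracks(example)
print(tracks[2])  # {'begin': 4, 'end': 6, 'parent': 1}
```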