n5-ij

A Fiji plugin for loading and saving image data to N5 containers. Supports HDF5, Zarr, Amazon S3, and Google Cloud Storage.

Contents

  1. Open HDF5/N5/Zarr/OME-NGFF
    1. Virtual
    2. Cropping
  2. Export HDF5/N5/Zarr/OME-NGFF
  3. Container types
  4. Metadata
  5. For developers
  6. Details
    1. Cloud writing benchmarks

Open HDF5/N5/Zarr/OME-NGFF

Open HDF5/N5/Zarr/OME-NGFF datasets from Fiji with File > Import > HDF5/N5/Zarr/OME-NGFF ... .

Quickly open a dataset by pasting the full path to the dataset and pressing OK. For example, try gs://example_multi-n5_bucket/mitosis.n5/raw to open the sample mitosis image from Google Cloud Storage.

Click the Browse button to select a folder on your filesystem.

<img src=https://raw.githubusercontent.com/saalfeldlab/n5-ij/master/doc/OpenN5DialogWithBrowse.png width="600">

The detected datasets will be displayed in the dialog. Select (highlight) the datasets you would like to open and press OK. In the example below, we will open the datasets /blobs and /t1-head/c0/s0.

<img src=https://raw.githubusercontent.com/saalfeldlab/n5-ij/master/doc/OpenN5DialogWithTree.png width="600">

Virtual

Check the Open as virtual box to open the dataset as a virtual stack in ImageJ. This enables opening and viewing image data that do not fit in RAM. Image slices are loaded on-the-fly, so navigation may be slow while new slices are fetched.

Cropping

Subsets of datasets can be opened by checking the Crop box in the dialog, then pressing OK. A separate dialog will appear for each selected dataset as shown below.

<img src=https://raw.githubusercontent.com/saalfeldlab/n5-ij/master/doc/OpenN5DialogWithCrop.png width="700">

Enter the min and max values of the field-of-view to open, in pixel / voxel units. The opened interval includes both the min and the max, so the image will be of size max - min + 1 along each dimension. In the example shown above, the resulting image will be of size 101 x 111 x 2 x 51.
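The size arithmetic can be sketched as follows (the min/max values below are hypothetical, chosen only to reproduce a 101 x 111 x 2 x 51 result):

```java
// Sketch: size of a cropped interval from inclusive min/max bounds.
// size = max - min + 1 along each dimension.
public class CropSize {

    static long[] intervalSize(long[] min, long[] max) {
        long[] size = new long[min.length];
        for (int d = 0; d < min.length; d++)
            size[d] = max[d] - min[d] + 1;
        return size;
    }

    public static void main(String[] args) {
        // hypothetical bounds: min = (10, 20, 0, 0), max = (110, 130, 1, 50)
        long[] size = intervalSize(new long[]{10, 20, 0, 0}, new long[]{110, 130, 1, 50});
        System.out.println(java.util.Arrays.toString(size)); // [101, 111, 2, 51]
    }
}
```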

Export HDF5/N5/Zarr/OME-NGFF

Save full images opened in Fiji as HDF5/N5/Zarr/OME-NGFF datasets with File > Save As > Export HDF5/N5/Zarr/OME-NGFF ..., and patch images into an existing dataset using File > Save As > Export HDF5/N5/Zarr/OME-NGFF (patch). The patch export writes the current image into an existing dataset at a given offset; neither the offset nor the size of the image needs to align with the block raster of the dataset.

<img src=https://raw.githubusercontent.com/saalfeldlab/n5-ij/master/doc/SaveN5Dialog.png width="280">

Parameters

Container types

If the "Format" option is set to Auto, the export plugin infers the storage format from the given URL. First it checks the URL's scheme; next it checks the directory or file extension. Note that URLs may have two schemes (as in neuroglancer), for example: zarr://s3://my-bucket/my-key
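The inference order can be sketched as below. This is illustrative, not the plugin's actual implementation; the set of recognized schemes and extensions is an assumption:

```java
// Sketch: infer a storage format from a URL that may carry a format scheme
// in front of a storage scheme (neuroglancer-style, e.g. zarr://s3://...).
public class FormatFromUrl {

    static String inferFormat(String url) {
        // 1) check the outermost scheme for an explicit format
        int i = url.indexOf("://");
        if (i >= 0) {
            String scheme = url.substring(0, i);
            if (scheme.equals("zarr") || scheme.equals("n5") || scheme.equals("hdf5"))
                return scheme;
        }
        // 2) otherwise fall back to the directory / file extension
        if (url.endsWith(".zarr")) return "zarr";
        if (url.endsWith(".n5")) return "n5";
        if (url.endsWith(".h5") || url.endsWith(".hdf5")) return "hdf5";
        return "unknown";
    }

    public static void main(String[] args) {
        System.out.println(inferFormat("zarr://s3://my-bucket/my-key"));   // zarr
        System.out.println(inferFormat("s3://my-bucket/my-container.n5")); // n5
    }
}
```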

Backend

Specify the backend by protocol; "file:" or no protocol indicates the local file system.

Multi-scale pyramids

How many scale levels will be created

The number of scale levels is determined by the image size and the specified block size. The exporter will downsample an image only if the block size is strictly smaller than the image size in every dimension.

Example 1

If the image size is 100 x 100 x 100 and the block size is 100 x 100 x 100, the exporter will write one scale level because the whole image fits into one block at the first scale level.

Example 2

If the image size is 100 x 100 x 100 and the block size is 64 x 64 x 64, the exporter will write two scale levels: the first scale level has a 2 x 2 x 2 grid of blocks.

Example 3

If the image size is 100 x 100 x 32 and the block size is 64 x 64 x 64, the exporter will write one scale level because the third dimension of the image is smaller than the third dimension of the block.
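The rule behind these three examples can be sketched as follows. This is a sketch, not the exporter's actual code; in particular, the round-up rounding when halving sizes is an assumption:

```java
// Sketch: count scale levels by repeatedly halving the image size while the
// block size is strictly smaller than the current image size in every dimension.
public class ScaleLevels {

    static int numScaleLevels(long[] imageSize, int[] blockSize) {
        long[] size = imageSize.clone();
        int levels = 1; // s0: the full-resolution level always exists
        while (strictlySmaller(blockSize, size)) {
            for (int d = 0; d < size.length; d++)
                size[d] = (size[d] + 1) / 2; // downsample by 2, rounding up (assumption)
            levels++;
        }
        return levels;
    }

    static boolean strictlySmaller(int[] block, long[] size) {
        for (int d = 0; d < size.length; d++)
            if (block[d] >= size[d]) return false;
        return true;
    }

    public static void main(String[] args) {
        System.out.println(numScaleLevels(new long[]{100, 100, 100}, new int[]{100, 100, 100})); // 1
        System.out.println(numScaleLevels(new long[]{100, 100, 100}, new int[]{64, 64, 64}));    // 2
        System.out.println(numScaleLevels(new long[]{100, 100, 32},  new int[]{64, 64, 64}));    // 1
    }
}
```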

Downsampling

The N5 exporter always downsamples images by factors of two in all dimensions. There are two downsampling methods:

Sample

N5 will take even-indexed samples and discard odd-indexed samples along each dimension.

Averaging

N5 will average adjacent samples along each dimension. This results in a "half-pixel" shift, which will be reflected in the metadata.
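A one-dimensional sketch of the two methods (illustrative, not the exporter's implementation):

```java
// Sketch: the two factor-of-two downsampling methods, in 1D.
public class Downsample {

    // "Sample": keep even-indexed samples, discard odd-indexed ones.
    static double[] sample(double[] in) {
        double[] out = new double[(in.length + 1) / 2];
        for (int i = 0; i < out.length; i++)
            out[i] = in[2 * i];
        return out;
    }

    // "Averaging": average adjacent pairs. Output sample i sits halfway between
    // input samples 2i and 2i+1 -- the half-pixel shift recorded in the metadata.
    static double[] average(double[] in) {
        double[] out = new double[in.length / 2];
        for (int i = 0; i < out.length; i++)
            out[i] = (in[2 * i] + in[2 * i + 1]) / 2.0;
        return out;
    }

    public static void main(String[] args) {
        double[] in = {1, 3, 5, 7};
        System.out.println(java.util.Arrays.toString(sample(in)));  // [1.0, 5.0]
        System.out.println(java.util.Arrays.toString(average(in))); // [2.0, 6.0]
    }
}
```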

Overwriting

Warning messages

If the overwrite option is not selected in the dialog, the exporter will determine if the write operation would overwrite or invalidate existing data. If so, it prompts the user with a warning dialog, asking if data should be overwritten.

Data could be overwritten if an array already exists at the target location, or if the new data would be written inside or on top of an existing dataset, as the examples below illustrate.

If the overwrite option is selected in the initial dialog, the user will not be prompted, but data will be overwritten if needed as explained below.

Example 1

A dataset exists at /image. User attempts to write data into /image. This warns the user about overwriting because an array already exists at that location.

Example 2

A dataset exists at /data/s0. User attempts to write data into /data using N5Viewer metadata. This warns the user about overwriting because when writing N5Viewer metadata, the plugin will write the full resolution level of the multiscale pyramid at location /data/s0, but an array already exists at that location.

Overwriting removes existing data

If the user decides to overwrite data, the N5 exporter will completely remove (both array data and metadata) any groups that cause conflicts before writing the new data.

Example 3

A dataset exists at /image. User attempts to write data into /image/channel0. This warns the user about overwriting because the newly written data would be a child path of existing data, and therefore be invalid. If the user decides to overwrite, all data at /image will be removed before writing the new data to /image/channel0.
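The conflict rule behind these examples can be sketched as a simple path check (a sketch, not the plugin's actual logic):

```java
// Sketch: a write to 'target' conflicts with an existing dataset at 'existing'
// if the two paths are equal or one is nested inside the other.
public class OverwriteCheck {

    static boolean conflicts(String existing, String target) {
        String a = stripTrailingSlashes(existing);
        String b = stripTrailingSlashes(target);
        return a.equals(b) || b.startsWith(a + "/") || a.startsWith(b + "/");
    }

    static String stripTrailingSlashes(String path) {
        return path.replaceAll("/+$", "");
    }

    public static void main(String[] args) {
        System.out.println(conflicts("/image", "/image"));          // true  (example 1)
        System.out.println(conflicts("/image", "/image/channel0")); // true  (example 3)
        System.out.println(conflicts("/image", "/other"));          // false
    }
}
```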

Metadata

This plugin supports the following styles of image metadata:

  1. ImageJ-style metadata
  2. N5-viewer metadata
  3. COSEM metadata
  4. OME-NGFF v0.4 metadata
  5. Custom metadata (see the project documentation for details)

The metadata style for exported HDF5/N5/Zarr/OME-NGFF datasets is customizable; more detail coming soon.

For developers

ImageJ convenience layer for N5

Build into your Fiji installation:

```
mvn -Dscijava.app.directory=/home/saalfelds/packages/Fiji.app -Ddelete.other.versions=true clean install
```

Then, in Fiji's Script Interpreter (Plugins > Scripting > Script Interpreter), load an N5 dataset into an ImagePlus:

```java
import org.janelia.saalfeldlab.n5.*;
import org.janelia.saalfeldlab.n5.ij.*;

imp = N5IJUtils.load(new N5FSReader("/home/saalfelds/example.n5"), "/volumes/raw");
```

or save an ImagePlus into an N5 dataset:

```java
import ij.IJ;
import org.janelia.saalfeldlab.n5.*;
import org.janelia.saalfeldlab.n5.ij.*;

N5IJUtils.save(
    IJ.getImage(),
    new N5FSWriter("/home/saalfelds/example.n5"),
    "/volumes/raw",
    new int[] {128, 128, 64},
    new GzipCompression()
);
```

Save an image stored locally to cloud storage (using four threads):

```java
import ij.IJ;
import ij.ImagePlus;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.janelia.saalfeldlab.n5.*;
import org.janelia.saalfeldlab.n5.ij.*;
import org.janelia.saalfeldlab.n5.universe.N5Factory;

final ImagePlus imp = IJ.openImage( "/path/to/some.tif" );
final ExecutorService exec = Executors.newFixedThreadPool( 4 );
N5IJUtils.save( imp,
    new N5Factory().openWriter( "s3://myBucket/myContainer.n5" ),
    "/myDataset",
    new int[]{64, 64, 64},
    new GzipCompression(),
    exec );
exec.shutdown();
```

See also the example scripts in the repository demonstrating the use of these utilities.

Details

Cloud writing benchmarks

Below are benchmarks for writing images of various sizes, with various block sizes, and with increasing amounts of parallelism.

Amazon S3

Time in seconds to write the image data. Increased parallelism speeds up writing substantially when the total number of blocks is high.

| Image size | Block size | 1 thread | 2 threads | 4 threads | 8 threads | 16 threads |
| --- | --- | --- | --- | --- | --- | --- |
| 64x64x64 | 32x32x32 | 0.98 | 0.60 | 0.45 | 0.50 | 0.51 |
| 128x128x128 | 32x32x32 | 4.72 | 2.64 | 1.62 | 1.00 | |
| 256x256x256 | 32x32x32 | 37.09 | 19.11 | 9.09 | 5.20 | 3.2 |
| 256x256x256 | 64x64x64 | 10.56 | 5.04 | 3.23 | 2.17 | 1.86 |
| 512x512x512 | 32x32x32 | 279.28 | 156.89 | 74.72 | 37.15 | 19.77 |
| 512x512x512 | 64x64x64 | 76.63 | 38.16 | 19.86 | 10.16 | 6.14 |
| 512x512x512 | 128x128x128 | 27.16 | 14.32 | 8.01 | 4.70 | 3.31 |
| 1024x1024x1024 | 32x32x32 | 2014.73 | 980.66 | 483.04 | 249.83 | 122.36 |
| 1024x1024x1024 | 64x64x64 | 579.46 | 289.53 | 149.98 | 75.85 | 38.18 |
| 1024x1024x1024 | 128x128x128 | 203.47 | 107.23 | 55.11 | 27.41 | 15.33 |