Home

Awesome

UnityVolumeRendering

A volume renderer, made in Unity3D. I have written a tutorial explaining the basic implementation. Have any questions? Create an issue or contact me on Mastodon or WeChat: mati31415.

I also have a tutorial video that shows how to use the project.

NEWS: This plugin has now been ported to Godot Engine, thanks to Riccardo Lops: https://github.com/riccardolops/GodotVolumetricRendering


Thanks to JetBrains for providing an open source license for Rider - the best C# IDE :)

Documentation
See full documentation here

Table of contents

This Readme contains a quick introduction to the library. For more info, see the complete documentation.

Requirements

How to use sample scene

Step-by-step instructions

1. Import model

Raw datasets:

In the menu bar, click "Volume Rendering" and "Load raw dataset"

<img src="Screenshots/menubar2.png" width="200px">

Then select the dataset you wish to import.

In the next menu you can optionally adjust the import settings for the raw dataset. For the sample files you don't need to change anything.

<img src="Screenshots/import.png" width="200px">

DICOM:

To import a DICOM dataset, click "Volume Rendering" and "Load DICOM", then select the folder containing your DICOM files. The dataset must be 3D and consist of multiple files, each representing a slice along the Z axis.

2. Moving the model

You can move the model like any other GameObject. Simply select it in the scene view or scene hierarchy, and move/rotate it like normal.

<img src="Screenshots/movement.gif" width="400px">

3. Changing the visualisation

Select the model and find the "Volume Render Object" in the inspector.

Here you can change the "Render mode":

<img src="Screenshots/rendermode.png" width="200px">

Example:

<img src="Screenshots/rendermodes.gif" width="500px">

There are 3 render modes:

There are also some other settings that you can adjust:

<img src="Screenshots/volume-inspector-settings.jpg" width="300px">

Direct Volume Rendering

Direct volume rendering is the standard rendering mode. It casts rays through the dataset and uses "transfer functions" (1D or 2D) to determine colour and opacity. A transfer function maps density (and, for 2D transfer functions, also gradient magnitude) to a colour and opacity.
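Conceptually, a 1D transfer function is a lookup from density to colour and opacity, defined by a set of user-placed control points. The sketch below is illustrative only and is not this project's actual API (the real evaluation happens in a shader, typically via a baked lookup texture):

```csharp
// Illustrative only: a minimal 1D transfer function lookup.
struct ControlPoint
{
    public float density;    // normalised density in [0, 1]
    public float r, g, b, a; // colour and opacity at this density
}

static class TransferFunction1D
{
    // Linearly interpolate colour/opacity between the two control points
    // surrounding the given density. Points must be sorted by density.
    public static (float r, float g, float b, float a) Evaluate(
        ControlPoint[] points, float density)
    {
        if (density <= points[0].density)
        {
            var p = points[0];
            return (p.r, p.g, p.b, p.a);
        }
        for (int i = 1; i < points.Length; i++)
        {
            if (density <= points[i].density)
            {
                var lo = points[i - 1];
                var hi = points[i];
                float t = (density - lo.density) / (hi.density - lo.density);
                return (lo.r + t * (hi.r - lo.r), lo.g + t * (hi.g - lo.g),
                        lo.b + t * (hi.b - lo.b), lo.a + t * (hi.a - lo.a));
            }
        }
        var last = points[points.Length - 1];
        return (last.r, last.g, last.b, last.a);
    }
}
```

A 2D transfer function works the same way, except the lookup is indexed by both density and gradient magnitude.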

Isosurface Rendering

Isosurface rendering renders the first sample along the ray whose density exceeds a threshold. You can set this threshold yourself by selecting the object and changing the "Visible value range" in the inspector. The visible value range can also be used with the direct volume rendering mode.

<img src="Screenshots/isosurface.gif" width="500px">
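The idea behind isosurface rendering can be sketched as a simple raymarch loop. This is conceptual C#, not the project's code (the real implementation is a fragment shader, and the dataset sampler here is a stand-in):

```csharp
// Conceptual sketch of isosurface raymarching.
// Steps along the ray and stops at the first sample above the threshold.
static float? FindIsosurface(
    Func<float, float, float, float> sampleDensity, // hypothetical dataset sampler
    (float x, float y, float z) origin,
    (float x, float y, float z) dir,                // normalised ray direction
    float threshold, float stepSize, int maxSteps)
{
    for (int i = 0; i < maxSteps; i++)
    {
        float t = i * stepSize;
        float d = sampleDensity(origin.x + dir.x * t,
                                origin.y + dir.y * t,
                                origin.z + dir.z * t);
        if (d > threshold)
            return t; // distance to the first sample above the threshold
    }
    return null; // ray left the volume without hitting the isosurface
}
```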

Importing DICOM and NRRD

If you're on Windows or Linux, I recommend enabling the SimpleITK importer, which is required for JPEG2000-compressed DICOM and for NRRD.

How to use in your own project

See the importer documentation for more detailed information.
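Based on the importer documentation, loading a raw dataset from a script looks roughly like the sketch below. Treat the class and parameter names as assumptions that may differ between versions of the plugin, and the file path and dimensions as placeholders you must replace with values matching your dataset:

```csharp
using UnityVolumeRendering;

public class VolumeLoader : UnityEngine.MonoBehaviour
{
    private void Start()
    {
        // Dimensions, content format, endianness and header offset must
        // match your dataset; the values below are placeholders.
        RawDatasetImporter importer = new RawDatasetImporter(
            "DataFiles/VisMale.raw", 128, 256, 256,
            DataContentFormat.Uint8, Endianness.LittleEndian, 0);
        VolumeDataset dataset = importer.Import();

        // Spawns a volume rendered object in the scene for the dataset.
        VolumeObjectFactory.CreateObject(dataset);
    }
}
```

Check the importer documentation for your release before relying on these exact signatures.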

FAQ (Frequently Asked Questions)

How to preserve real world scale of my datasets?

Imported datasets are automatically normalised, to make sure that datasets with missing or incorrect scale-unit information don't become too large or too small. You can undo this simply by setting the scale of the outer GameObject (the one containing the VolumeRenderedObject component) to (1, 1, 1).
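For example, resetting the scale from a script could look like this (a small sketch; it assumes the `VolumeRenderedObject` component name from this project and that the component sits on the outer GameObject):

```csharp
using UnityEngine;
using UnityVolumeRendering;

public class ResetVolumeScale : MonoBehaviour
{
    private void Start()
    {
        VolumeRenderedObject volObj = FindObjectOfType<VolumeRenderedObject>();
        // Reset the outer object's scale so the dataset's real-world
        // dimensions (from the file's spacing metadata) are preserved.
        volObj.transform.localScale = Vector3.one;
    }
}
```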

Does this work in VR?

Yes, however you will need to change "stereo rendering mode" to "multi pass" in the XR settings in Unity. See #71.

What about VR performance?

Since VR requires rendering the scene once per eye each frame, you can expect worse performance. However, there are a few ways to improve the framerate.

Your bottleneck will most likely be the pixel/fragment shader (where we do raymarching), so it might be possible to get better performance by enabling DLSS. This requires HDRP, which this project currently does not officially support (but it might still work fine).

Also, some users have reported significantly lower performance with OpenXR compared to OpenVR. It may be worth trying both.

Can I use WebGL?

Yes! But keep in mind that memory will be limited, so you might not be able to load very large datasets.

I recommend that you enable ALLOW_MEMORY_GROWTH. See #125 for more info.

Also, since WebGL builds do not have access to your local filesystem, you will not be able to upload files directly (using the runtime GUI in the sample scene, etc.). You can either:

Is this project free to use?

Yes, it's free even for commercial projects. The license (MIT) only requires attribution and a copyright/license notice.

How can I make it look better?

How can I get better rendering performance with lighting enabled?

If you're on a platform that supports it (Windows, etc.), try enabling DLSS (HDRP) or FidelityFX Super Resolution (URP) and reduce the render scale.

How can I raycast the scene to find an intersection?

I'm stuck! How can I get help?

Create an issue. You can also reach me on the fediverse.

Contributing

See CONTRIBUTING.md for how to contribute.

Thanks to everyone who has contributed so far.

See ACKNOWLEDGEMENTS.txt for libraries used by this project.

Rider IDE license kindly provided by JetBrains.

<img src="https://resources.jetbrains.com/storage/products/company/brand/logos/jb_beam.png" width="200px">