<p align="center"> <img src="./Documentation/Images/banner.jpg"> </p>

Synthetic Homes

Unity Synthetic Homes is a dataset generator and accompanying large-scale dataset of photorealistic randomized home interiors, built for training computer vision models on tasks such as object detection, scene understanding, and monocular depth estimation.

The application performs a wide variety of randomizations to maximize the diversity of generated datasets. These include materials, furniture type and configuration, sunlight angle and temperature, day/night switching, interior lighting temperature, camera angles, clutter, skybox, door and curtain animations, and more. By providing a configuration file, you can customize many of these elements, tuning them to your liking.

This application was made using the Unity Perception package, which provides tools for generating randomized synthetic CV datasets with a wide variety of ground-truth annotations.

Interior lighting in homes is complex and difficult to replicate with traditional raster-based methods. We used Unity’s multi-bounce path tracing to achieve physically accurate global illumination and reflections. This accuracy can help bridge the so-called “Sim2Real gap”, improving a model’s ability to perform well in the real world after training on synthetic data.

Included Label Types

Analyzing and Visualizing Datasets

Datasets are generated in the SOLO format. For information on how to explore and analyze SOLO datasets, check out the endpoint and schema documentation pages on the Perception repository.
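As a minimal sketch of what browsing a generated SOLO dataset can look like, the Python snippet below walks the output folder, counts frames, and tallies the annotation types it finds. The `*.frame_data.json` file pattern and the `captures`/`annotations`/`@type` keys are assumptions based on the SOLO schema; confirm them against the schema documentation linked above.

```python
import json
from collections import Counter
from pathlib import Path

# Point this at a generated dataset, e.g. the default output location
# mentioned below (%USERPROFILE%\AppData\LocalLow\UnityTechnologies\SyntheticHomes).
dataset_root = Path.home() / "AppData/LocalLow/UnityTechnologies/SyntheticHomes"

frame_count = 0
annotation_types = Counter()

# Assumption: each captured frame is described by a *.frame_data.json file,
# whose "captures" entries each carry a list of "annotations" with an "@type".
for frame_file in sorted(dataset_root.rglob("*.frame_data.json")):
    frame = json.loads(frame_file.read_text())
    frame_count += 1
    for capture in frame.get("captures", []):
        for annotation in capture.get("annotations", []):
            annotation_types[annotation.get("@type", "unknown")] += 1

print(f"Frames found: {frame_count}")
for annotation_type, count in annotation_types.most_common():
    print(f"{count:6d}  {annotation_type}")
```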

Dataset Generator

Head over to the Releases page and download the latest build. Once the archive is extracted, you can simply double-click SyntheticHomes.exe to run the application with its default settings. A window will open, and frames will start to be randomized and rendered. Each final frame takes a while to render because multiple frames are accumulated to achieve high-quality path-traced results.

By default, the generated dataset will be located at %USERPROFILE%\AppData\LocalLow\UnityTechnologies\SyntheticHomes. You can browse through the dataset while the generator is running.

Alternatively, you can supply command line arguments and an optional JSON configuration file to modify various settings of your run.
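For instance, a config-driven run from a Windows command prompt might look like the line below. The `--config` argument name and the config file name are assumptions made here purely for illustration; check the Command Line Arguments section for the actual argument names.

```
:: Hypothetical invocation -- argument and file names may differ; see Command Line Arguments below.
SyntheticHomes.exe --config SyntheticHomesConfig.json
```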

System Requirements

Command Line Arguments

Configuration File

Several aspects of the dataset generation can be controlled using a JSON config file that is provided to the application. A sample config file is provided here.

The configuration file includes three main sections: (a) constants, (b) sensors, and (c) randomizers.

In the constants block, you specify:
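For illustration, a constants block might contain fields like the ones below. These field names are assumptions based on Unity Perception's scenario constants, not values copied from this project, so treat the sample config file as the authoritative reference.

```json
"constants": {
  "totalIterations": 1000,
  "instanceCount": 1,
  "instanceIndex": 0,
  "randomSeed": 539662031
}
```

In this sketch, totalIterations would control how many iterations (frames) are generated in a run, while randomSeed would make runs reproducible.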

In the sensors block, you can enable or disable sensors and their Labelers. This project contains a single sensor of type PerceptionCamera, so there is no reason to disable it. However, you can reduce the size of the output dataset by disabling Labelers you do not need: set the enabled field of the corresponding Labeler in the labelers array of the PerceptionCamera to false. If no JSON config is provided, all Labelers are active.
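For example, a sensors block that keeps 2D bounding boxes but turns off semantic segmentation might look roughly like the snippet below. The exact key and Labeler names are illustrative assumptions; copy them from the sample config rather than from this sketch.

```json
"sensors": [
  {
    "name": "PerceptionCamera",
    "labelers": [
      { "name": "BoundingBox2DLabeler", "enabled": true },
      { "name": "SemanticSegmentationLabeler", "enabled": false }
    ]
  }
]
```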

In the randomizers block of the JSON config file, you control the behavior of the Perception Randomizers included in the project. Each Randomizer has one JSON block in this file: randomizerId denotes the name of the Randomizer, and items is the list of settings that can be changed. To change the behavior of an item, modify the value block nested inside it. In addition, some Randomizers can be completely disabled. We will now go through all the available Randomizers and their settings:
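For orientation, a single Randomizer entry has roughly the shape sketched below. The Randomizer name, item name, and value contents here are placeholders rather than entries copied from the project, so use the sample config for the real ones.

```json
{
  "randomizerId": "SunAngleRandomizer",
  "items": [
    {
      "name": "hour",
      "value": { "min": 6, "max": 20 }
    }
  ]
}
```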

License

Citation

If you find SynthHomes useful, consider citing it using:

@misc{SyntHomes,
    title={Unity SynthHomes: A Synthetic Home Interior Dataset Generator},
    author={{Unity Technologies}},
    howpublished={\url{https://github.com/Unity-Technologies/SynthHomes}},
    year={2022}
}