# MVOC: a training-free multiple video object composition method with diffusion models
<img src="assets/logo.png" />This repo is an official pytorch implementation of the paper "MVOC: a training-free multiple video object composition method with diffusion models"
## Introduction
MVOC is a training-free multiple video object composition framework aimed at achieving visually harmonious and temporally consistent results.
<img src="https://sobeymil.github.io/mvoc.com/mvoc_intro.png" />Given multiple video objects (e.g. Background, Object1, Object2), our method enables presenting the interaction effects between multiple video objects and maintaining the motion and identity consistency of each object in the composited video.
## ▶️ Quick Start for MVOC
### Environment
Clone this repo and prepare the Conda environment using the following commands:

```bash
git clone https://github.com/SobeyMIL/MVOC
cd MVOC
conda env create -f environment.yml
```
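After the environment is created, activate it. A minimal sketch, assuming the environment defined in `environment.yml` is named `mvoc` (check the file's `name:` field if activation fails):

```bash
# Activate the Conda environment created from environment.yml.
# The name "mvoc" is an assumption; use the name: field from the file if it differs.
conda activate mvoc
```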
### Pretrained model
We use i2vgen-xl to invert the videos and compose them in a training-free manner. Download it from Hugging Face and put it at `i2vgen-xl/checkpoints`.
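For example, the weights can be fetched with `huggingface-cli`. A minimal sketch, assuming the checkpoint is hosted under the `ali-vilab/i2vgen-xl` repo id on Hugging Face (adjust the id if the release you need differs):

```bash
# Download the i2vgen-xl weights into the folder the scripts expect.
# The repo id ali-vilab/i2vgen-xl is an assumption; change it if needed.
huggingface-cli download ali-vilab/i2vgen-xl --local-dir i2vgen-xl/checkpoints
```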
### Video Composition
We provide the videos used in the paper; you can find them in `demo`.

First, you need to get the latent representations of the source videos. We provide the inversion config file at `i2vgen-xl/configs/group_inversion/group_config.json`.
Then you can run the following commands:

```bash
cd i2vgen-xl/scripts
bash run_group_ddim_inversion.sh
bash run_group_composition.sh
```
## Results
We provide some composition results from this repo below.
| Demo | Collage | Our result |
| --- | --- | --- |
| boat_surf | | |
| crane_seal | | |
| duck_crane | | |
| monkey_swan | | |
| rider_deer | | |
| robot_cat | | |
| seal_bird | | |
## 🖊️ Citation
Please cite our paper if you use our code, data, models, or results:
```bibtex
@inproceedings{wang2024mvoc,
  title     = {MVOC: a training-free multiple video object composition method with diffusion models},
  author    = {Wei Wang and Yaosen Chen and Yuegen Liu and Qi Yuan and Shubin Yang and Yanru Zhang},
  year      = {2024},
  booktitle = {arXiv}
}
```
## 🎫 License
This project is released under the MIT License.
## 💞 Acknowledgements
The code is built upon the repositories below; we thank all the contributors for open-sourcing.