Anonymous user, 2024-07-31
Category: AI / PyTorch
Open-source address: https://modelscope.cn/models/AI-ModelScope/MotionCtrl
License: apache-2.0

Details

MotionCtrl Model Card

[**Project Page**](https://wzhouxiff.github.io/projects/MotionCtrl/) **|** [**Paper (ArXiv)**](https://arxiv.org/pdf/2312.03641.pdf) **|** [**Code**](https://github.com/TencentARC/MotionCtrl) **|** [**Gradio demo (MotionCtrl+VideoCrafter)**](https://huggingface.co/spaces/TencentARC/MotionCtrl) **|** [**Gradio demo (MotionCtrl+SVD)**](https://huggingface.co/spaces/TencentARC/MotionCtrl_SVD)

Introduction

MotionCtrl independently controls both camera motion and object motion in a generated video: camera movement is specified by a sequence of camera poses, and object movement by one or more trajectories. It can be deployed on text-to-video generation models, such as LVDM/VideoCrafter and AnimateDiff, as well as on image-to-video generation models such as SVD.

MotionCtrl + SVD

MotionCtrl + LVDM/VideoCrafter

MotionCtrl + AnimateDiff

  • Model: Coming Soon
  • Results:

Usage

  • Download the checkpoints directly from this repository, or fetch them with the Python script below:

```python
from huggingface_hub import hf_hub_download

motionctrl_lvdm_ckpt = hf_hub_download(repo_id="TencentARC/MotionCtrl", filename="motionctrl.ckpt", repo_type="model")
motionctrl_svd_ckpt = hf_hub_download(repo_id="TencentARC/MotionCtrl", filename="motionctrl_svd.ckpt", repo_type="model")
```

  • Then generate controllable videos by following the instructions in our GitHub repository.
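Once downloaded, a checkpoint can be sanity-checked with plain PyTorch before wiring it into the generation pipeline. The helper below is an illustrative sketch, not part of the MotionCtrl codebase; the fallback to a nested `state_dict` key is an assumption common to Lightning-style checkpoints, and `inspect_checkpoint` is a hypothetical name.

```python
import torch

def inspect_checkpoint(path: str, limit: int = 5):
    """Load a checkpoint on CPU and return the names of its first few tensors."""
    # map_location="cpu" avoids needing a GPU just to inspect the file.
    state = torch.load(path, map_location="cpu")
    # Training frameworks often nest the weights under a "state_dict" key;
    # fall back to the top-level mapping when that key is absent.
    sd = state.get("state_dict", state) if isinstance(state, dict) else state
    return list(sd.keys())[:limit]
```

For example, `inspect_checkpoint(motionctrl_lvdm_ckpt)` lists the first few parameter names, which is a quick way to confirm the download is intact and to see which submodules the checkpoint covers.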

Citation

```bibtex
@inproceedings{wang2023motionctrl,
  title={MotionCtrl: A Unified and Flexible Motion Controller for Video Generation},
  author={Wang, Zhouxia and Yuan, Ziyang and Wang, Xintao and Chen, Tianshui and Xia, Menghan and Luo, Ping and Shan, Ying},
  booktitle={arXiv preprint arXiv:2312.03641},
  year={2023}
}
```