Real-Time Avatar Pose Transfer and Motion Generation Using Locally Encoded Laplacian Offsets
We propose a human avatar representation scheme based on intrinsic coordinates, which are invariant to isometry and insensitive to changes in human pose, together with an efficient pose transfer algorithm that uses this representation to reconstruct a human body geometry in a given pose. Such a pose transfer algorithm can drive the movement of an avatar model in virtual reality environments, following a user's motion in real time. Our proposed algorithm consists of three main steps. First, we recognize the user's pose and select from the database a template model whose pose is similar; then, the intrinsic Laplacian offsets encoded in local coordinates are used to reconstruct the human body geometry in the template pose; finally, the morphing between the two poses is generated by linear interpolation. We perform experiments to evaluate the accuracy and efficiency of our algorithm. We believe our proposed system is a promising human modeling tool for general virtual reality applications.
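To make the reconstruction step concrete, the following is a minimal sketch of classical Laplacian-offset encoding and least-squares reconstruction with positional anchors. It uses a uniform graph Laplacian rather than the locally encoded, rotation-aware offsets described in the paper, so it illustrates only the basic idea; all function names and the anchor weight are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def uniform_laplacian(n_vertices, edges):
    """Build the uniform graph Laplacian L = D - A of a mesh.
    Note: a simplification; the paper encodes offsets in local frames."""
    L = np.zeros((n_vertices, n_vertices))
    for i, j in edges:
        L[i, j] -= 1.0
        L[j, i] -= 1.0
        L[i, i] += 1.0
        L[j, j] += 1.0
    return L

def encode_offsets(V, L):
    """Laplacian (differential) offsets of vertex positions V: delta = L @ V."""
    return L @ V

def reconstruct(L, delta, anchor_ids, anchor_pos, w=10.0):
    """Recover vertex positions from offsets by solving, in the least-squares
    sense, L V = delta subject to soft positional constraints at the anchors
    (anchors fix the translational null space of L)."""
    n = L.shape[0]
    anchor_rows = [w * np.eye(n)[i:i + 1] for i in anchor_ids]
    A = np.vstack([L] + anchor_rows)
    b = np.vstack([delta] + [w * np.asarray(p)[None, :] for p in anchor_pos])
    V, *_ = np.linalg.lstsq(A, b, rcond=None)
    return V
```

For pose transfer, the offsets would be decoded against the template pose's local frames before solving; the final morphing step of the paper then amounts to linearly interpolating between the source and reconstructed vertex positions.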
Journal of Computer Science and Technology
Lifkooee, M., Liu, C., Liang, Y., Zhu, Y., & Li, X. (2019). Real-Time Avatar Pose Transfer and Motion Generation Using Locally Encoded Laplacian Offsets. Journal of Computer Science and Technology, 34(2), 256–271. https://doi.org/10.1007/s11390-019-1909-9