TY - GEN
T1 - Application of a Workflow for the Construction and Interactive Development of Digital Humans From Real-Person Facial Photos
T2 - 5th International Conference on Intelligent Design, ICID 2024
AU - Yang, Tianlun
AU - Hu, Zhenyu
AU - Kapogiannis, Georgios
AU - Kang, Byung Gyoo
AU - Wu, Yanhui
AU - Yang, Aixi
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Digital humans act as pivotal interfaces between the virtual and physical dimensions, significantly influencing interactive design within the Metaverse. As the Metaverse concept gains traction across diverse sectors, the implementation of technologies such as digital humans has become increasingly prevalent. The virtual environment provided by the Metaverse serves as an online hub for user socialization and a variety of activities. A significant proportion of users express a desire to create virtual avatars that closely reflect their own physical appearance. Although various technical solutions for avatar creation exist, they often fail to satisfy the specific demand for digital humans that accurately represent users' real-life appearances. Traditional 3D modeling and scanning methods, which require expensive equipment and significant time commitments, are also not optimal for supporting advanced interactive applications. Therefore, there is a pressing need in the industry for a streamlined workflow that can rapidly generate digital humans based on actual facial features. This research examines the interactive design necessities of the Metaverse and assesses current technologies for creating digital humans from users' facial images. It further investigates the potential for integrating these technologies in a synergistic manner to address the current shortfall. The proposed workflow in this study is intended to serve as a benchmark for future advancements in digital human development within the Metaverse.
AB - Digital humans act as pivotal interfaces between the virtual and physical dimensions, significantly influencing interactive design within the Metaverse. As the Metaverse concept gains traction across diverse sectors, the implementation of technologies such as digital humans has become increasingly prevalent. The virtual environment provided by the Metaverse serves as an online hub for user socialization and a variety of activities. A significant proportion of users express a desire to create virtual avatars that closely reflect their own physical appearance. Although various technical solutions for avatar creation exist, they often fail to satisfy the specific demand for digital humans that accurately represent users' real-life appearances. Traditional 3D modeling and scanning methods, which require expensive equipment and significant time commitments, are also not optimal for supporting advanced interactive applications. Therefore, there is a pressing need in the industry for a streamlined workflow that can rapidly generate digital humans based on actual facial features. This research examines the interactive design necessities of the Metaverse and assesses current technologies for creating digital humans from users' facial images. It further investigates the potential for integrating these technologies in a synergistic manner to address the current shortfall. The proposed workflow in this study is intended to serve as a benchmark for future advancements in digital human development within the Metaverse.
KW - Digital Human
KW - Digital Twin
KW - Interactive Technologies
KW - Metaverse
UR - https://www.scopus.com/pages/publications/105009112880
U2 - 10.1109/ICID64166.2024.11024597
DO - 10.1109/ICID64166.2024.11024597
M3 - Conference contribution
AN - SCOPUS:105009112880
T3 - 2024 5th International Conference on Intelligent Design, ICID 2024
SP - 459
EP - 466
BT - 2024 5th International Conference on Intelligent Design, ICID 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 25 October 2024 through 27 October 2024
ER -