IIIT-Hyderabad set to digitise Indian classical dance forms
Hyderabad: What does 3D, 4D or information technology have to do with dance? Ask the researchers at the Centre for Visual Information Technology at the International Institute of Information Technology, Hyderabad (IIIT-H), who are huddled in their Motion Capture Lab, contributing to the preservation of India’s dance heritage.
“This is part of the Government of India’s Cultural Heritage Conservation Project,” Prof Avinash Sharma, in charge of the effort, told IIIT-H blogger Sarita Chebbi.
“The idea is to preserve Indian classical dance forms such as Bharatanatyam, Mohiniyattam, Kathak and so on in a 4D digitised format. Dance movements are recorded in a lab setup and can be viewed in 4D on a virtual stage,” says Prof Sharma, adding that this is the kind of nuanced data required before scientific analysis and research on dance forms can begin.
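In software terms, such a “4D” capture is essentially a time-ordered sequence of 3D meshes, one per recorded frame. The minimal sketch below is illustrative only, not the lab’s actual pipeline; the file layout, names and use of the trimesh library are assumptions.

```python
# Illustrative sketch: treat a "4D" capture as a time-ordered list of 3D meshes.
# File naming and the trimesh dependency are assumptions, not IIIT-H's tooling.
from pathlib import Path

import trimesh  # widely used mesh-I/O library


def load_4d_sequence(capture_dir: str) -> list[trimesh.Trimesh]:
    """Load per-frame meshes (frame_0001.obj, frame_0002.obj, ...) in time order."""
    frames = sorted(Path(capture_dir).glob("frame_*.obj"))
    return [trimesh.load(f, force="mesh") for f in frames]


def play_on_virtual_stage(sequence: list[trimesh.Trimesh], fps: float = 30.0) -> None:
    """Step through the sequence; a real viewer would render each mesh on a stage."""
    for t, mesh in enumerate(sequence):
        print(f"t = {t / fps:.3f}s  vertices = {len(mesh.vertices)}")
```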
The analysis could focus on postures or sequences of movements, which are invaluable from a heritage-preservation standpoint. Such footage can also provide remote audiences with an immersive experience online.
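To give a flavour of what posture analysis on captured data can involve, here is a small, illustrative calculation of a joint angle from tracked 3D joint positions; the joint choice and coordinates are assumptions, not the project’s actual analysis.

```python
# Illustrative only: posture analysis can start by measuring joint angles
# (here an elbow) from tracked 3D joint positions in a single frame.
import numpy as np


def joint_angle(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> float:
    """Angle at joint b (in degrees) formed by the segments b->a and b->c."""
    u, v = a - b, c - b
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))


# Example positions (assumed) for shoulder, elbow and wrist in one frame.
shoulder = np.array([0.0, 1.40, 0.0])
elbow = np.array([0.25, 1.15, 0.0])
wrist = np.array([0.50, 1.30, 0.0])
print(f"elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```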
Motion-capture technology, already common in gaming, VR avatars and the animation industry, is now gaining ground in the creation of digital avatars for the Metaverse. “We see it in movies, where markers on the face and facial gestures are captured and mapped to 3D models of real people so that the animated version talks and gesticulates like them,” says Prof Sharma.
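As a toy illustration of the marker-mapping idea, the sketch below places each joint of a character at the centroid of its assigned optical markers. The marker names and the simple averaging rig are assumptions for illustration, not the studio’s actual retargeting method.

```python
# Toy marker-to-model mapping: drive character joints from tracked marker
# positions. Marker names and the averaging rig are illustrative assumptions.
import numpy as np

# Which captured markers drive which joint of the 3D character (assumed layout).
MARKERS_PER_JOINT = {
    "head": ["forehead", "left_temple", "right_temple"],
    "wrist": ["wrist_inner", "wrist_outer"],
}


def drive_joints(frame_markers: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    """Place each character joint at the centroid of its assigned markers."""
    return {
        joint: np.mean([frame_markers[m] for m in names], axis=0)
        for joint, names in MARKERS_PER_JOINT.items()
    }


# One frame of (assumed) tracked marker positions, in metres.
frame = {
    "forehead": np.array([0.0, 1.70, 0.05]),
    "left_temple": np.array([-0.07, 1.68, 0.0]),
    "right_temple": np.array([0.07, 1.68, 0.0]),
    "wrist_inner": np.array([0.30, 1.05, 0.02]),
    "wrist_outer": np.array([0.33, 1.05, -0.02]),
}
print(drive_joints(frame))  # joint positions driving the animated model
```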
However, realism is achieved only when the finer details of bodily appearance, including the surface detail of skin and garments, hair and so on, are captured. “The more realistic the avatar, the more immersive your experience in the Metaverse can be,” he says.
While capture and analysis form one part of the project, another is virtual try-on technology: digitally trying on clothes or accessories in a virtual 3D environment. Here, online shoppers can select garments that have already been 3D-digitised and drape them on a synthetic model, or use their own 3D avatars. “We intend to pursue this for commercial use in the near future,” Prof Sharma says, adding that the team is also looking at turning a synthetic avatar into a lifelike one by personalising it to an individual’s specific appearance.
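As a rough illustration of the fitting step, the sketch below aligns a digitised garment to an avatar using shoulder landmarks. The rigid scale-and-shift is a simplified stand-in for real cloth draping or simulation, and the landmark names are assumptions.

```python
# Simplified virtual try-on sketch: scale a garment mesh so its shoulder width
# matches the avatar's, then translate it onto the avatar's shoulders.
# This rigid fit is illustrative; real systems use cloth simulation/draping.
import numpy as np


def fit_garment(garment_verts: np.ndarray, garment_shoulders: np.ndarray,
                avatar_shoulders: np.ndarray) -> np.ndarray:
    """Return garment vertices scaled and translated onto the avatar.

    garment_shoulders / avatar_shoulders: arrays of shape (2, 3) holding the
    left and right shoulder landmark positions of each mesh.
    """
    scale = (np.linalg.norm(avatar_shoulders[0] - avatar_shoulders[1]) /
             np.linalg.norm(garment_shoulders[0] - garment_shoulders[1]))
    fitted = garment_verts * scale
    offset = avatar_shoulders.mean(axis=0) - garment_shoulders.mean(axis=0) * scale
    return fitted + offset
```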
For this, his team used a commercial 3D scanner to take aesthetic captures of about 250 individuals in a variety of clothing, ranging from tight-fitting trousers and t-shirts to more flowing South Asian ethnic attire, and created a dataset named 3DHumans.
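For illustration, a researcher working with such a collection of body scans might iterate over it along these lines. The folder layout, file names and use of the trimesh library are assumptions for the sketch, not the published 3DHumans format.

```python
# Hedged sketch: iterate over a folder of textured body scans.
# Directory layout (<root>/<subject>/scan.obj) is an assumption, not the
# published 3DHumans format.
from pathlib import Path

import trimesh  # common mesh-loading library; an assumption, not the team's tooling


def iter_scans(root: str):
    """Yield (subject_id, mesh) pairs for scans stored as <root>/<subject>/scan.obj."""
    for subject_dir in sorted(Path(root).iterdir()):
        scan_path = subject_dir / "scan.obj"
        if scan_path.exists():
            yield subject_dir.name, trimesh.load(scan_path, force="mesh")


for subject_id, mesh in iter_scans("3dhumans_scans"):
    print(subject_id, len(mesh.vertices), "vertices")
```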
“The idea is to make the dataset freely available for the academic community to use with appropriate licensing in order to democratise research in this domain,” he says. “Plans are under way to extend the dataset of 250 images and include accessories. A dynamic dance dataset is already in the pipeline and will be released soon,” he adds.