In the autumn of 2011, when Praneeth Namburi PhD '16 was a graduate student at MIT, he recalls “stumbling into a room full of people dancing.” Which is how Namburi took his first lesson with the MIT Ballroom Dance Team—by accident. “The instructor just put me in front of a partner,” he says. Yet despite his serendipitous introduction to the sport, Namburi chose to go back week after week, first as a respite from the rigors of his degree in experimental neuroscience and then, when he got deeper into it, “as a way of exploring and understanding how my body works.”
In 2018, returning from a ballroom competition at Columbia University, Namburi experienced a sudden flash of insight inspired by all his time practicing and learning dance. In that moment, he understood that to dance is to move one’s body as a whole—as a single, unbroken form. Namburi logged into Amazon and bought a few motion sensors to stick to his body as he danced. By looking at the data from those sensors before and after learning a new routine, Namburi gained a quantitative means of describing his mastery of movement. It got him thinking about whether he could use that information to help others learn. “Everybody has the potential,” he says. “Most people just haven’t discovered it yet.”
Today, Namburi is a research scientist at the MIT.nano Immersion Lab, a shared central facility, housed within the Lisa T. Su Building, that enables researchers to visualize and interact with large, multidimensional data sets. Projects span departments and disciplines, ranging from 3-D content development to hardware design to human-subject research, and bring together engineers, scientists, artists, and athletes.
One project that bridges the physical and digital worlds uses virtual reality (VR) simulations to train people to fabricate computer chips and semiconductors. This allows individuals to “visualize and interact with the tools virtually before you put them in front of something where they can hurt themselves or the equipment,” says Brian W. Anthony SM ’98, PhD ’06, associate director of MIT.nano and the Immersion Lab’s founding director.
In addition, the data from a single measurement from one of MIT.nano’s sophisticated characterization tools, such as its scanning and electron microscopes, can easily run to several terabytes. To interact with that high-resolution, high-dimensional imagery, the Immersion Lab uses VR and augmented reality (AR), backed by massive interactive computing power, to let researchers “blow it up to human scale” and then grab and pull at it, says Anthony. “Many of these tools have been around for a while,” notes Talis Reks, the Immersion Lab’s VR/AR gaming technologist, but he says the applications are fresh and innovative.
The lab routinely supports deep collaborations between scientists and artists. Anthony explains that “after you start to build those capabilities for training or for data interaction, it’s a set of tools now that are broadly applicable” to art, education, design, manufacturing, and dance. That’s where Namburi fits in.
A New Tool for Communication and Instruction
Namburi fires up a demo video filmed at a nearby production studio where several dozen motion-capture cameras hang from the ceiling, pointing down at a wooden floor. A coach stands beside a high-heeled dancer wearing an AR headset, her back covered with motion-tracking markers. The foxtrot music begins, and as the performer dances, the coach offers encouraging words: “From the legs to the spine, from the spine to the legs. Feel the rhythm. Good, good. Be yourself.”
Meanwhile, in the headset, the dancer sees herself represented as a constellation of moving lights, corresponding to the placement of the markers on her back. “This is similar to a portable mirror,” says Namburi. “She can view the relationships between different parts of her back, as if she’s always standing behind herself.” The goal is to give the dancer-coach duo a new tool to improve communication and instruction, one that provides instantaneous, visual feedback.
The AR session stayed with the dancer, who later told Namburi she continues to imagine those glowing, moving points of light in her mind as she learns new routines. “Getting immediate feedback on whether one achieved an intended action or not,” says Namburi, “seems to be the most valuable aspect of this biofeedback tool.”
From the Dance Floor to the Factory Floor
Anthony says that professional dancers, who strive “to get better at motion—more efficient, more smooth, more elegant,” have traits that can be helpful to people across a broad spectrum of activities. Namburi and his team used accelerometers to record dancers and other “motion experts,” such as athletes and martial artists, as they made simple reaching gestures with one arm. The researchers observed that the professionals moved more smoothly than the typical person. When individuals without motion training were recorded making the same gesture, they exhibited a characteristic shake: a subtle tremor undetectable to the naked eye.
A similar difference emerges among workers on a factory floor whose wrists are outfitted with accelerometers. Those with less experience tend to perform their physical tasks less smoothly and with more wobble. “The characteristics of efficient motion that we see in dance,” says Anthony, “help us understand what it means to be efficient or fatigued in other domains,” settings where worker safety and performance are important, for example. Namburi says these insights could help lead to changes in a factory’s environment or training protocol to reduce the risk of injury.
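The article does not say which smoothness measure the lab uses, but the idea of scoring a movement from wrist-accelerometer data can be illustrated with a common family of metrics based on jerk, the time derivative of acceleration. The sketch below is a minimal, hypothetical example: the function name, the normalization, and the sample signals are assumptions for illustration, not the lab's actual method. A shakier movement (acceleration with an added high-frequency tremor) yields a larger score than a smooth one.

```python
import numpy as np

def jerk_roughness(accel, fs):
    """Illustrative smoothness score for a 1-D acceleration trace.

    accel: acceleration samples (e.g., m/s^2) from one accelerometer axis.
    fs:    sampling rate in Hz.

    Returns a dimensionless number: integrated squared jerk, scaled by the
    movement duration and peak acceleration so the result does not depend
    on units or overall movement size. Larger = rougher (less smooth).
    This is a simplified stand-in, not the metric used in the article.
    """
    accel = np.asarray(accel, dtype=float)
    dt = 1.0 / fs
    jerk = np.gradient(accel, dt)          # numerical time derivative
    duration = len(accel) * dt
    peak = np.max(np.abs(accel))
    return np.sum(jerk ** 2) * dt * duration ** 3 / peak ** 2

# Hypothetical demo: a smooth reach vs. the same reach with a 10 Hz tremor.
fs = 100.0
t = np.arange(0, 2.0, 1.0 / fs)
smooth = np.sin(2 * np.pi * 1.0 * t)               # slow, fluid motion
shaky = smooth + 0.2 * np.sin(2 * np.pi * 10.0 * t)  # subtle tremor added

print(jerk_roughness(smooth, fs) < jerk_roughness(shaky, fs))  # True
```

Because the score is normalized by duration and peak amplitude, it compares movements of different speeds and sizes on a common scale, which is what makes a single number like this useful for spotting inexperience or fatigue across different tasks.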
Although Namburi continues to take ballroom dancing lessons several hours each week, his work with the Immersion Lab has taught him “that we don’t necessarily know how to see movement.” That’s why, from time to time, he wires up and works through a routine on that dance floor ringed by motion-capture cameras. For Namburi, the lab is a tool “to interact with my dancing in another way”—with the power to teach the dancer inside each of us how to visualize and then control our own movement.
This story was originally published in the spring 2023 issue of MIT Spectrum.
Photo: Graduate student Roger Pallares Lopez views sensors placed on dancer Kaelyn Dunnell ’25 from a monitor in the MIT.nano Immersion Lab. Credit: Ken Richardson