An MIT Alumni Association Publication

HDR Pioneer Looks to Future of Wearable Tech

  • Kathryn M. O'Neill
  • Slice of MIT


Ever in the vanguard, Steve Mann PhD ’97 is continually inventing not only new technologies but new concepts and philosophies—along with the language to describe them. Best known as the father of wearable computing, Mann today focuses on “mersivity”: advancing technology that connects people with the physical world.

Mann invented digital eyeglasses in 1984, nearly three decades before Google Glass came out in 2013, and a wristwatch videophone in 1998, 16 years ahead of the Apple Watch. Among techies, he is perhaps best known for pioneering high-dynamic-range (HDR) imaging, which enables many cameras today—including iPhone cameras—to produce better pictures, particularly in poor lighting. A professor of electrical and computer engineering at the University of Toronto, Mann also invented the hydraulophone, a musical instrument that forms notes using liquid; visitors can play one at museums, galleries, and waterparks around the world.

What’s common to all this work, Mann says, is a desire to employ technology to enhance the human experience while allowing users to remain immersed in the world around them. Pointing out that most technology today works best in dry, controlled environments, Mann says he wants to bend technology to the needs of humans to improve their lives without disruption.

“Computers force us to twist ourselves like a pretzel around the computer. We should be able to stand tall and let computers twist around us,” says Mann, who has been wearing digitally enhanced eyewear almost continuously since the 1970s. “I hate technologies that swallow us up and cut us off from the world.”

"The goal is ... technology that connects us to the real physical natural world around us," Mann says.

Steve Mann is shown barechested wearing digitally enhanced goggles and a swim cap. Ice floats in the water behind him and snow is visible.

As an alternative, Mann strives for “mersivity”—a quality he describes as the sweet spot at the center of a Venn diagram connecting humanity, nature, and technology. A thing is reciprocally “mersive,” Mann explains, if you can immerse it and it can immerse you. “The goal is to advance technology for humanity and the environment: technology that connects us to the real physical natural world around us rather than isolates us from it,” says Mann, who introduced this concept to the Institute of Electrical and Electronics Engineers at a conference in 2022.

As an example, Mann explains that his computerized eyewear is designed to improve vision—such as by digitally enhancing text, revealing heat signatures, or providing facial recognition—rather than to clutter the scene with overlays, the typical approach of today’s augmented reality glasses.

Childhood Tinkerer

Mann started tinkering early in life, initially inspired by learning to weld from his grandfather at the tender age of four. Very little could be seen through the dark glass of the welding helmet, and Mann wanted to see more. “I had this vision of a welding helmet that uses video to see the world,” he says. That vision ultimately led to Mann’s 1993 development of HDR, which combines multiple photographic exposures of the same scene to produce a single image that preserves detail in both its darkest and brightest areas.
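
For readers curious how combining exposures works in principle, here is a minimal, illustrative sketch—not Mann’s method or any production HDR pipeline, and all names in it are hypothetical. It blends a dark and a bright capture of the same scene, weighting each pixel by how well exposed it is, so detail from both frames survives in the merged result.

```python
# Toy exposure-fusion sketch (illustrative only; not Mann's algorithm).
import numpy as np

def merge_exposures(frames):
    """frames: list of grayscale float arrays in [0, 1], all the same shape."""
    stack = np.stack(frames)                          # shape (n, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)    # favor well-exposed mid-tones
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    return (weights * stack).sum(axis=0)              # per-pixel weighted blend

# Example: a dark and a bright capture of the same (random) scene
dark = np.clip(np.random.rand(4, 4) * 0.3, 0.0, 1.0)
bright = np.clip(dark * 3.0, 0.0, 1.0)
print(merge_exposures([dark, bright]))
```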

He earned undergraduate degrees in physics and electrical engineering from McMaster University in his native Ontario, then came to MIT in 1991. While earning his PhD in media arts and sciences, Mann founded MIT’s Wearable Computing Project: At one point, he erected an antenna for wireless computing on top of the Media Lab so he could control a camera from afar and take selfies. “I created probably the first cyborg community wireless,” he says.

In 1998, Mann joined the faculty at the University of Toronto, and his research has led to hundreds of publications—including a book (with Hal Niedzviecki) called Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer (Doubleday of Canada, 2001)—and numerous patents (including one for an HDR welding helmet). In 2006, he cofounded InteraXon, which makes a brain-sensing headband called Muse. He also cofounded and serves as CTO of Blueberry X Technologies, which launched in 2019 to commercialize bio-sensing eyeglasses.

Assistive Devices and Underwater Tech

Mann’s latest goal is to build a self-driving walker for the visually impaired—essentially a high-tech replacement for the seeing-eye dog. Ever one to coin a term, Mann says he is working on “freehicles—vehicles of freedom for mobility.”

A lifelong swimmer, Mann has also taken a deep dive into water-centered technology. He developed an advanced underwater heart monitor and augmented reality biofeedback system for swimmers, and he created virtual reality goggles that work under water—allowing humans to reach that sweet spot of mersivity: “I’m immersed in nature; technology is around me; and the natural world is around the technology.”

Achieving this vision more broadly calls for an expansive new approach to the use of technology—and new concepts. As Mann says, “Mersivity is the next frontier.”


Top image: Steve Mann is at the far left in a selfie taken wirelessly with other MIT students in the mid-1990s. 

Inset image: Steve Mann by Lake Ontario wearing digitally enhanced goggles.

Photos: Courtesy of Steve Mann


Comments

Claudio Pinhanez

Thu, 10/05/2023 9:20pm

Lee Campbell, Jim Davis, and Baback Mogadhan (2nd, 4th, and 5th in the photo, left to right) were not members of the Wearable Computing project. They were doing computer vision at the time and just helped take the photo by posing with Steve and Thad's gear.

And, BTW, Thad Starner (the 6th) deserves at least as much credit as Steve Mann for the pioneering work on Wearable Computing at the MIT Media Lab. The 4th, if I remember correctly, was a UROP working with Thad.