
Meet the Quantum Physicist Ballerina Who Programs Her Own Robot Dance Partner

An industrial robotic arm is an unusual choice for a dance partner, but it is exactly the one Dr. Merritt Moore, a former dancer with the Norwegian National Ballet, sought out.

Moore is also a quantum physicist and graduated with a PhD in atomic and laser physics from the University of Oxford in 2018. She’s published papers on quantum information processing, quantum inference, and quantum electronics. She’s also danced for the Boston Ballet, the Zurich Ballet, and the English National Ballet.

For years, Moore kept her two demanding careers separate, but that changed when she met Silje Gabrielsen a few years ago through a mutual friend in Oslo.

Gabrielsen is a designer and the co-founder of Hiro Futures, a human-robot interaction start-up researching artificial social skills in robots. Moore can’t pinpoint one conversation in particular, but it wasn’t long before Gabrielsen gave Moore the opportunity to explore movement with a robotic arm.

As a direct result of her work with Gabrielsen’s robot, Moore landed the Harvard ArtLab residency just a few months before the COVID-19 pandemic. Fascinated by human-machine interaction, she set out to push a robotic arm to mimic human body language and gestures as closely as possible.

Finding a new rhythm

After six weeks of programming and rehearsal, Moore was beginning to find her rhythm, but she knew there was still much more to explore when concern over the spread of the virus reached crisis level. As her options for human dance partners dwindled, she asked Universal Robots to lend her a robotic arm. “I thought, at least robots can’t get Covid,” she says. She dubbed the robotic arm “Baryshnibot.”

Moore pinpoints the pandemic as a major turning point for her progress. She and the robot began to experiment more expansively with different dance styles: hip-hop, salsa, contemporary, reggaeton. Moore posted their dance routines on social media under the handle @physicsonpointe, garnering hundreds of thousands of views. Despite her online success, Moore says she didn’t really feel like she “made this thing that has no legs and no arms look like a human” until the night she and Baryshnibot performed a Michael Jackson-inspired dance routine on a shadowy, empty stage.

Moore recorded the performance and posted it on Instagram. Her machine was programmed to match hip thrust for hip thrust to the tune of “Billie Jean.” Both Moore and her robot were dressed “Smooth Criminal” style, matching fedoras and all. Fire and heart emojis quickly poured in; one commenter said, “Michael Jackson would have loved it.”

If you follow Moore’s videos from the beginning, the difference in her connection and ability to communicate with the robot as she progresses is striking. The evolution of the robot’s fine motor skills is obvious, and Moore describes the level of detail she aims for as a new world for some of her peers in the industrial robotics space.

She talks about staying “that extra seven hours to make sure every angle and all the timing is exactly right.” Not only did Moore push the boundaries of what an industrial robot could do, she pushed herself to a new level of artistry by honing her own connection to the robot.

Moore’s robot is not specifically designed to dance, but she describes the machine as moving with “impressive fluidity.”

The interplay between human and machine

Baryshnibot is an industrial robotic arm, a “cobot,” or collaborative robot, meant to work alongside humans. It can be programmed to perform repetitive movements or to identify and transport packages using AI. Moore sees her communication with Baryshnibot as two-way: she programs the robot to adapt to human gesture, and in turn she becomes a better dancer by learning the robot’s movements.

Moore feels her experience as a dancer allows her to ask better questions, especially when it comes to human-machine interaction. When Moore dances with her robot, she uses a VR handset or a device called a teach pendant. A teach pendant is essentially an iPad she uses to sequence the robot’s velocity according to the music. Then Moore learns the choreography she sequenced for the robot.

Moore is pushing robotics companies to reimagine the questions they ask on the path to innovation. Universal Robots originally refused Moore’s request for a robot. “They said that the market for dancing robots is zero,” Moore says.

When they eventually relented, Moore’s videos dancing with Baryshnibot far outperformed videos of the same robot performing industrial tasks. Universal Robots was asking questions about market size, not human engagement. Moore, however, was far more interested in the impact of humanizing the robot.

Moore boldly challenges stale academic exploration. She says, “I was in quantum optics, and if someone published a paper with two photons, the next person would publish it with three photons. No one was willing to go back to the drawing board and start the experiment over.”

Moore thinks that questions with easy answers stifle creativity.

“Our greatest way of contributing is to ask good questions,” she says. “In academics, we’re taught to answer questions, but Siri and Alexa could answer all the questions really fast, right? I think that our role in research and continuing forward is asking the right questions to get amazing breakthroughs. And art asks good questions about where the research is going.”

In Moore’s case, that means asking questions about how to build trust and improve communication between humans and robots.

Art meets science

Creating a sense of trust is one of the most important aspects of moving autonomous technology forward. This is where Moore believes art can help create machines that are intuitive and playful, bridging the gap between autonomous technology and people.

“I think my work is breaking down the danger barrier for a lot of people,” Moore says. “The fact that it’s choreographed precisely allows people to have that trust. And it’s quite elegant. Subliminally, I think that allows people to feel a sense of familiarity. They’re familiar with a certain combo, or the construct of partner dancing. And then you bring the robot into that familiar space? I think it limits that fear factor.”

After all, who could resist a robot wearing a tiny fedora?

Right now, Moore believes there is a language barrier between art and science that prevents us from thinking about innovation as playful and intuitive. Moore feels as though she speaks two different languages. “What gives me leverage is that I can program the robot and create the dance,” she says. “I get to go to the robot companies and ask questions that they’ve not explored yet because it’s what I want artistically.”

After two years of experimentation, dancing with Baryshnibot is Moore’s main source of income. “It’s a big part of my focus. It’s multifaceted,” she says. Performances, content creation, research, speaking engagements, and collaborations on and off social media support her continued exploration of human-robot interaction through dance.

This exploration allows her to make unexpected connections, asking new questions that can take innovation in a new direction. Sometimes, we associate artificial intelligence and autonomous technology with rigid sorting and categorization.

Moore is a living manifestation of the opposite. She cannot be neatly sorted. Art and science are two ways of knowing and one without the other is incomplete. “The best physicists I know are really open to art and literature,” she says. Art allows us to imagine, to ask less obvious questions, and to keep our humanity in mind as we design.
