
How a Digital Nose Could Help Autonomous Vehicles Stay Clean

An illustration of a disembodied nose perched on the front face of a teal and pink-colored cube, surrounded by nodes and lines representing data and information, on a white background

Restoring smell to a person who’s lost the ability; identifying a new smell with the touch of a button; alerting a cleaning crew that a passenger just ate some super-smelly tuna in a ride-share vehicle – these are all possible uses for artificial intelligence-powered olfaction.

People interact with digitized sensory capabilities often, but not broadly: “sight” (cameras, lidar, radar), “hearing” (microphones), and “touch” (prosthetics with haptic feedback). A digitized sense of smell, however, remains on the frontier.

The good news is, Aryballe, a Grenoble, France-based digital olfaction firm that uses biochemical sensors, optics, and machine learning to detect odor and turn it into data, is on it.

However, it is profoundly complex. Humans can distinguish more than one trillion scents, according to a 2014 study by the U.S. National Institutes of Health, published in Science magazine.

But beyond creating a digital nose, Aryballe is digitizing scents – a “Shazam for smells,” said Sam Guilaumé, the company’s Chief Executive. Smell libraries of various kinds already exist in France, Germany, Scotland, and America.

But pairing AI-enhanced detection with a library is new. “Ones and zeros are just data. I like to think we turn smell into knowledge,” Guilaumé said. “We essentially make olfaction quantifiable and objective, and enable someone to make a decision based on our information.”

The digital nose sensor. Credit: Aryballe

Why digitize the ability to smell?

Founded in 2014, Aryballe has operations in France and America, and clients in the food and beverage, consumer packaged goods, and automotive industries.

With a patient receiving the first fully digital prosthetic eye in 2021, the idea of assisting a person with total or partial loss of smell — anosmia and hyposmia, respectively — with a digital olfaction implant is plausible. It’s the purpose of the European Union-funded Rose project, in which Aryballe is a lead participant.

 “Ones and zeros are just data. I like to think we turn smell into knowledge.”

The project aims to help the 20 percent of the world’s population with some form of smell loss.

“The objective is to explore how the Aryballe sensor could help link artificial systems to human biological olfaction, and enable an anosmia patient to recall their sense of smell,” explained Guilaumé.

It matters because the tiny molecules that make up smells alert us to danger, evoke memories, trigger emotions, help with attraction, excite other senses such as taste, and even enhance the experience of sight and sound.

But AI-powered noses might have some less-obvious uses. Guilaumé talked about how an autonomous vehicle fleet operator, for example, would not have a driver to keep passengers from smoking, or eating, say, tuna. “To maintain a level of cleanliness, you need a device such as ours that can monitor odors in the cabin.”

Why is it so hard to achieve?

Humans sense smell when an object emits odor molecules that are carried through the air into our noses when we breathe. The tiny molecules activate olfactory neurons in the nose as they pass across specialized nerve cells.

These identify smells, triggering memories and evoking emotions. A human nose has several million such cells, which express the roughly 400 types of receptors that can identify specific odor molecules.

Aryballe uses organic chemistry that mimics the nose, explained Guilaumé. Volatile odor molecules bind with biosensors on the device, and together they create a kind of smell image.

The image is processed through Aryballe’s knowledge base, and then machine learning makes a recommendation about its nature. This takes 10 to 20 seconds, which is about as long as it takes humans to react to most smells.
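The pipeline described above – a sensor produces a numeric “smell image,” which is matched against a knowledge base of labeled signatures – can be sketched roughly as follows. All names, data, and the similarity threshold here are illustrative assumptions, not Aryballe’s actual API or models.

```python
# Hypothetical sketch of a digital-olfaction pipeline: a sensor reading is a
# vector of binding responses, matched against labeled reference signatures.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length response vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy knowledge base: odor label -> reference sensor response vector.
KNOWLEDGE_BASE = {
    "coffee":  [0.9, 0.1, 0.4, 0.0],
    "vanilla": [0.2, 0.8, 0.1, 0.3],
    "tuna":    [0.1, 0.2, 0.9, 0.7],
}

def classify_odor(signature, threshold=0.8):
    """Return the best-matching label, or None if nothing is close enough."""
    best_label, best_score = None, 0.0
    for label, reference in KNOWLEDGE_BASE.items():
        score = cosine_similarity(signature, reference)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

print(classify_odor([0.85, 0.15, 0.35, 0.05]))  # close to "coffee"
```

A real system would use far higher-dimensional signatures and a trained model rather than nearest-reference matching, but the shape of the decision – signature in, labeled recommendation out – is the same.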

The key moment in digital olfaction came in October 2004, when the Nobel Prize in Physiology or Medicine was awarded jointly to Richard Axel and Linda Buck for their discoveries of “odorant receptors and the organization of the olfactory system.”

Essentially, Axel and Buck discovered how the brain converts scent into a signal that can be recognized, remembered, and associated with emotions. Prior to that, the industry was in the dark, said Guilaumé.

“People thought smell was a mix of gasses, and that with gas detectors, you could derive a sense of smell. But your nose can’t smell gasses such as CO2 (carbon dioxide), carbon monoxide, or NOx (nitrogen oxides),” he explained. “Our digital nose uses similar receptors to those in your nose. Whatever your nose smells, our sensor can smell.”

Indeed, for man-made products, Guilaumé said Aryballe’s technology can outperform a human nose with cheese, wine, fragrances and things that are “a few centuries old.” “We can distinguish not just between vanilla and vanillin, but between vanillin from two different producers.” However, some smells remain challenging. “It will be difficult to compete with the reptilian reflexes that prevent us from eating something which is spoiled, or getting too close to something decomposing.”

What’s now? What’s next?

Before any object – an image, a sound, or a smell – can be classified and archived, it needs labeling. Digital image labeling, or annotation, is crucial for all autonomous robotics work.

For example, for autonomous vehicles to identify a bicycle, thousands of images of different bicycles are labeled—usually by humans, but increasingly by AI—so the vehicle knows that when it detects a certain kind of two-wheeled device, it should treat it as a bicycle and not a pushcart.

It’s a similar process for smells, as Aryballe builds out a library of thousands of smells at varying levels of concentration, all of which could someday be used to make more combinations.
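The labeling process described above might be organized along these lines: each sample carries an expert-provided label plus metadata such as source and concentration, and samples are grouped by label before training. The schema, field names, and values below are hypothetical, not Aryballe’s actual data format.

```python
# Illustrative sketch of how labeled odor samples might be organized for
# training, mirroring how labeled bicycle images are used in AV perception.
from dataclasses import dataclass, field

@dataclass
class OdorSample:
    label: str                  # expert-provided annotation, e.g. "vanillin"
    source: str                 # where the sample came from
    concentration_ppm: float    # concentration at which it was recorded
    signature: list = field(default_factory=list)  # sensor response vector

library = [
    OdorSample("vanillin", "producer A", 5.0, [0.2, 0.8, 0.1]),
    OdorSample("vanillin", "producer B", 5.0, [0.25, 0.75, 0.15]),
    OdorSample("vanilla",  "extract",    2.5, [0.3, 0.6, 0.2]),
]

# Group samples by label, as a training pipeline might before fitting a model.
by_label = {}
for sample in library:
    by_label.setdefault(sample.label, []).append(sample)

print(sorted(by_label))           # labels present in the library
print(len(by_label["vanillin"]))  # samples per label
```

Recording the same odor at multiple concentrations and from multiple sources, as in the toy library above, is what lets a model later distinguish, say, vanillin from two different producers.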

Challenges abound, from sourcing the smells and recruiting experts to annotate odors, to identifying the primary components of a given scent – which is still not possible.

Guilaumé draws an analogy with the three primary colors, red, yellow, and blue. “We don’t know the primary odors, or if they even exist,” he says. “And if they do exist, we don’t know how many of them there are. But with the devices we are developing, I believe we will eventually identify them.”

In 1953, James Watson and Francis Crick discovered DNA’s twisted-ladder double helix, and only 50 years later, scientists sequenced the human genome.

So it’s not such a leap of logic that smell could be digitally codified, to be recreated, or eliminated, in the way that some headphones cancel noise. “We will be able to do this,” said Guilaumé. “The limitation for now is that this only works if you have the counter-odor ready.” Something that might come in handy for anyone riding in an autonomous vehicle after a previous rider enjoyed a particularly odiferous lunch.
