
Let There Be Light: How a Single Photon Could Revolutionize Self-Driving

Scenes of Argo Lidar development from Argo’s engineering lab in Cranbury, New Jersey.

If you’re reading this article on a screen, it’s mostly thanks to photons. About half a billion of these tiny particles that make up light enter your eyes every second. Photons were first theorized by Albert Einstein back in 1905, but it took another century-plus before the human eye was proven to be sensitive enough to detect a single photon—a feat incredibly difficult for cameras and other sensors to replicate.

Now that amazing capability—single-photon sensitivity—is coming to self-driving cars through breakthrough Geiger-mode lidar technology developed by Argo AI. Argo Lidar can detect a single photon while moving down the road, a major advancement compared to traditional automotive lidar that needs to detect dozens or hundreds of photons in order to perceive surrounding objects. 

Lidar works, roughly speaking, by beaming laser pulses at physical objects, and then measuring the time it takes them to bounce back. With the ability to capture the smallest measurable unit of light, a photon, Argo Lidar can detect darker, less reflective objects from a longer distance and with more precision than other lidars currently available. 
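To make that time-of-flight idea concrete, here is a minimal sketch of how a pulse's round-trip time maps to distance. The function and numbers below are illustrative only, not Argo's implementation:

```python
# Illustrative time-of-flight calculation (hypothetical helper, not Argo's code).
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a pulse's round-trip time into a one-way distance in meters.

    The pulse travels out to the object and back, so the one-way range
    is half the total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A return arriving ~2.67 microseconds after the pulse left corresponds
# to an object roughly 400 meters away.
print(range_from_round_trip(2.67e-6))  # ~400 m
```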

Take, for instance, black-painted cars, which constitute 25% of vehicles on U.S. roads. With reflectivity as low as 1%, black vehicles are far harder for traditional lidar sensors to detect than white cars, which boast 80% reflectivity. Argo Lidar’s single-photon sensitivity allows it to detect black cars at double the distance of any other lidar currently available.
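A back-of-envelope model suggests why sensitivity matters so much here. If the returned photon count falls off roughly as reflectivity divided by range squared (a simplifying assumption for a diffuse target that fills the beam), then the maximum detection range scales with the square root of reflectivity, so dark objects shrink a sensor's effective range dramatically and every captured photon counts. The sketch below is purely illustrative and is not Argo's range model:

```python
import math

def relative_range(reflectivity: float) -> float:
    """Relative detection range under a simplified lidar model.

    Assumes the returned photon count scales as reflectivity / range^2
    (a diffuse target that fills the beam), so for a fixed detection
    threshold the maximum range scales as sqrt(reflectivity).
    Purely illustrative; not Argo's range model.
    """
    return math.sqrt(reflectivity)

white = relative_range(0.80)  # ~80% reflective white car
black = relative_range(0.01)  # ~1% reflective black car

# Under this toy model, a black car is detectable at only about 11% of
# the range of a white car, which is why capturing every returning
# photon matters so much for dark objects.
print(f"black-car range / white-car range ≈ {black / white:.2f}")  # 0.11
```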

Argo Lidar’s single-photon detection—combined with longer-wavelength operation, above 1400 nanometers—gives the sensor its unique capabilities, including a long range of 400 meters, higher resolution, detection of low-reflectivity objects, and gapless imaging. The result is a sensor that enables safer self-driving, both at highway speeds and in dense city environments.

Into the Light

The journey to Argo Lidar began back in 2001, 15 years before Argo AI was founded. That’s when Mark Itzler was working for one of the world’s largest telecommunications hardware component manufacturers. Itzler was approached by a client that wanted to purchase a limited number of avalanche photodiodes (APDs), highly sensitive semiconductor devices that can detect single photons of light traveling over fiber optic cables, for its work using quantum physics to encrypt sensitive information. 

The quantity of APDs ordered was so small that it barely registered as a blip in his company’s sales, but for Itzler, the idea of using APDs for single-photon detection was astonishing. “You’d have to do something with our APD that it wasn’t necessarily intended to do,” he says. “It’s like taking a drug that was developed for one disease, trying it on another one, and discovering, ‘Hey, this is curing something nobody ever intended it to cure.’”

When Itzler later joined Princeton Lightwave, a small engineering outfit in Cranbury, New Jersey, specializing in laser and imaging technologies, he brought a budding curiosity about single-photon detection along with him. The main question at the time was: Where else might single-photon detection be applied? 

At the time, there were only a handful of companies experimenting with the technology. There was a company building large metrology instruments for tech giants like Intel and IBM, using single-photon detection as part of an integrated circuit diagnostics technique—essentially using highly sensitive imaging to test the output of microprocessor chips. And there were also a handful of biomedical companies exploring single-photon detection using APDs based on silicon. “They were focused on these biological fluorescence techniques, looking for evidence of certain molecular markers in biological testing,” says Itzler. 

By contrast, Princeton Lightwave’s APD was a compound semiconductor built around a different material: indium phosphide, less common and more difficult to manipulate than silicon, but capable of operating at longer wavelengths. This material proved better suited to fiber optic telecom transmission, to integrated circuit diagnostics, and, as Itzler would soon discover, to lidar.

Military Grade 

In late 2007, Princeton Lightwave’s avalanche photodiode and imaging technologies came together in a project initiated by MIT’s Lincoln Lab. The federally funded lab had been working to create arrays—cameras that used single-photon APDs as the pixels—for the defense industry, and Itzler’s team was tasked with commercializing the technology. This led to the first demonstration of so-called “Geiger-mode” APDs used for lidar, and, for Princeton Lightwave, the first sizable contract from DARPA, the Department of Defense’s research and development arm.

The term “Geiger-mode” was used loosely inside defense-technology circles as a nod to Geiger counters, which register individual particles of radiation one at a time. DARPA was mostly interested in employing Geiger-mode lidar on airborne platforms—planes, jets, helicopters—to identify and classify faraway objects, or to map the topography of enormous regions.

Princeton Lightwave’s indium phosphide APDs were interesting for military applications because they enabled lidar systems that detected longer wavelengths of light, from 1400 to 1600 nanometers, compared to traditional APDs, which detect wavelengths below 950 nanometers. In addition to higher sensitivity, greater range, and other defensive benefits, longer-wavelength lidar is safer for the eyes, because wavelengths beyond 1400 nanometers are less likely to damage the retina.

Soon, other pockets of the defense community started to take note of what Princeton Lightwave was doing. “By 2010, as we worked on that development with DARPA, I started getting contacted by all of these folks in the defense community, saying, ‘Hey, by the way, we’re really excited you guys are developing this technology, and we’re willing to pay good money for it,’” remembers Itzler.

And for a time, they did. But it didn’t take long for Itzler and his team to realize that there were only so many military applications that would utilize Geiger-mode lidar, and that it might make sense to start looking for bigger markets. So in the mid-2010s, Princeton Lightwave assembled a small skunkworks team to start exploring the possibility of combining in-house laser technology with single-photon APDs to create a ground-based lidar system of their own.

Laser Focused

When Sam Wilton, a recent Penn State Engineering grad, joined the nascent lidar team in 2014, they were still a long way from developing a working lidar prototype. “It just hadn’t really been done before,” says Wilton.

He got his first introduction to single-photon detection when his engineering manager, Mark Entwistle, took him aside to whiteboard the theory behind Geiger-mode. “Mark [Entwistle] told me, ‘We have a spare camera and a laser sitting in the lab somewhere.’ So I spent about a month putting together the first lidar prototype.” 

Wilton began tinkering with cameras and developed some software to process and visualize lidar data. In addition to reading a constant stream of material related to the technology, he also had a little green lidar handbook with certain range equations and theoretical limits of performance. This was the extent of his formal training on the technology, and not surprisingly, the trial-and-error stage of development took years.

“There are infinite combinations of lasers, sensors, signal processing algorithms, and scanners that you could use,” Wilton says. “Do you rotate the system? Do you use some kind of oscillating scanner, like a voice coil mirror? Do you use a static flash system where you flood-illuminate the scene? How does changing lens aperture size, zoom, and focus affect the data? How about increasing and decreasing laser power? How do you filter out background solar light?” 

And then there were all of the software considerations: how to process and visually represent the data. How to use software to make the lidar see farther and improve the data quality. 
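As a rough illustration of that software side, raw returns—beam angles plus round-trip times—can be turned into the 3D points of a lidar point cloud along the lines below. The field names are hypothetical and the spherical-to-Cartesian conversion is generic, not Princeton Lightwave's actual processing pipeline:

```python
import math
from dataclasses import dataclass

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # meters per second

@dataclass
class Return:
    """A single detected lidar return (hypothetical field names)."""
    azimuth_rad: float    # horizontal beam angle, radians
    elevation_rad: float  # vertical beam angle, radians
    round_trip_s: float   # time from pulse emission to detection, seconds

def to_point(ret: Return) -> tuple[float, float, float]:
    """Convert one return into an (x, y, z) point in meters."""
    r = SPEED_OF_LIGHT_M_PER_S * ret.round_trip_s / 2.0  # one-way range
    x = r * math.cos(ret.elevation_rad) * math.cos(ret.azimuth_rad)
    y = r * math.cos(ret.elevation_rad) * math.sin(ret.azimuth_rad)
    z = r * math.sin(ret.elevation_rad)
    return (x, y, z)

# A handful of fabricated returns becomes a tiny point cloud.
returns = [Return(0.01 * i, 0.0, 1.0e-6 + 5e-9 * i) for i in range(5)]
point_cloud = [to_point(ret) for ret in returns]
print(point_cloud[0])  # a point roughly 150 m straight ahead
```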

Even as Wilton, Igor Kudryashov, and other teammates were making progress on a lidar designed for automotive applications, Princeton Lightwave continued exploring other uses for its early in-house Geiger-mode lidar prototypes. After all, in 2014 the market for automotive lidar still seemed a long way off. But Itzler, Wilton, and their colleagues remained convinced that their longer-range technology would give them a competitive advantage…someday. It wasn’t until an afternoon in mid-2014 that they managed to demonstrate what they’d discovered.

Guided by Lidar 

An early Geiger-mode lidar sensor developed by Princeton Lightwave.

At the time, the larger players in the lidar space could only create accurate 3D lidar point clouds at distances up to 100 meters—not nearly enough to enable a vehicle to drive safely at highway speeds. And, until that point, all of Princeton Lightwave’s datasets had been taken with a static camera imaging at relatively short range (a 2-to-8-degree field of view, with less than 60 meters of range).

But the team was convinced that their system was actually capable of detecting objects at much greater distances, as high as 150 meters. So they loaded a prototype sensor—dubbed “PLIDAR Mk 2”—into the back of a car. Then they drove it to an open field 15 minutes from their office, pointed it in the general direction of the tree line they’d measured to be 150 meters away, and turned on the system. 

The point cloud they generated on their computers—an array of dots marking every surface the laser had bounced off—revealed a detailed, high-resolution scene of lane markers, telephone poles, trash cans, buildings, trees, pedestrians, and cars. In other words, an accurate picture of the landscape. “When we collected that data set and everybody saw that specific point cloud, I think that was the moment that really captured everyone’s imaginations,” says Wilton. “The opportunity was clearly there.”

The point cloud and lidar intensity image that proved to Princeton Lightwave engineer Sam Wilton that “the opportunity was clearly there.”

With this demonstration in hand, Princeton Lightwave’s leadership approached potential automotive customers with their Geiger-mode lidar technology. After several fits and starts, including a failed pilot for a Swedish auto supplier, Princeton Lightwave secured a contract with a global automaker, promising to deliver a Geiger-mode lidar system capable of capturing data at distances of 300 meters by 2017. They delivered on time and in full—but the field of view was too narrow, and the prototype was physically too large for widespread adoption aboard passenger vehicles. Still, this success, coupled with the rapid expansion of the automotive lidar market, proved to Itzler that it was time to go all-in on developing and selling their Geiger-mode lidar technology.

At the same time, the team started developing what they dubbed “GeigerRay” lidar—their vision for what they actually wanted to build, free of customer restrictions—using oscillating scanners that allowed them to compress the field of view, increasing the signal-to-noise ratio by focusing the laser light into a smaller region of space. As far as Wilton was concerned, their new system, which came together in a matter of months and was capable of generating a point cloud within 1/24th of a second with unparalleled resolution at distances of up to 250 meters, could go toe-to-toe (and win) against any other lidar system on the market.
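A simplified way to see why compressing the field of view helps: if the same laser energy is concentrated into a region that is narrower in each axis, the photon flux on anything inside that region rises roughly with the square of the compression factor. The toy calculation below illustrates that assumption; it is not a description of the GeigerRay design:

```python
def relative_signal(full_fov_deg: float, scanned_fov_deg: float) -> float:
    """Toy model: relative per-target signal when the same laser power
    illuminates a narrower instantaneous field of view.

    Assumes the signal scales inversely with the illuminated solid
    angle, approximated here by the square of the angular field of
    view. Purely illustrative.
    """
    return (full_fov_deg / scanned_fov_deg) ** 2

# Narrowing the illuminated field of view from 8 degrees to 2 degrees
# in each axis boosts the photon flux on a target by ~16x in this model.
print(relative_signal(full_fov_deg=8.0, scanned_fov_deg=2.0))  # 16.0
```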

Suddenly, the Princeton Lightwave team was inundated with meetings with potential customers. “I don’t even remember all of the companies that came in for demos,” Wilton says.

One of their final meetings was with the team from Argo AI, the Pittsburgh-based self-driving technology company that had previously outsourced its lidar development to other industry players. On this particular day in 2017, Wilton recalls, their system decided to act up. The laser wasn’t syncing up properly with the camera, undermining the range precision. But Argo’s executives were quick to notice the glitch, pointing out the unexpected fuzziness—the first indication to Itzler and Wilton that their counterparts really knew what they were talking about. 

After the demo, Argo’s team requested additional datasets: some for objects at longer ranges, others at shorter ranges, one with the front window obscured with butter, or water, or dirt. Roughly five months later, Princeton Lightwave’s hard work paid off when Argo officially acquired the company, and their Geiger-mode lidar technology. Now, Itzler, Wilton, and their former Princeton Lightwave colleagues would be working full-time to bring Argo Lidar, utilizing Geiger-mode lidar technology, to Argo’s self-driving systems.

The Road Ahead 

Suddenly, the Princeton Lightwave vets were no longer designing a system for a hypothetical vehicle in a garage; they were leading a several-dozen-person team building lidar systems for a real self-driving fleet, gathering real driving data in multiple cities across the country. Scaling wasn’t without its challenges, but in becoming a part of Argo, they had the chance to see their technology, developed night and day for more than four years, potentially reach a mass audience. “To go from putting a bunch of scrap components together in a garage to looking out on the team at Argo and seeing this orchestra of things happening, it’s really amazing,” says Wilton.

Since the acquisition, the Argo Lidar team has continued to build upon Princeton Lightwave’s technological foundation to create the first Geiger-mode lidar system to be used on a self-driving vehicle. Argo Lidar offers long-range detection—up to 400 meters for objects of 10% reflectivity. It also boasts gapless imaging, which provides space-filling illumination of everything in the scene to avoid missing any information about objects within range of the sensor. This produces a richer image that helps to better define the outline of any given object for precise classification. Watch this video to see Argo Lidar’s photorealistic quality:

For Itzler, who first considered the potential of single-photon detectors for telecom quantum cryptography, and has weathered massive industry shifts before, it feels like a full-circle moment. “I experienced the hyper-growth and temporary collapse of one industry—fiber optic telecom—which eventually stabilized to become a critical tech business, and that was an incredible experience. We’re just starting to see what Geiger-mode lidar will do for autonomous driving. Similarly, there are lots of unknowns, but I’m excited about the road ahead.”
