New Eyes for Computer Vision

STEX25 Startup:
January 20, 2023 - December 31, 2024

Ubicept's single-photon detection software provides strikingly good images in difficult lighting conditions where conventional cameras fail.

By: Eric Bender

The global quest for autonomous vehicles faces one huge barrier: These vehicles depend on cameras that just don't work well in difficult lighting conditions. Ubicept, a STEX25 firm, takes on that challenge with a radical new imaging technique called single-photon detection.

Operating at extremely high speeds, single-photon imaging systems detect individual light particles directly and can circumvent the limitations of conventional cameras, says Sebastian Bauer, Ubicept co-founder and chief executive officer.

Single-photon detection offers the potential to vastly improve current imaging systems for vehicles, security systems, and other uses. Down the road, the technology promises dramatic new capabilities, such as the ability to see around corners, Bauer says.

“This is a paradigm shift from how detectors and imaging systems work today,” declares Tristan Swedish, Ubicept’s co-founder and chief technology officer. “We're at this huge shift in the way imaging works.”

A decade ago, single-photon detection hardware was one-of-a-kind, extremely expensive lab equipment. Today, such hardware is embedded in iPhones, and it's rapidly going down in price and up in resolution.

If your company cares about visual perception, our technology has the potential to be really disruptive

Founded in 2021, Ubicept came out of stealth mode at the 2023 Consumer Electronics Show, presenting videos with startling improvements over those taken by conventional cameras. The startup is building alliances with automotive suppliers and other partners. “If your company cares about visual perception, our technology has the potential to be really disruptive,” Swedish says.

Following each photon

You can think of conventional imaging sensors as an array of tiny buckets that collect raindrops symbolizing photons, Bauer suggests. “Over time, a certain volume of water accumulates in each bucket, and that volume is read out by the electronics,” he says. “When it's super-bright, many photons fall into the bucket. When it's not very bright, very few photons accumulate. And this system works well in conventional daylight conditions.”

But under extremely bright conditions, so many photon raindrops fall into each bucket that the bucket overflows and all it shows is white. If the environment is too dark, the bucket doesn't accumulate enough light to read out. And if an object is moving too quickly, its raindrops scatter into buckets across the array so the object blurs. These are fundamental shortcomings in today's cameras, says Bauer.

“But if you time-tag each individual raindrop, assigning it a time interval that can go below one nanosecond, you can process that data and correct for motion,” he says. “You can play all kinds of tricks to come up with the best image capability possible.”
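To make the bucket analogy concrete, here is a minimal sketch in Python, assuming a simple event format of (timestamp in nanoseconds, x, y) for each detected photon; the function names and the 1 ms re-binning window are illustrative assumptions, not Ubicept's actual pipeline. With a timestamp attached to every photon, one long, blur-prone exposure can be re-sliced after the fact into many short sub-frames that freeze motion and can later be aligned and merged:

    import numpy as np

    # Each detected photon is an event: (timestamp_ns, x, y).
    # A conventional sensor effectively sums all events per pixel over one
    # long exposure; with timestamps, the events can be re-binned afterward.

    def accumulate(events, shape, t_start_ns, t_end_ns):
        """Sum photon counts per pixel within a chosen time window."""
        frame = np.zeros(shape, dtype=np.uint32)
        for t, x, y in events:
            if t_start_ns <= t < t_end_ns:
                frame[y, x] += 1
        return frame

    def rebin(events, shape, exposure_ns, window_ns):
        """Split one long exposure into many short sub-frames. Short windows
        freeze motion; the sub-frames can later be aligned and merged to
        recover both sharpness and brightness."""
        n = int(exposure_ns // window_ns)
        return [accumulate(events, shape, i * window_ns, (i + 1) * window_ns)
                for i in range(n)]

    # Example: a 33 ms "bucket" exposure re-sliced into 1 ms sub-frames.
    # events = load_photon_events(...)   # hypothetical loader
    # subframes = rebin(events, (480, 640), 33_000_000, 1_000_000)

The same event stream could just as well be re-binned with longer windows for dark scenes or shorter ones for bright, fast-moving subjects, which is the flexibility the raindrop analogy points to.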

Ubicept builds on research performed more than a decade ago in the lab of Ramesh Raskar, MIT associate professor of media arts and sciences. Raskar, his then-postdoc Andreas Velten and their colleagues built a camera that could see light at a trillion frames per second.

At that temporal resolution, the camera could display a laser pulse traveling through a Coke bottle, says Swedish. Even more surprisingly, the researchers showed that this extremely high-speed imaging could see objects around corners, by picking up photons reflected from the objects.
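The physics behind both feats is photon timing: light travels roughly 0.3 meters per nanosecond, so the arrival time of a single reflected photon reveals how far it traveled. A back-of-the-envelope sketch (the numbers and function name here are illustrative, not taken from the MIT experiments):

    C_M_PER_NS = 0.299792458  # speed of light, in meters per nanosecond

    def round_trip_distance_m(arrival_time_ns):
        """A photon emitted at t = 0 and detected at arrival_time_ns has
        traveled arrival_time_ns * c; for a simple bounce off a surface and
        back, the surface sits at half that path length."""
        return arrival_time_ns * C_M_PER_NS / 2.0

    # A photon returning about 6.7 ns after the laser pulse leaves implies a
    # surface roughly 1 meter away (a 2-meter round trip).
    print(round_trip_distance_m(6.7))  # ~1.0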

These stunning results brought significant research funding from the Defense Advanced Research Projects Agency.

“Now we have arrays of these pixels that look like conventional image sensors,” Swedish says. “They can be produced at scale, with largely the same technology, at foundries that produce image sensors today, with cost curves that match those of conventional image sensors.”

Reimagining imaging

Bauer and Swedish both worked on single-photon detection, Bauer as a postdoctoral researcher in the Velten lab at the University of Wisconsin–Madison and Swedish as a PhD student in Raskar's MIT group. When the iPhone 12 appeared, they decided that the technology was ready for commercialization. “Our single-photon detection algorithms were getting good enough, and the hardware was getting to the right price points,” Swedish explains.

Everything that helps in perceiving your environment better makes the process easier, and enables us to come up with safer, more reliable vehicles

Ubicept raised pre-seed financing in 2021 and expects to raise its first seed round this year. Based in Boston, it currently employs eight.

In the early months while the startup remained in stealth mode, most of its partners were struggling with autonomous driving or advanced driver-assistance systems for cars, trucks, helicopters, and other vehicles that operate day and night in all weather conditions.

“Visual perception in these uncontrolled environments is really, really challenging, and operating reliably in all these conditions turns out to be a huge pain point for our partners,” says Bauer. “Automotive cameras are surprisingly bad.”

At the Consumer Electronics Show, Ubicept's videos dramatically revealed how well its system performed compared to a conventional automotive camera. “For instance, we captured a video where we imaged pedestrians in the dark who were standing in front of a bus stop,” he says. “The automotive camera failed to sense these pedestrians in the dark, whereas we could see them clearly.”

“That's a huge safety issue,” he emphasizes. “Everything that helps in perceiving your environment better makes the process easier, and enables us to come up with safer, more reliable vehicles.”

Ubicept has developed alliances with leading developers of single-photon detectors. The firm is creating evaluation kits for partners that couple its software with reference detection hardware that employs off-the-shelf components.

Importantly, Ubicept's software can directly process each photon gathered by the hardware. “With our technology, we are able to resolve at a certain level today,” Swedish says. “But with a software update, we can do even better in the future without replacing the actual physical hardware.”

We are super excited to lead this paradigm change in imaging

The firm is focusing on putting its software on as many sensor platforms as possible, as those devices are packaged into more compact form factors and scaled up for production. Ubicept can develop enhancements “at the speed of software, which is much faster than hardware,” says Swedish.

“Right now, we're the only ones doing this kind of processing with single-photon sensors,” Bauer says. “But obviously, many companies will join that. Companies that make image signal processing chips for automotive cameras, for example, will swap over from conventional sensors to these new kinds of sensors.”

“We are super excited to lead this paradigm change in imaging,” he says. “We're looking forward to working with many more partners and other stakeholders that join us in creating and riding this wave.”