A new kind of camera promises better visuals for self-driving cars, drones and other robots

This image was captured by a new camera that can focus on different objects. The normal-light version shows what the human eye sees. The red-and-blue version offers clues to distance: red objects are closer, blue ones are farther and mid-range ones appear purple.

With any luck, the future will bring more self-driving cars and flying drones to deliver pizza and other goodies. For those robots to get around safely, though, they need to both see their surroundings and understand what they’re seeing. A new kind of camera developed by engineers in California may help them do just that. It sees more than what meets our eyes.
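The photo caption’s distance color-coding can be sketched as a simple blend from red (near) to blue (far). This is a hypothetical illustration only; the article doesn’t describe how the camera’s depth images are actually rendered:

```python
def depth_to_color(depth):
    """Map a normalized depth (0.0 = nearest, 1.0 = farthest) to an RGB color.

    Near objects come out red, far objects blue, and mid-range objects a
    purple mix of the two -- matching the caption's color scheme.
    """
    depth = min(max(depth, 0.0), 1.0)   # clamp to the valid range
    red = round(255 * (1.0 - depth))    # strongest for near objects
    blue = round(255 * depth)           # strongest for far objects
    return (red, 0, blue)

print(depth_to_color(0.0))   # nearest: pure red (255, 0, 0)
print(depth_to_color(1.0))   # farthest: pure blue (0, 0, 255)
print(depth_to_color(0.5))   # mid-range: purple (128, 0, 128)
```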
The new camera combines two powerful traits. First, it takes exceptionally wide images. Second, it collects data about all the light bouncing around the scene. Then, an onboard computer uses those data to quickly analyze what the camera sees. It can calculate the distance to something in the picture, for example. Or it can refocus a specific spot within the image.
Such calculations would help self-driving cars or drones better recognize what’s around them. What kinds of things? These might include other vehicles, obstacles, intersections and pedestrians. The technology could be used to build cameras that help their host vehicle make faster decisions — and use less power — than do the cameras on drones and vehicles now. A car might then use those data to navigate more safely.
Self-driving or self-flying vehicles “have to make decisions quickly,” notes Donald Dansereau. He’s an electrical engineer at Stanford University, in Palo Alto, Calif. Along with scientists there and at the University of California San Diego, he helped build the new camera. The team presented the camera at a conference in July.
The big picture
As with other cameras, this one uses lenses. Lenses are transparent, so they let light through. And they’re curved, so they can bend the rays of incoming light. For example, the lens in the human eye focuses light on sensitive receptor cells at the back of the eye.
The new camera uses a spherical lens. It’s about the size of a wild cherry and made of two concentric shells of glass. Behind it sits a grid of 200,000 microlenses. Each of these is smaller than the width of the finest hair.
Here’s how they work together.

The spherical lens allows the camera to capture light across a wide field of view. This is what can be seen through a lens without moving it. If you could see everything that surrounds you at once, you would have a 360-degree field of view — a full circle. A person’s eye has a field of view of about 55 degrees, or some one-sixth of a full circle. (To see this, cover one eye and look straight ahead. What you can see without moving your eye or head is your eye’s field of view.)
The new camera is mounted on a rotating arm. This gives it a field of view of 140 degrees, well over a third of a full circle. That means it can see nearly everything in front of it.
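The fractions mentioned above are just each field of view divided by the 360 degrees of a full circle, as this quick sketch shows:

```python
def fov_fraction(degrees):
    """Return the fraction of a full 360-degree circle a field of view covers."""
    return degrees / 360.0

print(round(fov_fraction(55), 2))    # human eye: about 0.15, roughly one-sixth
print(round(fov_fraction(140), 2))   # new camera: about 0.39 of a full circle
print(round(fov_fraction(360), 2))   # seeing everything around you at once: 1.0
```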
To capture information about a scene, the innovative camera uses a technology called light field photography. The term “light field” refers to all of the light in a scene. Light reaches a camera lens only after it bounces off one or more objects in the scene. To imagine this, think of light particles, called photons, as pinballs that race through a pinball machine (bouncing off of obstacles in the machine). When a photon reaches a microlens, a sensor behind the lens detects it and records both where the light hit and the direction it came from.
Because there are so many microlenses in the grid, the camera can capture detailed data about the light. The onboard computer can turn those data into information about where things are, and how they’re moving — similar to how the brain might work as its host walks around a room.
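One classic way light-field data yield distance is parallax: the same object appears slightly shifted in neighboring microlens views, and simple geometry turns that shift into depth. The sketch below is a toy illustration with made-up numbers, not the Stanford team’s actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class LightSample:
    """One light-field measurement: where light hit, and from what direction."""
    x: float       # position on the sensor grid (mm)
    y: float
    dir_x: float   # incoming direction (unitless slope components)
    dir_y: float

def estimate_depth(baseline_mm, focal_mm, disparity_px, pixel_mm):
    """Toy depth estimate from the parallax between two microlens views.

    Uses the standard similar-triangles relation
    depth = baseline * focal_length / disparity.
    All numbers here are invented for illustration; the real camera's
    geometry and processing are far more involved.
    """
    disparity_mm = disparity_px * pixel_mm
    return baseline_mm * focal_mm / disparity_mm

# An object seen with 2 pixels of parallax between microlenses 0.5 mm apart:
depth = estimate_depth(baseline_mm=0.5, focal_mm=3.0, disparity_px=2, pixel_mm=0.005)
print(f"{depth:.0f} mm away")   # 150 mm
```

The key idea is that a bigger shift between views means a closer object, which is how the grid of 200,000 microlenses can encode distance for everything in the frame at once.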
The camera can use that information in many ways. A self-driving car, for example, could use it to “see” nearby cars, and avoid them. It could focus on one specific car (like one in front) or another (like one coming toward it).