New Camera System Creates 3D Images From a Kilometer Away

Posted on April 4, 2013

A new camera system can create 3D images from up to a kilometer away. To get 3D information, scientists bounce a laser beam off an object and measure how long it takes the light to travel back to a detector. This technique, called time-of-flight (ToF), is already used in navigation systems for autonomous vehicles. A research team led by Gerald Buller, a professor at Heriot-Watt University in Edinburgh, Scotland, has developed a ToF imaging system that can gather high-resolution, 3D information about objects from up to a kilometer away.
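The ToF principle described above reduces to a simple relation: the one-way distance is the speed of light times the round-trip time, divided by two. A minimal illustrative sketch (the numbers below are hypothetical examples, not measurements from the paper):

```python
# Time-of-flight ranging: recover distance from a photon's round-trip time.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters for a measured round-trip flight time."""
    return C * round_trip_seconds / 2.0

# A target roughly 1 km away returns photons after about 6.67 microseconds:
print(tof_distance(6.671e-6))  # approximately 1000 m
```

This also shows why long-range ToF imaging demands very precise timing: at the speed of light, a one-meter depth difference changes the round-trip time by only about 6.7 nanoseconds.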

The new system works by sweeping a low-power infrared laser beam rapidly over an object. It then records, pixel-by-pixel, the round-trip flight time of the photons in the beam as they bounce off the object and arrive back at the source. The left-hand panels in the image above show close-up photos taken with a standard camera. The center images are 3D images of these targets taken by the scanner from 325 meters away. The data on the right is a color-coded map showing the number of photons that bounced off the targets and returned to the detector, with black indicating a low number of photons. The researchers note that human skin does not show up well using the scanner. The mannequin's face includes depth information, but the person's face does not.
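The pixel-by-pixel process described above can be sketched as follows. This is a hypothetical illustration, not the researchers' actual pipeline: each pixel's round-trip time becomes a depth estimate, the returned photon count becomes the intensity map shown in the right-hand panels, and pixels that return too few photons (such as human skin, which reflects the infrared beam poorly) yield no depth estimate:

```python
# Hypothetical sketch: turn per-pixel flight times and photon counts into
# a depth map and a photon-count map, as in the figure's center and right panels.
C = 299_792_458.0  # speed of light in m/s

def build_maps(flight_times, photon_counts, min_photons=5):
    """flight_times and photon_counts are 2D lists indexed [row][col].
    Returns (depth_map, count_map). Pixels with fewer than min_photons
    returned photons get no depth estimate (None), mimicking how the
    scanner fails to capture depth for human skin."""
    depth_map = []
    for row_t, row_n in zip(flight_times, photon_counts):
        depth_map.append([
            C * t / 2.0 if n >= min_photons else None
            for t, n in zip(row_t, row_n)
        ])
    return depth_map, photon_counts
```

For example, a pixel with a 6.67-microsecond round trip and 10 returned photons maps to a depth near 1000 m, while a neighboring pixel with only 2 returned photons is left blank.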

Dr. Aongus McCarthy, research fellow at Heriot-Watt University, said in the announcement, "Our approach gives a low-power route to the depth imaging of ordinary, small targets at very long range. Whilst it is possible that other depth-ranging techniques will match or out-perform some characteristics of these measurements, this single-photon counting approach gives a unique trade-off between depth resolution, range, data-acquisition time, and laser-power levels."

The 3D images below, taken in daylight from 910 meters away, show two of the Optics Express authors. Each standard photograph shows a close-up view of what the scanner sees. The middle-left panels show 3D images with slightly more depth detail than the right-hand panels. The researchers say this is because the detector spent more time collecting the returning photons for the images on the left than for those on the right.

The research was published in Optics Express.
