Friday, September 10, 2010

Portable Laser Backpack Revolutionizes 3D Mapping

A portable laser backpack for 3D mapping has been developed at the University of California, Berkeley, where it is being hailed as a breakthrough technology capable of producing fast, automatic and realistic 3D mapping of difficult interior environments.

Research leading to the development of the reconnoitering backpack was funded by the Air Force Office of Scientific Research and the Army Research Office under the guidance of program managers Dr. Jon Sjogren (AFOSR) and Dr. John Lavery (ARO).

The backpack is the first of a series of similar systems to work without being strapped to a robot or attached to a cart. Its data acquisition is also very fast, because the data is collected while the human operator walks; this is in contrast with existing systems, in which the data is painstakingly collected in a stop-and-go fashion, resulting in days or weeks of acquisition time.

Using this technology, Air Force personnel will be able to collectively view the interiors of modeled buildings and interact over a network for military tasks such as mission planning.

Under the direction of Dr. Avideh Zakhor, lead researcher and UC Berkeley professor of electrical engineering, the scientists have achieved this more portable method of mapping by way of sensors and lightweight (less than eight ounces) laser scanners.

"We have also developed novel sensor fusion algorithms that use cameras, lasers range finders and inertial measurement units to generate a textured, photo-realistic, 3D model that can operate without GPS input and that is a big challenge," said Zakhor.

There are many basic research issues to resolve before a working system can be achieved, including calibration, sensor registration and localization. Using multiple sensors facilitates the modeling process, but the data from the various sensors must be registered and precisely fused with each other to produce coherent, aligned, and textured 3D models. Localization is another technical challenge: without it, the scans from the laser scanners cannot be lined up to build the 3D point cloud, which is the first step in the modeling process.
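To make that registration step concrete, lining up two consecutive scans means estimating the rigid rotation and translation that best maps one onto the other, then accumulating the aligned points into a single cloud. The minimal 2D sketch below uses a standard SVD-based rigid alignment and assumes point correspondences are already known, something a real system must estimate itself (for example with ICP); it is illustrative only, not the researchers' method.

```python
import numpy as np

def rigid_align(src, dst):
    """Estimate rotation R and translation t such that R @ src_i + t ~= dst_i.

    src, dst: (N, 2) arrays of corresponding points. Known correspondences
    are assumed here; real pipelines must also solve the matching problem.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def build_cloud(scans):
    """Chain pairwise alignments and return all points in the first scan's frame."""
    cloud = [scans[0]]
    R_total, t_total = np.eye(2), np.zeros(2)
    for prev, curr in zip(scans, scans[1:]):
        R, t = rigid_align(curr, prev)       # align current scan to the previous one
        R_total, t_total = R_total @ R, R_total @ t + t_total
        cloud.append(curr @ R_total.T + t_total)
    return np.vstack(cloud)

# Toy usage: scan b is scan a seen from a shifted, rotated pose.
a = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
theta = 0.1
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
b = (a - np.array([0.3, 0.1])) @ R_true.T
print(build_cloud([a, b]))                   # the b rows land back on top of a
```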

"It is fair to say that embarking on such a hands-on project, to make indoor 3D modeling a matter of routine, a number of research questions of a fundamental nature came up," said Sjogren. "It is typical of the work that Prof. Zakhor has done for AFOSR/Air Force Research Laboratory over the years, that she meets these challenges head-on, and in most cases solves the problem sufficient to demonstrate a prototype system."

Sjogren noted that what is left for others is to examine the approach that was taken and to extend the techniques that were brought in to a wider context.

"We are gratified to see how technology can drive science in a domain of critical relevance to practical defense implementations," he said.

Even though they don't have all the answers yet, the scientists are looking ahead to how this technology can be used in the future: they plan to model entire buildings and to develop interactive viewers that allow users to virtually walk through a building before visiting it in person.

In the meantime, the cutting-edge technology is being successfully implemented on campus.

"We have already generated 3D models of two stories of the electrical engineering building at UC Berkeley, including the stairway and that is a first," said Zakhor.