Aguru Images Shines Light on Digital Imaging

In the world of academic technology transfer, one contact often leads to another, ultimately resulting in new discoveries that enter the marketplace.

That is the story of Aguru Images, which merged brilliant academic discoveries from universities on opposite sides of the United States — New York University (NYU) and the University of Southern California (USC).

The combined technology is creating a buzz in the computer graphics community because it can be used in so many applications. They range from making strikingly accurate digital images for motion pictures, videos and computer games to improving renderings by interior, fashion, architectural and industrial designers.

In fact, just about anyone who wants realistic lighting on everything from faces to brushed aluminum to fabrics may be able to benefit from Aguru’s technology.

The Virginia-based company now sells equipment and services and will eventually add software and a library of illumination data to its products. Appropriately, one of the meanings of Aguru in the Sanskrit language is light. Steve Gray, the chief technical officer and executive producer of Vykarian, a Shanghai-based game developer, says Aguru has made a great leap forward.

“It’s pretty remarkable what they’ve done working with NYU and USC,” says Gray, who formerly was the executive producer for Electronic Arts’ “Lord of the Rings” video games.

The Right Mix of Technologies

In the computer graphics world, getting textures right is extremely difficult, Gray points out.

“But along came Aguru, working with NYU, which has a scanner that uses a kaleidoscopic array,” says Gray. “You can stick anything on it and it basically figures out how the light, including light that is being scattered below the surface, is reflected. That’s something.”
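The trick Gray describes can be sketched in a few lines: each mirror of a tapered kaleidoscope reflects the camera's line of sight, so a single exposure effectively views the sample from several directions at once, with no moving parts. The four-sided geometry and 10-degree mirror tilt below are illustrative assumptions, not the actual NYU design.

```python
import math

def reflect(d, n):
    """Reflect a view direction d across a mirror plane with unit normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# One real camera looks straight down at the sample.
view = (0.0, 0.0, -1.0)

# Hypothetical normals for a four-sided tapered kaleidoscope, each mirror
# tilted 10 degrees inward (angles chosen for illustration only).
t = math.radians(10.0)
mirrors = [
    (math.sin(t), 0.0, math.cos(t)),
    (-math.sin(t), 0.0, math.cos(t)),
    (0.0, math.sin(t), math.cos(t)),
    (0.0, -math.sin(t), math.cos(t)),
]

# Each mirror bounce yields a distinct virtual viewing direction, so one
# photograph records the sample's reflectance from several angles at once.
virtual_views = [reflect(view, n) for n in mirrors]
```

A real device would also exploit second and third bounces, multiplying the number of virtual viewpoints captured in a single photograph.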

Next, in conjunction with USC, he notes, Aguru added a dome that is a three-dimensional scanner.

“It essentially takes pictures simultaneously from many different angles,” Gray explains. “Then, using image-based modeling techniques, you can reconstruct a 3-D model of the thing that was inside the dome. One of the first applications was scanning people’s faces, and it achieved really, really high resolution. It uses five million polygons, which is essentially more resolution than can be rendered back with film.”
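The reconstruction step Gray describes rests on triangulation: a point photographed by two or more calibrated cameras can be located by intersecting its viewing rays. A minimal sketch using the standard direct linear transform, with toy pinhole camera matrices that are stand-ins, not Aguru's calibration:

```python
import numpy as np

def triangulate(cams, pixels):
    """Recover a 3-D point from its projections in several calibrated views
    (direct linear transform, solved via SVD)."""
    rows = []
    for P, (x, y) in zip(cams, pixels):
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]                      # null vector = homogeneous 3-D point
    return X[:3] / X[3]

def project(P, X):
    """Project a 3-D point through a 3x4 camera matrix to pixel coordinates."""
    h = P @ np.append(X, 1.0)
    return h[0] / h[2], h[1] / h[2]

# Two toy cameras viewing the scene from slightly different positions.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # at the origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # shifted sideways

point = np.array([0.2, -0.1, 3.0])
pixels = [project(P1, point), project(P2, point)]
recovered = triangulate([P1, P2], pixels)
```

A dome with dozens of cameras simply adds more rows to the same linear system, which is what makes the simultaneous-capture approach robust enough for faces.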

The third component, also from USC, is called the Linear Light Source (the Aguru Scanner). It is a larger scanner that captures the reflectance of flat objects, including the different colors of shine and the surface bumpiness, among other properties.
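Captured data like this is typically stored as per-pixel maps: a diffuse color, a specular ("shine") intensity and a surface normal encoding bumpiness. A minimal sketch of how a renderer might shade one sample from such maps, using the common Blinn-Phong model as a stand-in for whatever reflectance model the scanner actually fits (the texel values below are hypothetical):

```python
import numpy as np

def shade(diffuse, specular, normal, light_dir, view_dir, shininess=32.0):
    """Shade one surface sample from captured reflectance maps
    (Blinn-Phong: Lambertian diffuse term plus a glossy highlight)."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    h = (l + v) / np.linalg.norm(l + v)           # half vector
    diff = max(np.dot(n, l), 0.0)                 # diffuse falloff
    spec = max(np.dot(n, h), 0.0) ** shininess    # specular highlight
    return diffuse * diff + specular * spec

# Hypothetical values one texel of a scanned material might hold.
color = shade(
    diffuse=np.array([0.6, 0.4, 0.3]),    # base color of the material
    specular=np.array([0.9, 0.9, 0.9]),   # strength/color of the shine
    normal=np.array([0.1, 0.0, 1.0]),     # bumpiness tilts the normal
    light_dir=np.array([0.0, 0.0, 1.0]),
    view_dir=np.array([0.0, 0.0, 1.0]),
)
```

Because the maps are measured rather than hand-painted, the same shading code reproduces the scanned material under any new lighting.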

The technologies work together well, Gray says, because the USC dome recreates the subtleties of 3-D shapes and textures for really difficult objects and materials, like the human face, while the NYU and USC scanners capture the properties of countless other materials.

It’s exciting stuff, and it’s 99.9 percent photorealistic. It makes the work cheaper and faster, and unlocks these capabilities for everyone, like small game developers and boutique effects studios.
Steve Gray

Collaborating With Researchers on Two Coasts

Aguru’s story began in 2005, when Saul Orbach was hired by ANGLE Technology Ventures, a publicly traded British firm, as its entrepreneur in residence. Back then, Orbach knew little about computer graphics, kaleidoscopic technologies, properly illuminating textured surfaces or getting the light right when scenes with real actors are meshed with virtual backgrounds.

Nor did he know that the quest for truly photorealistic digital images had long been the holy grail of the three-dimensional computer graphics industry.

In the past two years, however, he has become something of an expert in the field of virtual lighting — thanks in large part to researchers at NYU’s Courant Institute of Mathematical Sciences and USC’s Institute for Creative Technologies.

Orbach is modest about what he put together.

“I’m no scientist,” says Orbach, a serial entrepreneur. “When I started out, my job was to find technologies that we could commercialize and turn into a company. So I looked at a lot of things.”

Because of his NYU connections — Orbach earned undergraduate and M.B.A. degrees from the university — he met with Robert Fechter, associate director of the NYU Office of Industrial Liaison, who led him to the Courant Institute.

Fechter introduced him to Ken Perlin, a computer science professor, and Jeff Han, a research scientist at Courant. They showed Orbach a variety of computer graphics projects they were working on.

“One was the kaleidoscopic technology that was capturing reflectance data for textured materials,” Orbach says. “This was exciting stuff.

“In fact, it was a ‘Eureka!’ moment,” he says. “The existing methodologies for doing the same thing were very complicated. But Ken had come up with a simple, brilliant device with no moving parts that was quick and easy to operate.

“I like simple, innovative solutions to problems that no one has thought of,” he says. “This was really cool.

“Ken’s device allowed you to capture all angles of illumination. Then you could look them up from his database. His device could capture all those angles in 15 seconds. Learning about that was what started it all.”

Appropriately, Perlin was a fan of kaleidoscopes as a child.

“Wasn’t everyone?” Perlin asks. “As part of my research, I learned that the kaleidoscope was invented by Sir David Brewster in the early 1800s. It was so wildly successful that it became the symbol of science and progress. It was the iPod or computer of its time.”

Before Perlin’s device, seeing a surface from multiple points of view required either an array of cameras or mechanically moving a single camera to different locations.

“I thought, ‘Why not use a tapered kaleidoscope to do the same thing?’” Perlin recalls. “Then I talked to my colleague Jeff Han and we set about to build it.”

As part of his due diligence, Orbach says he learned everything he could about the industry and the difficulty of getting realistic lighting on computer generated images.

“Along the way, I talked to 50 people and companies who were possible users of this technology. I got a good feel for the market, what the applications were, how much and for what price,” notes Orbach.

Orbach also met with Paul Debevec, a research associate professor at USC’s Institute for Creative Technologies Graphics Lab and a friend of Perlin’s who had come up with a complementary technology.

Debevec’s Light Stage 2 process was used by Sony Pictures Imageworks to create photorealistic digital actors as part of its Academy Award-winning visual effects in “Spider-Man 2,” the Academy Award-nominated visual effects in “Superman Returns,” and most recently “Spider-Man 3.”

Debevec, who most recently led the development of a Light Stage dome that measures 26 feet in diameter, says his research began in 1999 and is funded by movie studios and digital imaging corporations. The institute’s basic research funding came from the U.S. Army.

“The goal of our institute is to foster collaboration between academic researchers and the entertainment industry to develop the next generation of simulation and virtual environments,” Debevec explains.

“With Aguru commercializing some of our technologies and pushing them further forward, that will help meet these goals,” he says. “It will allow more people to benefit from these physically based rendering and realistic model acquisition techniques. Aguru is going to take proven technologies, make them more robust, more economical and will adapt and evolve them to better meet the specific needs of the industry. And that will inspire our group to work on the next generation of these technologies.”

Making Its Marketplace Debut

Aguru Images was launched in March 2007 with an initial $1 million in financing from ANGLE. NYU licensed its technology to Aguru in exchange for equity and revenue. USC negotiated a similar deal.

By August 2007, Aguru was ready to show its products at the 2007 SIGGRAPH trade show in San Diego. (SIGGRAPH stands for Special Interest Group on Computer Graphics and Interactive Techniques.)

John Sweet, a senior licensing associate at USC’s Stevens Institute for Technology Commercialization, calls working with Aguru Images a “pleasure.”

“Saul Orbach is an experienced entrepreneur and has assembled a good group of guys,” he says. “It’s nice to collaborate with people like that who know what they are doing.”

Orbach estimates the company will have revenues of between $5 million and $7 million in 2008. And after that, who knows?

“As for future applications, we quickly came up with a short list of 15 or 16 that includes online commerce, cosmetics, catalog shopping, medical applications and even military stuff,” says Orbach.

“There is no shortage of how this technology could be used,” he muses. “The sky’s the limit.” 

This story was originally published in 2008.

To see available technologies from research institutions, visit the AUTM Innovation Marketplace.