© 2019 SourceMedia. All rights reserved.

Optimizing “Zero Dark Thirty”


January 16, 2013 – On the big screen, a black helicopter glides toward a terrorist compound, stirring up dust and signaling the suspense and action to come.

From a visual effects perspective, that helicopter, the lighting and every particle of dust and sand in every frame all trace back to data points in software. And for the massive data loads behind the helicopter and other effects scenes in “Zero Dark Thirty,” visual effects house Image Engine turned to some novel processing and scaling options.

“Anytime you’re trying to simulate dust or water or fire, it hits not only the performance layer but the capacity layer, and both quite hard. It means you’re accumulating terabytes within days because you’re simulating millions or billions of particles every minute,” says Gino Del Rosario, head of technology at Vancouver-based Image Engine. “In the chopper simulations in [“Zero Dark Thirty”], for instance, it’s a lot of things that ... fit in the movie and you don’t think twice like you would for something like a giant robot or space ship. But these are things that blend into the background that take painstaking care and data.”
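The data accumulation Del Rosario describes can be sketched with back-of-envelope arithmetic. The particle counts, bytes per particle, and shot length below are illustrative assumptions, not Image Engine's actual figures; the point is how quickly a cached simulation reaches terabyte scale:

```python
# Rough estimate of how fast a particle-simulation cache grows.
# All figures are illustrative assumptions, not production numbers.

BYTES_PER_PARTICLE = 48            # e.g. position, velocity, age stored as floats
PARTICLES_PER_FRAME = 500_000_000  # a "hundreds of millions of particles" dust sim
FRAMES_PER_SECOND = 24
SHOT_SECONDS = 10                  # one ten-second shot

frames = FRAMES_PER_SECOND * SHOT_SECONDS
bytes_total = frames * PARTICLES_PER_FRAME * BYTES_PER_PARTICLE
terabytes = bytes_total / 1e12

print(f"{frames} cached frames ≈ {terabytes:.1f} TB")  # → 240 cached frames ≈ 5.8 TB
```

At that rate, a handful of simulation iterations on a single shot accumulates terabytes within days, which is the capacity pressure the quote points to.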

Chronicling the manhunt for Osama bin Laden, the most notorious terrorist of his era, “Zero Dark Thirty” currently sits atop the U.S. box office. It was nominated in multiple categories at Sunday’s Golden Globes and is up for several Academy Awards in February, including Best Picture.

But accolades aside, film companies, and the entertainment industry in general, are hitting a data wall just like any other enterprise. One route has been to stack more and more compute power into a purpose-built data center, as the firm behind the “Lord of the Rings” films and James Cameron’s epic “Avatar” did a few years ago.

Ron Bianchini Jr., CEO and co-founder of Pittsburgh-based Avere Systems, which has a number of visual effects and imaging clients, says the industry’s nimble approach to new data management methods stems from the limits its workloads place on long-term planning.

“These media guys are out in front because of the volumes of data and processing,” Bianchini says.

Rather than perpetually add NAS heads and more expensive Fibre Channel disks to its modest data center of about nine server racks -- two-thirds of which holds condensed data -- Image Engine decided about a year ago to optimize its network attached storage. Without an optimization tool in place, Image Engine had to “work backward” on the storage side, gauging the speed of frames and the requirements of shots to estimate its storage and provisioning needs. After a few weeks of demoing, the feature film effects company went full bore with a pair of Avere Systems’ 3500 Edge filers. The model handles up to 9 terabytes apiece and 50-to-1 transaction offloading, tracked via a real-time analytics interface that Del Rosario called a “life-saver” for performance monitoring and adaptation.
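The 50-to-1 offloading figure can be illustrated with a toy cache model: an edge filer sits in front of the core NAS and absorbs repeated reads, so only a fraction of client operations ever reach the backend. The class below is a hypothetical sketch, not Avere's implementation; the cache size and workload are invented to show how a hot working set yields a 50-to-1 ratio when render nodes re-read the same blocks:

```python
# Toy model of an edge filer caching reads in front of a core NAS.
# Illustrative only -- not how the Avere 3500 actually works internally.
from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()   # block id -> data, in LRU order
        self.client_reads = 0        # ops served to render nodes
        self.backend_reads = 0       # ops that had to hit the core NAS

    def read(self, block_id):
        self.client_reads += 1
        if block_id in self.cache:
            self.cache.move_to_end(block_id)   # cache hit: refresh LRU position
        else:
            self.backend_reads += 1            # cache miss: fetch from backend
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False) # evict least recently used block
            self.cache[block_id] = object()

    def offload_ratio(self):
        return self.client_reads / max(self.backend_reads, 1)

# Render nodes making 50 passes over the same 1,000-block working set:
cache = EdgeCache(capacity=1000)
for _ in range(50):
    for block in range(1000):
        cache.read(block)

print(f"offload: {cache.offload_ratio():.0f}-to-1")  # → offload: 50-to-1
```

When the working set fits in the edge tier, each block is fetched from the backend once and served from cache thereafter, which is what lets a small filer pair shield a much slower core NAS.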

Shots alone take up gigabytes and sequences terabytes, and as the data loads from “Zero Dark Thirty” and two other feature films in production at Image Engine mounted during 2012, Del Rosario brought in five more Avere filer clusters to bulk up rendering. He says he ruled out syncing a cloud to the company’s nearby server farm because of the uncertainty of data loads and related costs. The NAS optimization route eases some of the scheduling and volume uncertainties of visual effects projects that make it “difficult to plan for your needs, storage-wise.”

“It started with a few boxes to overcome the limitations of our existing NAS deployment, and it was fantastic in how we were able to scale for our performance needs,” he says.
