Rendering the Sea: Exocortex Technology, Part 2

Quite often, simple things that occur in the real world can become incredibly difficult technical problems for visual effects artists attempting to digitally recreate or adapt those occurrences for film. One such gremlin is liquid simulation.

Previously, we looked at how Gradient Effects used the Exocortex Technologies Slipstream tool to create the memory pool in Harry Potter and the Deathly Hallows: Part 2, a sequence that required mimicking the effect of ink dispersing in water -- in large volume and high detail. Slipstream allowed Gradient Effects to complete the sequence with far less processing power, greater efficiency and a much shorter turnaround than other solutions, thanks to real-time rendering and a specialized production pipeline.

But while rendering ink in water is impressive, the power and majesty of the deep blue sea is something else entirely.

Remaking a Classic: “Moby Dick”
In the summer of 2009, Canadian VFX supervisor Will Garrett was involved in preproduction on Germany’s Tele Muenchen Group TV remake of “Moby Dick.” Filming was scheduled to start in early fall on location in Canada and Malta with William Hurt, Gillian Anderson and Ethan Hawke in the lead roles.

The large production presented Garrett with a number of VFX challenges, including finding a cost-effective way to replace William Hurt’s left leg with a peg for specific shots (which was achieved, by the way, by using a color-neutral gray sock with reflective markers and a lot of skilled compositing). He also needed to incorporate both live-action footage and a computer-generated whale in many of the ocean sequences.

Because these sequences, particularly those featuring the whale, would be critical for maintaining the movie’s emotional hold on the audience, it was important to choose the right technology while staying within the made-for-TV budget. Garrett selected Exocortex for its innovative technology, experience with liquid simulation and ability to quickly adapt its software to the specific needs of the “Moby Dick” production.

Exocortex’s Maelstrom Technology
While Gradient Effects used Exocortex’s Slipstream technology for Harry Potter and the Deathly Hallows: Part 2, Garrett chose Exocortex’s Maelstrom technology for his project.

Exocortex’s Maelstrom is similar to Slipstream in that both were designed to overcome the speed and scalability limitations of existing simulation approaches. However, Maelstrom uses a multi-patent-pending adaptive tetrahedra simulator core, a first in the VFX industry. The development of this unique adaptive tetrahedra simulator is the result of a three-year collaboration between Exocortex and Christopher Batty, a renowned researcher at The University of British Columbia and Columbia University.

Moby Dick Productions was the first production company to experience the benefits of this new approach to liquid simulation. “Garrett was fun to work with,” says Ben Houston, the founder of Exocortex. “He has that rare and very effective combination of great interpersonal skills, an eye for detail and the ability to leverage new technology in demanding situations.”

Building a Custom Solution for “Moby Dick”
The first stage of the 10-month project involved integrating the core simulator technology into Autodesk Softimage to allow rapid iteration and the ability to rerun simulations to add further detail.

Next, the team applied Batty’s research innovations to achieve high-quality interaction between the liquid simulator and the intricate meshes that represented the whale and the boats. A spray, foam and bubble system was also created to handle breaking-water situations.

Finally, Exocortex pushed the scalability of the simulator by automating and optimizing its adaptive nature.

The project was a joint learning experience: Both Exocortex and the “Moby Dick” team learned how best to use the simulator and improve the software to meet the needs of the production. The end result was a uniquely scalable and adaptive tetrahedral-based liquid simulation system tightly integrated into an Autodesk Softimage workflow.

The extensive feature set of the simulator allowed its use on a wide variety of water shots, including whale chases and underwater sequences. In the end, over 150 shots used the Exocortex liquid simulator. “Exocortex’s custom software significantly enhanced what we were able to achieve on a made-for-TV production budget,” says Garrett.


Photo via Gate Film Produktion

Liquid Magic: Exocortex Technology, Part 1

Longtime fans of the Harry Potter film franchise are familiar with the series’ ample visual effects (VFX), and this summer’s box office hit Harry Potter and the Deathly Hallows: Part 2 -- the final adventure in the series -- ranks as one of the biggest VFX-driven Hollywood productions of 2011. Among the many emotive sequences in the Harry Potter film franchise is the memory pool sequence created by Gradient Effects. Here is a behind-the-scenes look at how Gradient Effects and Exocortex Technologies worked together to pull off this visually stunning special effect.

A Large-scale, High-fidelity Sequence

Harry Potter’s memory pool sequences required an ink-in-water look where the ink would behave realistically in a large volume, then quickly assume dynamic shapes that could seamlessly transition to live-action footage. This complexity presented Gradient Effects with a significant challenge. Not only did it require fluid simulations of unprecedented detail and scope, but it also needed a specific inky look that most traditional simulators have difficulty producing, even in restricted domains. And because the sequence’s timing and transitions had already been decided, there was no artistic leeway if the technology ran into limitations.

On the technical side, artists had to be able to see, in real time, the results of increasing the simulation resolution to more than half a billion points for final output. It was clear to Olcun Tan, co-founder and head of research and development at Gradient Effects, that traditional fluid simulators were not up to the task.

While searching for new simulation solutions, Tan discovered Exocortex’s Slipstream technology. “Before committing to any technology, we did extensive research by testing all well-known technologies for the type of work required on Harry Potter,” says Tan. “After seeing the first tests from Exocortex, I was instantly convinced that we had found our solution in Exocortex’s technology.”

Exocortex’s Slipstream Technology

Ben Houston, Exocortex’s founder, has long known about the limitations of traditional simulation methods, having led the Flood fluid simulator project at Frantic Films. Exocortex was founded with the belief that the traditional limitations could be overcome, giving artists more creative freedom while also controlling costs.

In the pursuit of these goals, Exocortex’s team had been researching and developing its proprietary simulation technology since 2008. The result is Slipstream, which allows rapid, accurate previews -- and realistic, unbounded liquid simulations -- while cutting costs for VFX studios.

A key element of Slipstream technology is the elimination of the simulation bounding box. The technology simulates freely in any environment, delivering results at whatever level of detail is needed.

Slipstream is stable and predictable, so an artist’s real-time preview of a fluid simulation element is the same as the final beauty pass. Studios can handle tremendous amounts of simulation without excessive investment in hardware and personnel because Slipstream is efficient when it comes to both memory and computation.

Designing a Modern Agile Pipeline

The project’s compressed timeline -- together with the scope of research, development and production involved -- required the development of an agile pipeline capable of tying together software, creative talent and hardware. Tan started by combining Exocortex’s simulator technology with Gradient Effects’ proprietary technology running in Autodesk Maya.

Under Tan’s direction, Exocortex modified its simulator technology to be fully deterministic. Feeding in the same inputs to the simulator would produce exactly the same results every time. This addressed two very important elements of the production pipeline. It allowed artists to design the simulations in real time while assuring all their details would be realized as previewed when the simulators were rerun for the final beauty pass. It also allowed a single simulation to be split across multiple machines with the results automatically synchronized. To minimize the amount of data, camera-based culling and level-of-detail were employed.
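As a rough illustration of that determinism principle, the toy Python sketch below keys every random choice to a seed and a slice ID, so rerunning any slice on any machine reproduces the preview exactly. The particle model, the slicing scheme and all names here are illustrative assumptions rather than Exocortex’s or Gradient Effects’ actual code, and the camera-based culling and level-of-detail steps are omitted.

```python
# Toy sketch only: a deterministic, per-slice simulation step.
# Everything here is an illustrative assumption, not the production simulator.
import random

def advect_slice(seed, slice_id, n_particles, n_frames):
    """Simulate one spatial slice; all randomness derives from (seed, slice_id)."""
    rng = random.Random(seed * 10007 + slice_id)    # per-slice, repeatable RNG
    particles = [[slice_id + rng.random(), 0.0] for _ in range(n_particles)]
    frames = []
    for _ in range(n_frames):
        for p in particles:
            p[1] += 0.1 + 0.01 * rng.random()       # deterministic "turbulence"
        frames.append([tuple(p) for p in particles])
    return frames

# Splitting the domain into slices across machines and rerunning yields
# identical results, so a real-time preview matches the final beauty pass.
preview = [advect_slice(42, s, 4, 3) for s in range(2)]
beauty = [advect_slice(42, s, 4, 3) for s in range(2)]
assert preview == beauty
```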

Many-core Distributed Simulations

Using one machine per simulation proved to be inadequate because of the unprecedented detail and scope required for each shot, combined with the need to provide a workable turnaround time and the desire to minimize compositing tricks.

Because the simulations could be distributed across multiple machines, between 10 and 20 machines could be assigned to each simulation. This enabled an astounding 480 cores and 960 GB of memory (20 machines at 24 cores and 48 GB of RAM apiece) to be simultaneously dedicated to a single fluid simulation.

The approach proved to be an effective time-saver with respect to simulation times, but the raw intensity of the computational power presented its own challenges.

A Storage System for High Compute Intensity

Gradient Effects’ simulation and rendering farm consisted of machines with 24 cores and 48 GB of RAM each. They churned through the simulation much faster than the simulation data could be transferred across the network to the storage servers.

At the same time, writing 1 GB of data per frame per computation node created a bottleneck of its own. To solve the problem, an innovative distributed storage system was created using the simulation nodes themselves.

The speed advantages of local storage greatly outweighed the minimal CPU cost each simulation node paid to help manage the distributed storage system. Many-core processors already offered significant CPU power, a small portion of which could be assigned to managing the distributed store. Even assigning one core to this task would reduce a machine’s simulation performance by less than 5 percent (one of 24 cores is roughly 4 percent) -- a far smaller penalty than waiting for large data transfers to a storage server to complete.

To make the distributed store more robust, the team deployed a background data-mirroring process. This simulation slicing allowed for 300 to 500 million elements per frame and a turnaround time of just a few hours.
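The sketch below gives a rough feel for the write-locally, mirror-in-the-background pattern described above. It is a hedged approximation: the class, paths, file naming and mirroring target are assumptions for illustration, not Gradient Effects’ actual storage code.

```python
# Illustrative sketch of node-local frame storage with background mirroring.
# Paths, naming and the mirror target are assumptions, not the studio's pipeline.
import pathlib
import queue
import shutil
import threading

class LocalFrameStore:
    def __init__(self, local_dir, mirror_dir):
        self.local_dir = pathlib.Path(local_dir)
        self.mirror_dir = pathlib.Path(mirror_dir)   # e.g. a peer node's share
        self.local_dir.mkdir(parents=True, exist_ok=True)
        self.mirror_dir.mkdir(parents=True, exist_ok=True)
        self._pending = queue.Queue()
        # A single background worker (roughly "one core") copies data off the
        # hot path, so simulation threads never block on network transfers.
        threading.Thread(target=self._mirror_loop, daemon=True).start()

    def write_frame(self, frame: int, data: bytes):
        path = self.local_dir / f"frame_{frame:05d}.bin"
        path.write_bytes(data)                       # fast local write
        self._pending.put(path)                      # mirror asynchronously

    def _mirror_loop(self):
        while True:
            path = self._pending.get()
            shutil.copy2(path, self.mirror_dir / path.name)  # redundancy copy
            self._pending.task_done()

# Usage sketch: the simulation loop calls write_frame() and keeps computing.
# store = LocalFrameStore("/tmp/sim/local", "/tmp/sim/mirror")
# store.write_frame(1, b"\x00" * 1024)
```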

“Distributing the simulation, combined with the speed of the simulation engine, allowed us to simulate hundreds of millions of particles, whereas in that same time other tools could simulate only thousands,” says Dave Rindner, senior effects artist. “The biggest scenes required, at most, a few minutes per frame for simulation with particle counts of half a billion. Smaller scenes were only seconds per frame.”

Check out Part 2 of our Exocortex coverage later this month.

The Devil in the Details: Technical Artist Julian Love on Diablo III

Lead technical artist Julian Love is a nine-year Blizzard veteran. He’s been working on Diablo III almost as long as fans have been waiting for the game. We spoke to Love about the long-gestating project, Blizzard’s approach to making games and the role of the technical artist in development.

DIG: What has your role been in the development of Diablo III?

Julian Love: I lead the technical art team, which wears many hats and has a kind of broad effect across the entire game. We collaborate with pretty much every department, sometimes in a support role. For instance, we support the background artists in terms of being able to develop the backgrounds, and the character artists and the animators. We also work very collaboratively with the design team and programming departments in order to create the skills and skill system.

DIG: Can you describe the job of the technical artist?

J.L.: We’re a collection of artists that also have a better grasp on the technical side. We’re not afraid of programming or programmers. We understand how the engines work and can guide the programmers in terms of helping them understand what the artists need.

So in that sense, it’s a bridging role between art and programming. But in our team, it’s also a bridging role between art and design. We can see both sides of things -- the programming side and the art side. And sometimes that’s the view that you need to find out how to actually get where you want to go.

DIG: Is usability frequently part of the discussion? Does game design come into the equation for you?

J.L.: Sure. Combat is what we work on, and most of the effects in the game. Most of the role of effects in all games is to spell out the mechanics of the gameplay. That’s the first rule.

We’re not just there to make cool explosions and zombie barf. We do that. But the area that the zombie barf paints when it hits the ground is significant. It’s not just any zombie barf. Sure, I get to pick what kind of chunks I want to go in there. But, you know, it’s got to show, “This is what’s going to happen and it is in this big of an area.” And we have to serve the design first in order for it to feel right and be successful. So that isn’t going to happen unless we’re really big fans of game design. And that we’re really tied in and connected with that department to a high degree.

DIG: It must be tricky to make sense of all the stuff happening onscreen when you have multiple players and monsters all casting spells. Do you have a chart that says what kind of spell gets what kind of color?

J.L.: It’s actually a little more driven by concept. And the concept, in turn, does tend to allow us to have the kind of separation you’re talking about.

So when we sit down and design a class, we do talk at a really high level about what kind of themes we want to evoke with that class. And inevitably, there are certain color schemes that come out of that. And so we do end up with a little bit of a mental chart of colors that we would rather not have associated with a certain class. It’s kind of tough when you’re making runes -- like a 150-some-odd skills per class -- to avoid every color. But we do tend to exercise a certain amount of avoidance.

DIG: So the Witch Doctor is going to tend toward green or yellow.

J.L.: But because of his themes more so than anything else. He’s going to be throwing bugs and frogs and stuff like that at you. So that does kind of indicate a certain color thematic -- like acid cloud and those kinds of things. He’s got this big voodoo vibe, and so fire, poison and those kinds of things really feel like they belong strongly to him.

And when it came to something like the wizard, we are looking at it from a different angle -- this sort of cosmic magic theme. We wanted to make arcane feel ancient and powerful, so we kind of pick cosmic themes. And that led us to a lot of yellow and purple combinations. And blue lightning-type combinations. That allows us to get some separation both in terms of color and shape.

DIG: Is there a kind of flow between the teams where lessons are shared? How does that information exchange work?

J.L.: We look at games outside of Blizzard and lift things and we riff off of that stuff. It’s all good content. And we look at games inside Blizzard: World of Warcraft lifted a lot of stuff from Diablo in its design. And then they went and changed it and learned a bunch of lessons about it. And we look at that: “OK, what did you learn?” We borrow a little bit from that. There definitely is a lot of cross-team collaboration and discussion that goes on. And a lot of lessons learned that we can take advantage of.

DIG: I’m curious how that information flows through the company. Is there a meeting?

J.L.: There are those kinds of meetings. Within one discipline, the technical art leads and the technical artists meet on a regular basis to talk about what each of the teams are doing.

But I’ll tell you the way that it happens more often than that. We’re all huge fans of our own games. We all play the crap out of our own games. I play WoW. We all play it. And we all check out the cool stuff that they did. And we all get excited: “Oh, wow, they made this better. This feels cool. And I like how they treated the art in this area.” We get excited about it, and that stuff comes over to our team and vice versa.

We also have newsletters that go out once a month or once a week where we highlight what each of the teams are working on so we can kind of stay on top of new developments and even trade little secrets about technology or approaches.

Strategic Insight: The Rise of Paradox Interactive

It seems unbelievable that a company that specializes in hardcore, complex historical grand strategy games with titles like Crusader Kings, Victoria and Hearts of Iron can be successful in this day and age -- let alone be an industry innovator -- but that is exactly what Paradox Interactive is. Ahead of the curve in digital distribution, niche marketing and fan collaboration, the company is set to continue growing on its own terms.

We had the opportunity to talk to producer Shams Jorjani about the company, its history, its philosophy and its bright future.

DIG: Can you give me a brief history of Paradox Interactive?

Shams Jorjani: Paradox started as a small dev studio in 1999 and Europa Universalis I was our first game. We worked with other publishers but soon realized that if we wanted to grow properly, we’d need to handle a bigger part of the operation ourselves.

So when we finally had money enough to hire one more person, the choice was between a marketing person and another programmer. The developers were, of course, furious that we’d even consider anyone besides a programmer: “Good games sell themselves” was their reasoning. Paradox hired the marketing person, and from that day the company was changed forever -- for the better.

But starting a publishing business is no small feat. We had to start with signing anything we could afford just to keep moving. All those games were not, shall we say, the finest moments in PC gaming (see Stalin vs. Martians), but they allowed us to continue working and sign better titles.

Once in a while we found a diamond in the rough that took off, which in turn allowed us to sign even better games. Being small means flexibility is a key element to success. We decided early on to focus on digital sales. We’ve always been ahead of the curve there compared to other publishers who still rely on brick-and-mortar retail.

Today we’re in a situation where we only sign on games that we really believe in and think we can do something fantastic with. While our development team has grown considerably in size, the publishing side has grown tenfold in the number of games, expansions and DLC we release annually.

DIG: Over the years, you’ve specialized in historical PC-strategy games. Was that always the goal?

S.J.: Our motto has been, “Strategy is our game.” We started off focusing on strategy games because we were good at it. As time progressed, we realized it wasn’t strategy games per se; rather, we became good at making, showing, marketing and playing games that others would shy away from. Simply put, we try to make smart games for smart gamers, so maybe “Niche is our game” is a more apt motto for us now.

DIG: Paradox strategy games are lovingly intricate, reminiscent of the old Avalon Hill board games. How has catering to that niche shaped the company? Has it been a limitation or an advantage?

S.J.: It’s mostly been an advantage. It’s definitely helped us establish a loyal fan base and allowed us to grow. It has also taught us to really get to know our audience and understand what makes them tick. Sticking close to the gamers is a very important part of how we work. But as we branch out, it’s been a challenge for us since there’s not always a huge overlap between our core game audience and other genres.

On the other hand, that also allows us to release a Hearts of Iron game on the same day as Modern Warfare -- not a luxury many other publishers enjoy. Our gamers don’t necessarily care about headshots and kill streaks compared to supply lines, diplomacy and historical accuracy.

DIG: Embracing complexity seems to fly in the face of conventional game design wisdom, which encourages increasing intuitiveness. Does that conventional wisdom play a part at all in the Paradox philosophy? Will we ever see a simple strategy game from you?

S.J.: I think it’s important to differentiate between complexity and accessibility. A game can be incredibly complex, but still be accessible. It doesn’t take long to learn chess, but mastering it takes a lifetime.

Chess, however, has a fairly simple set of rules. Our games have complex rules and a tremendous amount of depth. Furthermore, they’re fairly punishing when you make mistakes -- not exactly beginner-friendly.

We’ve made tremendous strides when it comes to improving the user interface and player feedback, but we still have a long way to go. This is our Holy Grail. I don’t think you’ll see a simple strategy game from us (say, an RTS), but I think you’ll see a strategy game from us that is simple to learn.

DIG: How important is fan feedback? And how do you work with the modding community for your games?

S.J.: It is extremely important. Fans are the lifeblood of gaming companies -- getting too detached from their needs is very dangerous. On the other hand, it’s dangerous to listen too closely; you’ll run the risk of losing objectivity. As developers, we’re tasked with driving innovation and staying fresh while still appeasing the existing fan base.

Modding is also very important for us; it acts as a source of inspiration and keeps us alert. It’s also a great way to keep the community active and involved. We’ve even signed on talented modders and turned their work into full-fledged commercial products.

DIG: In terms of the current climate of change in the industry, how important is it to both publish and develop games?

S.J.: Today’s gaming climate renders the need for a publisher less pressing. More and more independent developers can release games on their own. But as more do it themselves, it becomes harder and harder to push through all the white noise. And that’s where the publishers become relevant again, if you want to achieve true success.

Paradox releases roughly 20 to 25 games every year. Of these, 15 percent are internally produced games. The lion’s share are externally developed titles that Paradox helps develop, market, distribute and sell. Paradox has increased the size of its publishing team considerably over the last few years, which means that when we find a new game we think can do well, we can put considerable resources into helping the developers hit their marks and make the great game that was once just a cool idea on a piece of paper.

We really try to work as a partnership. This means that we put a large amount of trust in each other’s work. The developer gets to make their game in peace and quiet, and they trust us to do the best marketing and sales job we can do on the publishing side. The producers are there to help developers avoid problems, offer input, remove threats and offer an outside view on the game to make it better. Our marketing/PR side focuses a lot on social media, and the sales department sells!

Sword of the Stars Screenshot: http://www.ign.com

The Development of World of Warcraft: Cataclysm

With over 11 million subscribers worldwide, Blizzard Entertainment keeps online gamers coming back for more World of Warcraft (WoW) by consistently adding to the virtual world of Azeroth. Cataclysm is the most ambitious expansion to date for the massively multiplayer online (MMO) fantasy role-playing game. While most of the attention has been focused on the new 3D facelift that the game has undergone, Cataclysm is pushing the linear aspect of interactive entertainment forward with its Hollywood-inspired, in-game cinematics.

Much has changed at Blizzard Entertainment in a short period of time. “In the early days we were a much smaller group that came through school studying film,” says Jeff Chamberlain, project director at Blizzard Entertainment. “As we’ve grown, we’ve brought in a lot of the Hollywood talent, and they’ve been able to bring their experiences from those studios. It’s created a melting pot of Hollywood studios and video game developers, which has been very beneficial for us.”

One such person who migrated from Hollywood to the game world is Terran Gregory, associate director of Cataclysm. Gregory says that film’s 100-year history, including its more recent venture into computer animation, has given the craft a large head start over video game storytelling. But the team at Blizzard has learned a lot from those years of filmmaking. “I would say gaming has probably learned about storytelling from Hollywood, and maybe Hollywood has learned a little bit about technology from gaming,” says Gregory.

Creating Deathwing
Blizzard’s biggest endeavor yet on the cinematic front was introducing Deathwing to the WoW faithful and establishing what has become a very important character in the game universe. Marc Messenger, director for cinematics at Blizzard, says the role of pre-rendered cinematics goes beyond just watching the mini movies.

“We want the player to remember the cinematics and how they inform the in-game destruction to give a truly epic sense of what it would be like to stand in the presence of a thousand-foot wave or see fire streak across the sky,” explains Messenger.

The process for bringing cinematics to life starts with brainstorming a vision and idea. In a nutshell, Messenger says it’s about finding a way to succinctly convey the character of the current expansion and make it as cool as possible. “We just sit down and roll through a bunch of ideas,” says Messenger. “On Cataclysm, in the first meeting, the game team had a strong idea of what the expansion was going to be and we were able to get in sync pretty quickly.”

“Once Marc and the crew had their idea down, we functioned as a normal Hollywood animation studio,” says Chamberlain. “We storyboard everything, have a fast iteration process and then go through a series of reviews and approval processes. Once we are locked down on something we like, we start working like a normal animation studio with animation, modeling and the typical artistic departments.”

“One technology that we developed for Cataclysm was a new camera approach where we could actually get our 3D world using a motion-recorded camera so the director could film his subject in real time,” explains Chamberlain. “That added a little bit of flavor to the experience and made it more like a Hollywood project.”

“New tools have helped us make things more cinematic in the game through the use of cameras, as well as the control and manipulation in real time of actors that we can work through,” says Gregory. “It’s a different world working with the game itself, instead of just 3D. Being able to walk around the environment as if you’re on a set with the actors, and really have a feel of the space as you move them around, was important. We even had people piloting the characters around so it was like working with talent, instead of just working with objects.”

These new advances have allowed the team to improve the visual fidelity of the characters in cinematics. Improved facial animation was just one element that brought more believable characters to life in the game. And Gregory says technology is constantly evolving, which means the next round of cinematics will raise the bar even higher.

Cinematics Going Forward

The ultimate goal of Blizzard’s cinematic teams is to get the player emotionally involved with the characters they’re interacting with as the story unfolds. Technology plays a crucial role in helping the programmers, artists and effects wizards conjure more believable, higher-fidelity characters with whom gamers can connect and identify.

“It’s fascinating that we are at this place in time where we can move people emotionally through a video game,” says Chamberlain. “I don’t know if that could have been said 20 years ago at the dawn of video games. Now it seems like that line between simple gameplay and embracing a story is getting increasingly blurred. They’re becoming one and the same thing.”

The other line that’s becoming more blurred in video games is the one between cinematics and the gameplay experience.

“For in-game cinematics, there’s always the challenge of making the cinematic presentation not exclude the player,” says Gregory. “Technology plays a lot into that as we try and look for more ways to make the transition a seamless experience. With StarCraft II and Cataclysm, we’ve started to include the player’s character in the cut scenes. We’re just getting into that now, and the future looks really bright with new technologies allowing us to achieve that.”

Image Source: http://us.battle.net/wow/en/media/wallpapers/