|Cloudy with the Chance of Meatballs shows more 3-D potential.|
The exhibition has closed but the conference continues on this last day, with 3-D screenings (including some new Cloudy with a Chance of Meatballs footage) and E-Tech still attracting lots of attention, even spread out in several rooms and isolated on the third floor. The emphasis is on interactive sensory techniques. And what’s hot? “The interconnectivity between the machine and human,” asserted Manabu Sakurai, Emerging Technologies chair. “If you see the gaming industry, your console is not just a game console anymore — part of your body is the game console. And the machine understands your natural movements. And then, in an opposite way, the display, what you see is also getting closer to humans. The key word is 3-D.”
What’s been popular in E-Tech? A trampoline that’s part of the interface; “Back to the Mouth,” which uses sensors to turn your mouth odor into ammo for shooting a monster; touchable holography; and a robot that folds your shirts.
And what was hot on the exhibit floor? I previously mentioned mental images’ Iray, the interactive lighting tool for design visualization. I saw it demo’d, and it quickly and impressively rendered artifact-free images of a turntable, a car and an office space with challenging global illumination.
Rolf Herken, mental images founder, said it will be available with mental ray 3.8 and RealityServer 2.4 in November, and provided an overview of Iray. “We embarked on an effort to utilize the computing power of the GPU, and there were a lot of people who said, ‘Why don’t you just make mental ray faster?’ The problem with that is mental ray is fully programmable and the most powerful and versatile software on the planet. But this programmability doesn’t lend itself particularly well to implementation on special-purpose architecture or the GPU, because obviously someone else would program the shader and we don’t know what the result would be. So the resulting speedup might be pretty disappointing and, anyway, beyond our control. So we thought it was probably better to focus on a range of applications or use cases that are extremely relevant to probably the majority of our users. And it has to do with product design/visualization, and especially visualization where the physical correctness of the visualization matters. The resulting image has true characteristics of a photograph, and the material properties that you are seeing are actually simulated real-world materials. And the resulting image should not differ from what you would get in the end if you built the product or built the house or whatever. It’s a very, very large market and requires an algorithmic approach that is simulation rather than making a nice image with a lot of tricks.” In other words, it avoids the complicated and arbitrary parameters you would otherwise have to tune to get an image that merely looks artifact-free. “And so the goal of Iray is to give people a rendering algorithm that harnesses this enormous power of the GPU, or whatever massively parallel architecture is out there, and it works like a camera. You take a picture, and you don’t have to think about these things that you have to do in order to prevent artifacts in an image.”
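Herken’s distinction between “simulation” and “tricks” is essentially the difference between brute-force Monte Carlo light transport and hand-tuned rasterization hacks. As a rough illustration only (this is a generic sketch, not mental images code), here is the core Monte Carlo idea: the estimate has no structured artifacts, just noise that averages away as you add samples — analogous to a camera exposing longer.

```python
import math
import random

def mc_estimate(f, a, b, n, seed=0):
    """Monte Carlo estimate of the integral of f over [a, b].

    Unbiased: the expected value equals the true integral, and the
    noise shrinks roughly as 1/sqrt(n) -- no parameters to tune,
    no artifacts to hide, just more samples for a cleaner result.
    """
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Toy "light transport" integrand: cos(x) over [0, pi/2] has exact value 1.
coarse = mc_estimate(math.cos, 0.0, math.pi / 2, 100)      # noisy preview
fine = mc_estimate(math.cos, 0.0, math.pi / 2, 100_000)    # converged
```

The same principle scales from this one-dimensional toy up to full path tracing: the only “control” a user touches is how long to let the simulation run.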
Also impressive was MachStudio Pro 1.2 from StudioGPU, which is bundled with the AMD ATI FirePro V8750 graphics accelerator. This is a very artist/TD-friendly system, akin to a virtual realtime 3-D studio environment. New features include displacement mapping using hardware tessellation, stereoscopic camera support, a cartoon shader, and unlimited independent render passes and render layers. Previs company The Third Floor is now adopting MachStudio Pro 1.2 in its workflow, and StudioGPU demo’d a test the company did on Valkyrie.
3D Equalizer 4 motion tracking software from Science-D-Visions offers both forward and backward auto tracking for frame-range adjustment. Most of the code has been rewritten for Windows in Python, with a customizable GUI. It was used briefly on Avatar for lining up photogrammetry and models involving Zoe Saldana’s head.
Shotgun Software and Tweak Software demo’d the benefits of their new, seamlessly integrated alliance. Shotgun offers invaluable production tracking and RV offers complementary image and sequence viewing, so it’s a natural fit. Framestore, Dr. D, Laika, Zoic, Aardman and many more are on board as beta clients. The pairing promises web-based tracking of scheduling and notes, customizable tasks and more flexibility. “It’s the Facebook of production tracking,” suggested Don Parker, Shotgun co-founder.
|Starz's 9 is an impressive feature debut.|
Meanwhile, SIGGRAPH always offers opportunities to meet new people, and I had a chance to sit down and chat with Terry Dale, VP, operations for Starz Animation Toronto, which completed Shane Acker’s 9 and is now hard at work on Gnomeo and Juliet. Dale was one of the guest speakers at Autodesk’s semi user group reception Tuesday night.
“9 was a great learning experience for the studio,” he said. “It was a very compressed timeline — about 14 or 15 months from start to completion for us. It meant we had to be very innovative on the techniques that we used to get exactly what the director wanted to see on screen. It’s a dark film both in content and visually, which proved to be a pretty big challenge for us. A lot of the light is in the lower third of the spectrum, so there’s a lot of subtlety and that took a lot of work to get balanced out. We’re mainly a Maya shop but we do have a lot of custom developments within Maya. We used Fusion for compositing: again, we have a lot of proprietary tools that are written into Fusion for relighting so we don’t have to go to render again. There was a lot of heavy matte painting done on 9 in order to get the richness of the detail, so a lot of our artists worked with matte flats and being able to paint a whole bunch of different layers to give you the depth and realism you were looking for in those shots.
“For characters, a lot of heavily detailed character rigs were used, especially within the facial rigs. They were extremely complex in order to get that expression. We also have a pose panel tool which allows us to library any shape or animation, so it can be recalled and reused. It’s a very texture rich movie so there was a lot of detail done in the surfacing and texturing of not only the characters but also the backgrounds, sets and props. We tried very hard to seamlessly blend where matte paintings and 3D geometry would intersect and meet. Again, for the characters themselves, using a burlap texture was very adventurous because it shows any imperfection and stretching of how the UVs and surfacing is done. So a lot of attention had to be paid to the UV mapping and how it was applied to the characters so it didn’t look like a false stretching material.”
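The “pose panel” Dale describes — a way to library any shape or animation so it can be recalled and reused — is, at its heart, a named snapshot store over rig attributes. A minimal sketch of the idea, assuming entirely hypothetical names (the studio’s actual tool is a proprietary Maya integration):

```python
class PoseLibrary:
    """Toy pose library: save named snapshots of rig attribute values
    so a pose can be recalled and reapplied later. Illustrative only."""

    def __init__(self):
        self._poses = {}

    def save(self, name, attrs):
        # Copy the attribute -> value mapping so later edits to the
        # live rig don't mutate the stored pose.
        self._poses[name] = dict(attrs)

    def recall(self, name):
        # Return a copy of the stored snapshot for reapplication.
        return dict(self._poses[name])

lib = PoseLibrary()
lib.save("smile", {"jaw.rotateX": 4.0, "mouthL.translateY": 0.6})
recalled = lib.recall("smile")
```

In a production tool the recalled values would be pushed back onto the facial rig’s controls (and likely blended or keyed), but the reuse pattern is the same.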
And how is Gnomeo and Juliet coming along? “It’s probably the most adventurous project we’ve done to date. It’s much more photoreal looking than 9. A lot of garden development: plants, trees; water development as well. There are fountains everywhere. A lot of work on detail because the gnomes are small and very close to the ground, so we have to be very cognizant of how the camera moves and what details it’s close to. The surfacing of the characters is important so they look like real gnomes and not cartoony. Rigging has been a challenge because they are an odd human shape. We want to ensure that we get the performance that we really need, so there’s a lot of facial and body work.”