Maurice Patel on trends in CGI

Around the time of SIGGRAPH 2009, we investigated the link between the papers presented at the renowned computer graphics conference and the various shifts in focus of the CGI industry. We asked Maurice Patel, Autodesk’s Senior Industry Marketing Manager for Media & Entertainment, to help.

A noted authority on media and entertainment trends, Maurice Patel is a frequent public speaker. He is regularly called upon to share his expertise on topics ranging from digital cinema, digital intermediates and lifelike digital environments to the future of visual effects and Bollywood’s growing footprint. Maurice has keynoted at FICCI FRAMES in Mumbai and has presented at countless industry events, including the Hollywood Post Alliance (HPA) Technology Retreat, Digital Hollywood, the Entertainment Technology Alliance Conference, the Society of Motion Picture and Television Engineers (SMPTE) Conference and the National Association of Broadcasters (NAB) Convention. He is responsible for Autodesk’s presence across the film, television and games segments.

Fired By Design: Can you think of any films, TV shows, games or commercial VFX that broke new ground in these areas after certain topics became popular subjects of discussion?


Maurice Patel: There is, in general, a strong correlation between SIGGRAPH research papers and trends in the industry. Although not all papers are equally influential, many inspire areas of active research and development within the industry, whether by software developers like Autodesk or by our customers. We therefore see a significant correlation between the research being published and the appearance of these technologies in entertainment projects. Almost every visual effects blockbuster movie or next-gen AAA game title involves a significant amount of original research and development – some of which is published as papers, and some of which is inspired by the work of others. A few examples include:

• Work in realistic rendering and the development of advanced rendering technologies related to landmark visual effects movies like Jurassic Park and Titanic.
• Image-based rendering techniques, such as the work of Paul Debevec, and projects like The Matrix films – notably the ‘Burly Brawl’ scene, in which Neo is attacked by hundreds of Agent Smiths.
• High dynamic range (HDR) rendering, such as that used in the latest games from Valve (see the tone-mapping sketch after this list).
• Warping/morphing research, which the industry adopted widely soon after publication, as seen in Michael Jackson’s 1991 ‘Black or White’ morphing video.
• Research results behind groundbreaking animation such as Toy Story (rendering), Surf’s Up (fluids) and Monsters, Inc. (hair).
• Game developers have long focused on maximizing GPU acceleration. Most titles for the Xbox 360 (launched November 2005) and PS3 (launched November 2006) use GPU-accelerated rendering. Half-Life 2 and Metal Gear Solid 4 (MGS4) were extremely innovative in terms of GPU shading: Half-Life 2 pushed normal-map-driven shaders to a new level, and MGS4 featured a lot of innovative GPU-driven facial detail.
• Spore is a well-known example of arbitrary creatures with locomotion.
• Performance-driven character animation is gradually traversing the uncanny valley, with The Curious Case of Benjamin Button being the most successful project to date.
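
To make the HDR bullet above concrete: the heart of HDR rendering is computing scene radiance in an unbounded range, then tone-mapping it into the display’s limited range. Below is a minimal Python sketch of the classic global Reinhard operator – a standard textbook operator rather than any particular engine’s implementation; the function name and `exposure` parameter are illustrative assumptions.

```python
import numpy as np

def reinhard_tonemap(hdr, exposure=1.0):
    """Map unbounded HDR radiance into [0, 1) for display.

    A minimal global Reinhard operator: L / (1 + L), per channel.
    (Illustrative sketch; 'exposure' is an assumed scaling knob.)
    """
    scaled = hdr * exposure
    return scaled / (1.0 + scaled)

# Radiance values spanning five orders of magnitude all land in a
# displayable range, while their relative ordering is preserved.
hdr_pixels = np.array([0.01, 0.5, 2.0, 40.0, 1000.0])
print(reinhard_tonemap(hdr_pixels))  # values in [0, 1)
```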

FBD: Can you think of any advances in off-the-shelf software/systems that followed (perhaps a few years after) any of them?

MP: There are many examples both of research from off-the-shelf software companies (from Autodesk to Microsoft) that has been published, and of published research that has led to off-the-shelf software. Often, software companies like Autodesk work extensively on turning research, which can be highly theoretical, into practical applications – e.g. features or applications that allow broader creative control, are easier to use, and run in reasonable amounts of time with reasonable resources. Some examples are discussed below.

• Subdivision Surfaces: Pixar’s Geri’s Game was one of the first movies to use subdivision surface technology. Autodesk Maya software supports subdivision surfaces – technology built on some of these papers, including Jos Stam’s SIGGRAPH 1998 paper (Stam is now a Principal Scientist at Autodesk). Subdivision surfaces are now widely used in movies, and almost every major 3D package supports them in one form or another.
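
The principle behind subdivision is easy to illustrate in one dimension. The sketch below uses Chaikin corner cutting – the curve analogue of surface subdivision – where repeated refine-and-average passes drive a coarse control polygon toward a smooth limit curve. This is a generic illustration of the idea, not Maya’s implementation; Catmull-Clark, which Stam’s paper analyses, applies the same principle to quad meshes.

```python
import numpy as np

def chaikin(points, iterations=3):
    """Chaikin corner cutting: a 1-D analogue of surface subdivision.

    Each pass replaces every edge (p, q) with two points at its
    1/4 and 3/4 marks; the polyline converges to a smooth curve
    (a quadratic B-spline), just as subdivision surfaces converge
    to a smooth limit surface.
    """
    pts = np.asarray(points, dtype=float)
    for _ in range(iterations):
        p, q = pts[:-1], pts[1:]
        refined = np.empty((2 * len(p), pts.shape[1]))
        refined[0::2] = 0.75 * p + 0.25 * q   # 1/4 mark of each edge
        refined[1::2] = 0.25 * p + 0.75 * q   # 3/4 mark of each edge
        pts = refined
    return pts

# A coarse 4-point control polygon refines toward a smooth curve:
# 4 -> 6 -> 10 -> 18 points over three passes.
control = [(0, 0), (1, 2), (3, 2), (4, 0)]
print(len(chaikin(control)))  # 18
```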

• Fluids: Autodesk Maya’s Fluids implementation is directly based on Jos Stam’s 1999 ‘Stable Fluids’ paper and on the ‘Visual Simulation of Smoke’ paper by Fedkiw, Stam and Jensen. Ron Fedkiw is a professor at Stanford and consults at ILM – much of his fluid work has been used in shots created by ILM. The game Plasma Pong is also directly based on the ‘Stable Fluids’ paper.
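
Part of why ‘Stable Fluids’ was so influential is that its core steps are compact and unconditionally stable. Below is a minimal Python sketch of the implicit diffusion step in the spirit of Stam’s published solver; the grid size, iteration count and Jacobi-style relaxation are illustrative choices, and boundary handling is omitted.

```python
import numpy as np

def diffuse(field, diff, dt, iters=20):
    """Implicit diffusion step in the spirit of Stam's 'Stable Fluids'.

    Rather than stepping diffusion forward in time (which explodes for
    large dt), solve the backward system
        x = field + a * (sum of x's four neighbours - 4 * x)
    with simple Jacobi relaxation. The result is stable for any time
    step or diffusion rate. Boundaries are left fixed for brevity.
    """
    n = field.shape[0] - 2          # interior cells inside a 1-cell border
    a = dt * diff * n * n
    x = field.copy()
    for _ in range(iters):
        x[1:-1, 1:-1] = (field[1:-1, 1:-1] + a * (
            x[:-2, 1:-1] + x[2:, 1:-1] +
            x[1:-1, :-2] + x[1:-1, 2:])) / (1 + 4 * a)
    return x

# A spike of smoke density spreads out smoothly and never oscillates
# or blows up, even with a large time step.
grid = np.zeros((34, 34))
grid[17, 17] = 100.0
print(diffuse(grid, diff=0.1, dt=1.0).max())  # far below 100, still finite
```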

• Simulating Reality: We’re going to see increasingly realistic characters, cloth, fluids, environments, etc. For example, Maya Nucleus technology, which includes Maya nParticles and Maya nCloth, already allows artists to direct and control particles, cloth and other material simulations quickly and in entirely new ways.
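
As a generic taste of the kind of constraint-based simulation involved (a toy sketch, not the Nucleus solver), the following hangs a chain of points using Verlet integration plus iterative distance-constraint projection – the core loop behind many cloth and hair solvers; all names and constants are illustrative.

```python
def step_chain(pos, prev, rest=0.1, dt=0.016, iters=10):
    """One tick of a hanging chain: Verlet integration followed by
    iterative distance-constraint projection. Point 0 is pinned at
    the origin; pos and prev are lists of [x, y], mutated in place.
    """
    for i in range(1, len(pos)):                 # integrate (gravity only)
        x, y = pos[i]
        vx, vy = x - prev[i][0], y - prev[i][1]
        prev[i] = [x, y]
        pos[i] = [x + vx, y + vy - 9.8 * dt * dt]
    for _ in range(iters):                       # project rest lengths
        for i in range(len(pos) - 1):
            (ax, ay), (bx, by) = pos[i], pos[i + 1]
            dx, dy = bx - ax, by - ay
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = 0.5 * (dist - rest) / dist
            pos[i] = [ax + dx * corr, ay + dy * corr]
            pos[i + 1] = [bx - dx * corr, by - dy * corr]
        pos[0] = [0.0, 0.0]                      # re-pin the anchor

# A 20-point strand pinned at the origin swings down under gravity
# while every segment keeps (approximately) its rest length.
pos = [[i * 0.1, 0.0] for i in range(20)]
prev = [p[:] for p in pos]
for _ in range(200):
    step_chain(pos, prev)
print(pos[-1])  # free end has swung below its starting height
```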

• Animation and Artificial Intelligence: Visionary storytellers pushing the boundaries in games are creating increasingly believable characters that live in dynamic worlds. To do this they are using Autodesk middleware like Kynapse and HumanIK, which have been used in over 100 AAA titles, including FIFA 09 and Warhammer Online: Age of Reckoning. Autodesk middleware helps game characters move and react in more believable ways. Autodesk Kynapse gives characters real-time perception of their environment, so that they can react dynamically as it changes. Autodesk HumanIK is a full-body inverse kinematics system that modifies animation in real time, procedurally responding to game environments.
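
HumanIK itself is proprietary, but the core idea of inverse kinematics – solving for joint angles so a limb reaches a target, instead of playing back canned animation – can be sketched analytically for a two-bone chain such as shoulder-elbow-wrist. The law-of-cosines solver below is a generic textbook sketch, not Autodesk’s algorithm.

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in the plane: find joint angles so a limb
    with segment lengths l1 and l2, rooted at the origin, reaches the
    target (tx, ty). Returns (root_angle, bend_angle) in radians.

    Generic law-of-cosines solver; targets outside the reachable
    annulus are clamped to its boundary.
    """
    d = math.hypot(tx, ty)
    d = max(abs(l1 - l2), min(l1 + l2, d)) or 1e-9   # clamp, avoid /0
    cos_bend = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)
    bend = math.pi - math.acos(max(-1.0, min(1.0, cos_bend)))
    cos_inner = (l1 * l1 + d * d - l2 * l2) / (2 * l1 * d)
    root = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return root, bend

# Re-solved every frame, this is how a hand can stay planted on a
# moving object while the rest of the body animates.
print(two_bone_ik(1.0, 1.0, 1.2, 0.8))
```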

• Facial Animation: Autodesk Face Robot is a complete facial animation system. It streamlines the production of facial realism and detail by blending procedural algorithms. The development of Face Robot was driven by artist input.

• Autodesk Flame visual effects system: Several features of our flagship Flame system followed published computer graphics research – e.g. morphing, particle systems, subdivision surfaces, HDR imaging and real-time performance using shaders.

FBD: Can you see a probable course for implementing them in VFX/games, or possibly think (or dream) of a way in which they could become part of a software application in the future?

MP: What is likely to become part of future software applications:

(a) Automatic Character Motion: Animating characters individually from scratch is a lot of work. The idea is to specify the behavior of a character at a high level rather than through keyframing. In this manner one can more easily create large crowds and speed up the animation process; key characters can then be further refined through keyframing to achieve a specific effect. This can save production houses a lot of work and money.
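
A minimal sketch of what specifying behavior ‘at a high level’ can mean in practice: each agent below runs the same seek-and-separate rule, so an entire crowd moves from a single behavior specification instead of per-character keyframes. This is a toy steering model in the spirit of Reynolds-style behaviors, not any production crowd system; the function name and constants are illustrative.

```python
import random

def step_crowd(agents, goal, dt=0.1, avoid_radius=1.0):
    """Advance a crowd one tick from a single high-level rule per agent:
    steer toward the shared goal, push away from close neighbours.

    agents is a list of [x, y] positions, mutated in place. No agent
    is individually keyframed; the behaviour is specified once.
    """
    for i, (x, y) in enumerate(agents):
        vx, vy = goal[0] - x, goal[1] - y        # seek the goal
        for j, (ox, oy) in enumerate(agents):    # separate from neighbours
            # Manhattan distance check, for simplicity.
            if i != j and abs(x - ox) + abs(y - oy) < avoid_radius:
                vx += (x - ox) * 2.0
                vy += (y - oy) * 2.0
        agents[i][0] += vx * dt
        agents[i][1] += vy * dt

# Fifty agents scattered at random converge on the goal while spreading
# apart -- one behaviour specification instead of fifty keyframed paths.
crowd = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(50)]
for _ in range(100):
    step_crowd(crowd, goal=(10.0, 10.0))
print(crowd[0])  # near (10, 10)
```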

(b) Templating: Animation technology will revolve more around templating. Complexity is the force that fragments 3D, and it can only be proceduralized and simplified by a templating framework. This covers everything: shading, modeling, animation and effects. Artists need interfaces to move their data into templates that computers can understand and repurpose. For example, with motion capture there would be a framework for turning marker data into something meaningful for a character. And then there is a need for powerful, flexible frameworks that make these templates responsive to the range and statistical patterns of what people imagine their characters doing.

(c) Shaders: In the future, shaders will be used to improve rendering quality, physics engines and related work on rigid- and soft-body animation.

(d) Intuitive Interfaces: These will challenge the mouse-and-keyboard setup in the future. In 2003, the Sony EyeToy used computer vision and gesture recognition, letting players interact with games using motion, color detection and sound. There were also camera-based tools for Xbox, and Guitar Hero, first published in 2005, introduced a guitar-shaped peripheral. However, the 2006 launch of the Wii was the tipping point for intuitive interfaces, with a new group of users becoming gamers. Intuitive interfaces like Microsoft’s Project Natal, Sony’s PS3 motion controller and the Wii MotionPlus will continue this trend. In movies we are seeing groundbreaking developments in virtual cinematography, where intuitive interfaces coupled with real-time software applications like Autodesk MotionBuilder are being used on virtual movie sets at soundstages for projects like Avatar, Tintin and A Christmas Carol.

(e) Connectedness: It will change how we work and how we are entertained. There is a generational shift coming, with young people today growing up with more connected lives than ever before. One of the first benefits of connectedness is mass collaboration. Early examples include flash mobs, the worldwide effort to identify the SARS pathogen, Foldit, and I Love Bees – the Jane McGonigal game in which players across the US coordinated to solve a shared problem. Mobile game designers are beginning to grapple with the opportunities of augmented reality, where the phone’s GPS determines real-world location, providing a backdrop for a new type of gaming experience. More recently, people have been leveraging social networks like Facebook and Twitter to report on the situation in Iran. How connectedness transforms or creates a new platform for entertainment remains to be seen.

(f) More 3D reconstruction from 2D images.

(g) Increased integration of computer vision research.
