SIGGRAPH has striven to foster creativity and advancement in graphics throughout its history. Over the years, the quality of graphics has progressed to the point where visual effects, animation, and filmmaking are as much a work of science as a work of art.

Producers and directors rely on a talented technical team to augment their work, and a significant amount of time goes to pipeline management and scripting. Achieving the desired effect on screen often demands more technical time and effort than the creative work itself. This puts pressure on the creative team, forcing directors to weigh whether a costly scene change is worth it.

Technical support will always be an essential part of the creation process. That said, rendering software, 3D engines, and other content creation tools are releasing features that streamline the production pipeline, leaving more time for creative decisions.

A longtime nemesis of the creative process is rendering. A project can sit on hold for days while a video renders. To combat this, Conductor Technologies - a cloud rendering solution spun out from visual effects company Atomic Fiction - launched at SIGGRAPH. In addition to offering data management and storage, the cloud renderer can scale a render farm from hundreds of computers to thousands, dramatically decreasing render time.

Conductor Technologies was not the only company previewing new rendering tools at SIGGRAPH. Pixar previewed RenderMan 22, which features live rendering - a more interactive editing workflow that reduces the need for repeated full renders. NVIDIA also previewed OptiX 5.0, a ray-tracing engine that, running on the company's deskside AI workstation, gives content creators the rendering performance of 150 CPU servers.

Increased rendering speeds are not as exciting as new creation tools or technology, but rendering is a huge obstacle in VR content production. High-quality visual effects in a single film scene can take over 30 hours to render, and VR films take even longer because more of the scene must be rendered.

Faster rendering matters most for pre-rendered, non-interactive experiences like films. Interactive VR experiences, by contrast, need real-time rendering with low latency, which relies largely on high-powered CPUs and GPUs. AMD and NVIDIA keep increasing the capabilities and speeds of their GPUs, but VR experiences will always be limited by the maximum quality the GPU in use can deliver. Some headsets, like StarVR, can already demand more than the most advanced GPUs support. As the quality of VR hardware and experiences advances, processors will need to keep pace.

The new Timeline and Cinemachine workflow in Unity. (Image by Unity)

Content Creation
Rendering may be one of the largest hurdles for creatives, but new content creation tools are just as important.

Unity 2017.1 shipped in July, but the company used SIGGRAPH to draw more attention to its new cinematic features. With Timeline and Cinemachine, video editors and directors can now use Unity to produce films with little to no C# scripting. Long used primarily by game developers, Unity has evolved to appeal to industries beyond gaming. Animators and video producers have little experience working with C#, and learning the technology behind a tool eats up time. Now creatives can focus more on creating than on perfecting their coding skills.

AI is also becoming a new tool in the creative's toolkit. NVIDIA led the discussion with products, sessions, and research advancing AI in the production workflow.

NVIDIA's OptiX 5.0 is more than just a speedy renderer. The engine's ray-tracing capabilities aid design and character creation, leaving more time to perfect the content, and it adds an AI-backed denoiser. NVIDIA is also researching the use of AI to animate human faces: in partnership with Remedy Entertainment, the team animated actors' faces using only video and animation data. The process produced a result in five minutes, which the company says can cut animation time by 80% on large-scale projects.

While saving time on any single task is ideal, these gains realistically will not shorten the total length of a film project. Instead, reduced render times, less coding and scripting, and new AI tools will free creatives to produce higher-quality films, with a larger focus on storytelling and more realistic visual effects.


Alexis Macklin is an Analyst with Greenlight Insights covering emerging technology trends at the intersection of VR, AR, and the entertainment industry. Follow her: @Alexis_Macklin.