on Wednesday, December 26, 2012
Today I came upon an excellent post about "ten upcoming technologies that may change the world" by Alvaris Falcon. Great food for thought, allowing me to catch up on all these new technologies and expand on two of them: Google Glass (augmented reality) and 3D printing (rapid prototyping), based on my exposure to them this past year.

AUGMENTED REALITY (AR)
Ten years ago the technology introduced in the film "Minority Report" set marketing experts on their ears.
Tom Cruise wearing "data gloves" in front of translucent
glass in the 2002 film "Minority Report"
"...the concept of Augmented Reality conjures up that memorable scene in Steven Spielberg’s 2002 movie Minority Report in which Tom Cruise’s character strolls through a mall while being assaulted by marketing messages of a highly personal nature. A Guinness billboard addresses him by name and tells him he could use a drink; in the Gap store, a hologram of an assistant asks him if he is enjoying his previous purchase; an American Express advert shows a giant 3D credit card embossed with his membership details. It is an unsettling depiction of an advertising nirvana, all made possible by the supposed existence of retinal scanners."
The data glove interface depicted in the photo above is a form of virtual reality (VR). VR differs from AR in that it is an entirely virtual experience with no anchor in the physical world. AR experiences, on the other hand, use computer-enhanced glass to augment or overlay reality in real time with sensory data or other useful information applied to the world around us. We are familiar with AR from this montage taken from the 1984 film "The Terminator":

"In the Terminator movies, Arnold Schwarzenegger’s character sees the world with data superimposed on his visual field—virtual captions that enhance the cyborg’s scan of a scene. In stories by the science fiction author Vernor Vinge, characters rely on electronic contact lenses, rather than smartphones or brain implants, for seamless access to information that appears right before their eyes." [Augmented Reality in a Contact Lens, by B.A. Parviz 9.30.09]
Bionic Contact Lens "A Twinkle in The Eye"
"Confusingly, both AR and Virtual Reality share key elements that allow users to experience enhanced interactions through digital and online input, and often the terms are used interchangeably: with the increasing advancements of gesture-based interfaces (think the Kinect), distinctions between Virtual Reality and AR are becoming increasingly irrelevant."  [The Next Web : "How augmented reality will change the way we live" 8.25.12]



Google Glass is hailed by TIME magazine as one of the best inventions of 2012.
"[Google] Glass is, simply put, a computer built into the frame of a pair of glasses, and it’s the device that will make augmented reality part of our daily lives. With the half-inch (1.3 cm) display, which comes into focus when you look up and to the right, users will be able to take and share photos, video-chat, check appointments and access maps and the Web. Consumers should be able to buy Google Glass by 2014."
"Sight", a short sci-fi film by Eran May-raz and Daniel Lazo (published 8/1/12), imagines a world in which Google Glass-inspired apps are everywhere, which Forbes tech writer A.W. Kosner says "makes Google Glass look tame".


Beyond personal eyewear, General Motors Corp. researchers are working on a windshield that combines lasers, infrared sensors and a camera to take what's happening on the road and enhance it, so that aging drivers with vision problems can see a little more clearly.

Another AR application for smart windshield glass is under development in the UK:

Meanwhile, here in Los Angeles, a law has just been passed banning digital LED billboards that distract drivers. The future of advertising has palled on the Sunset Strip. http://www.npr.org/templates/story/story.php?storyId=96875271 I'm not sure how well windshields augmented with sensory data will be received by consumers (or insurance companies).

Connecting this back to the entertainment industry, John C. Abell wrote an interesting article for Wired about the roots of AR in science fiction: [Augmented Reality's Path From Science Fiction to Future Fact 4.13.12]
"There can be a very thin line between fantasy and science. Fantasy drives science. Set aside Geordi’s visor and today’s augmented reality glasses for a moment. Instead, look at some original Star Trek episodes to see handheld, long-range wireless communication devices and voice-input and omnipresent and seemingly omnipotent computing half a century before these nonexistent technologies became things we take for granted." 
I'm looking forward to a relevant upcoming event, "The Science of Fiction," presented by the 5D Institute in association with the USC School of Cinematic Arts (April 2013), which will expand on this concept (and others) as it applies to the art of production design for film, books and games.

"Science fiction prototyping, design fiction, and world building are all established narrative devices which engage the power of Fiction to grab elusive or as yet unrevealed Science and disrupt current thinking to provoke new discoveries." 
3D PRINTING (RAPID PROTOTYPING)


The rapid prototyping technology most familiar to me before the advent of 3D printing, used in the creation of vehicles and other props for films and theme parks, is the subtractive process known as Computer Numerical Control (CNC) milling. In the entertainment industry, companies such as Trans FX (TFX) used this method to create a number of the Batmobiles currently on tour, for example.
Batman vehicles on tour
I started hearing about 3D printing a few years back, but the first 3D printed object I ever saw was a small chair created for use in a scale model of a theatrical production. It was part of a student design showcase for Theater Stage Design (Kennedy Center American College Theater Festival) held here in Los Angeles in 2010. Soon after, a successful jawbone replacement built with a 3D printer made the news.
Artificial jawbone created by a 3D printer 

'Skyfall' filmmakers dropped some of their $150 million-plus budget on 3D-printed scale replicas of Bond's classic Aston Martin DB5.
"Voxeljet, a 3D printing company in Germany, created three 1:3 scale models of the rare DB5. Each model was made from 18 separate components that were assembled much like a real car. The massive VX4000 printer could have cranked out a whole car, but the parts method created models with doors and hoods that could open and close.






The completed models received the famous DB5 chrome paint job and bullet hole details as finishing touches during final assembly at Propshop Modelmakers in the U.K. One of the models was sacrificed to the stunt gods during filming. Another was sold by Christie's for almost $100,000. "  Read this article by Amanda Kooser on CNET.





3D printing for environmental miniatures is currently being developed further by VFX pioneer Doug Trumbull known for his work on 2001: A Space Odyssey. Trumbull was in Los Angeles this year to accept the Visual Effects Society's Georges Méliès Award.  I had the privilege of meeting him when he also attended and spoke at the 4th meeting of the Virtual Production Committee in February 2012.
Doug Trumbull at his studio in the Berkshire Hills of Massachusetts
Virtual production is essentially digital compositing of 3D elements in camera (real & virtual, or virtual & virtual) in real time. Doug Trumbull has been combining elements in camera since the '70s with his Magicam System.

Trumbull is a proponent of filming optical effects and prefers using real miniatures rather than CGI for compositing. He is currently expanding on his own virtual production technique at his state-of-the-art studio in Massachusetts. 
"We can make miniatures look absolutely real, that isn’t a variable. I recently looked at Blade Runner, Close Encounters and 2001 in my screening room on Blu-ray, and I could see everything that was in the original prints. Sometimes it is even better, because the grain and slight weave of physical projection is gone. All these years later the miniatures hold up and are not the slightest bit obsolete due to CGI. Miniatures are used so rarely, they are practically a lost art, though Hugo shows how successfully they can still be employed."
"Most directors aren’t comfortable in a virtual world, something I found out long ago with Magicam. Many actors, having learned their craft on a near-empty theater stage, are more comfortable. And I found that showing actors the composite on stage thrills them. “Finally, I don’t have to fake it.” If you don’t have something to show them, you wind up like 300, where everybody’s faking it because they have no solid idea about the virtual environment! My next step – something I haven’t done before except in brief experiments – is to replace the computer-generated, real-time virtual set with a miniature, which I find much more photo-realistic and believable than anything generated in a computer. Then I use Nuke and other comp techniques as needed, though I’m aiming for every shot to have at least 80 percent physical reality, rather than settling for the algorithm of the month. My tastes have always run to more organic approaches to visual effects." [ICG Magazine interview: Exposure: Douglass Trumbull 4.4.12]
Trumbull discusses his role in the history of filmmaking in another great interview, by Wolfram Hannemann in May at the FMX Conference in Stuttgart, Germany [in70mm.com 5.10.12].

At the Siggraph 2012 conference in Los Angeles this year, 3D Systems showcased the Cube, the first at-home 3D printer. Attendees could print their own 3D files into physical objects (ABS plastic) on working machines at a hands-on demo booth.
Siggraph Attendees playing with Cube printers

The new stop-motion feature "ParaNorman" uses full-color puppets printed on a ZPrinter.


ParaNorman 3d printed puppet faces
Read an in-depth article by Brian Heater with great photos: How 3D Printing Changed the Face of 'ParaNorman'



Given that we already have 3D laser scanners in use and can create accurate digital files to replicate and/or design new objects with relative ease, the applications for 3D printing truly are going to change the world.


on Tuesday, September 18, 2012
I am intrigued by the ways designers are thinking about art and architecture as our means of graphic interface become increasingly sophisticated. Other forms of interface are becoming invisible (especially with regard to "smart buildings"). An interesting and related post I came across online today: What Comes After the Touchscreen?



5D Institute has curated a new series of discussions that will delve into this topic later this week. All of the speakers are excellent, and Kevin Slavin, Greg Lynn and Peter Frankfurt are among my favorites.

 New City - design Greg Lynn, Peter Frankfurt, Alex McDowell; image courtesy Greg Lynn/ FORM
 When the city and the book become both virtual and interactive, and contain and fuel multiple scenarios which evolve and coexist within synthetic worlds, what new stories can we tell?
In association with USC School of Cinematic Arts, the 5D Institute invites you to join our diverse and interdisciplinary network of writers, architects, engineers and artists in a multi-panel provocative and disruptive discussion of the possibilities of dynamic environments in digital publishing, virtual architecture, and interactive media, and the role of world building in the future of storytelling.

Some of the mind-bending questions to be addressed during 2 sessions this week:

On hybrid spaces-
As information is liberated from concrete and paper how does new data transform the material from which it came?

On architect as storyteller-
Does the multi-authored narrative of the city inform our view into the future of a new kind of storytelling experience?

On new dimensions of story architecture-
Can the novelist's command of world-building be challenged and enriched by a new dimension that enables multi-authored pathways?

On creating story spaces-
In a flow towards the Virtual City and Interactive Book, what are the differences between them? How can we integrate the newly built worlds of the City and the Book; and who now are the authors of these spaces?

on Sunday, July 29, 2012




Digital Storytelling Seminar, now in its 7th year, took place in May at the Tancred Theater at Filmens Hus in Oslo, Norway. As an active member of 5D Institute and 5D Conference, I was delighted to be able to accept an invitation to attend this partner event.

Digital Storytelling Seminar (DS) is the brainchild of 5D Conference founders Kim Baumann Larsen, Creative Director and partner at Placebo Effects, an Oslo-based VFX company, and Eric Hanson, Principal, xRez Studio and Associate Professor, USC School of Cinematic Arts. In 2005 both realized they missed attending the defunct 3D Festival in Copenhagen, an annual event said to be "Europe's leading meeting place for professionals working in digital design, filmmaking, game development and architectural design." They sought out Angela Amoroso, DS co-founder and co-director, at the Norwegian Film Institute and launched the first DS seminar and workshops in 2006, providing a new platform to discuss advances in computer-generated imagery (CGI) in the service of visual storytelling.

DS connects and enriches digital artists and filmmakers in the Nordic community and beyond.
Each year new technologies and a wide range of projects are showcased by participants. Themes have varied, but the objective remains the same: to explore relevant and cutting-edge topics that educate and inspire, leading to new levels of filmmaking excellence. This year was no exception. It is interesting to note that many of the small participating companies are formed of CGI generalists, self-dubbed "Potatoes," that produce a wide array of independent projects and work co-operatively on larger ones, such as the soon-to-be-released Kon-Tiki, the newest Norwegian adventure film, based on explorer and writer Thor Heyerdahl and his 1947 rafting expedition across the Pacific Ocean from South America to the Polynesian Islands. The film was made mostly with Norwegian VFX talent and had a record-breaking budget. Swedish companies Fido and Very Important Pirates did a fair share of the VFX work as well.


I met Dadi Einarsson, co-founder and Creative Director of Framestore Iceland, who sought out DS to become involved and to widen the Nordic filmmaking community to include Iceland (before then, DS was primarily Scandinavian). He spoke passionately about his desire to bring more VFX work back home to Iceland. Framestore Iceland has doubled in size within three years of operation. Globalization within the VFX industry enabled him to direct a commercial in Iceland for an agency in London with a client in China. They never met in person, yet Dadi delivered the spot in four weeks, with a previs animatic the first week and the rest following, all online.

(Latest news is that Icelandic director and actor Baltasar Kormakur has bought the Icelandic Framestore branch in partnership with Dadi Einarsson.)

Stand-out key presenters included Alpo Oksaharju and Mikko Kallinen from Finland. They are the two-man development team of Theory Interactive, a new indie game company, who combined forces to produce Reset, a single-player, first-person puzzle game built around time travel. They have developed their own real-time game engine, PRAXIS, and in the process created a cinematic teaser trailer to test their work.


The lighting and texture are beautifully rendered, giving the trailer extraordinary depth and emotion, and this has set core gamers on their ear. It has also triggered buy-out offers from larger companies, which Alpo and Mikko have refused. They have pursued a creative dream on their own terms and prefer to keep it that way. There is no set release date for the much-anticipated game.

Participants in the "VFX Omelet" (short showcase segment) included Norwegian "regulars": Storm Studios, Gimpville, Qvisten Animation, Stripe, Electric Putty, Netron, Placebo Effects and newcomer Framestore Iceland Studio.

The list of high-quality and eclectic presentations:

Contraband (feature film) show and tell -Supporting visual effects from Iceland -
Dadi Einarsson, Creative Director/VFX Supervisor, Framestore Iceland

Creating Digital Water: The process of creating simulated water and oceans and the problems to be aware of -their work on Kon Tiki - Magnus Petterson, Lead Effects TD, Storm Studios

Stereoscopic Set Extension: Behind the scenes of the thrilling mine-cart sequence in the feature film Blåfjell 2. - Lars Erik Hansen, VFX Supervisor/Producer, Gimpville

Twigson in Trouble: a brief talk about the VFX that Qvisten Animation did on Twigson 3-
Martin Skarbø, VFX Supervisor and Morten Øverlie, Animation Supervisor, Qvisten

Hugo’s There – The Unbelievable Truth: Adding a little character with e>motion. - Peter Spence, VFX Artist, Electric Putty


Spence made reference in his presentation to the [5D] production mandala, originally conceived by Alex McDowell, 5D Founder and Creative Director, to illustrate modern filmmaking: a circular chart that shows overlapping input from all departments, with design at the core. In Spence’s version, as creator and sole executor of Hugo, he himself appears on the mandala at beginning, middle and end. He does it all, another example of creative vision pursued on the artist’s own terms.
MODERN FILMMAKING PROCESS

Fredriksten Fortress – 350 years in eleven minutes: The history of Fredriksten Fortress, an outdoor projection on the walls of the inner fortress -Jørgen Aanonsen, Managing Director, and Torgeir Holm, Creative Director for 3D and VFX, Netron

The Cows Are On Your Table: the challenge of augmenting more than 100,000 breakfast tables with a digital story - Kim Baumann Larsen, Creative Director and Partner, Placebo Effects

This last VFX Omelet presentation, of an augmented reality app, was among the most novel. When an iPad is aimed at a QR code on a milk carton, a scene comes to life on the screen: two small cow characters appear from behind the carton and graze on the table, talking to each other and interacting with the viewer. One can see the potential well beyond the breakfast table for this idea. Kim will be among the presenters at Siggraph next month.
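The post doesn't describe how the Placebo Effects app was implemented, but a core step in any marker-based AR of this kind is estimating the homography that maps a flat overlay onto the four detected corners of the marker (here, the QR code), so the virtual cows stay anchored to the carton as the iPad moves. A minimal sketch of that geometry, my own illustration in Python with NumPy:

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 matrix H mapping four 2D src points to four
    dst points, using the standard direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The null vector of A (last row of Vt) holds the 9 entries of H.
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pt):
    """Apply H to a 2D point in homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

In a real app, a library such as OpenCV would supply the QR-corner detection and the image warp; the solve above is only the four-point mapping at the heart of it.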

Keynote speaker Sebastian Sylwan, CTO of Weta Digital and also a 5D founder, in his address entitled “The Evolution of Storytelling Through the Convergence of Art and Science,” spoke about the history of the role of art and science in service of story, leading off with Leonardo da Vinci’s painting of The Last Supper and citing how artists have always used technology, in this case perspective, to enhance the telling of the visual story.

Jumping forward, in an interview for a local Norwegian online magazine Sylwan spoke about the non-linearity technology has afforded filmmakers today: “Traditionally, film production has evolved over the last 120 years to include specific tasks. Because film was expensive, you needed to plan very well what needed to be done when you were in front of the camera… Nowadays, the visual effects part of that has grown a lot, compared to where it was even 10 years ago. That growth has been accompanied by a lot of technological advancements, and those technological advancements have enabled some of the creative parameters to be extended, experimented with, brought forward, involving impulses from various areas. In short, I think we are talking about a type of virtual production now. There isn’t really a checklist in my mind or a definition of what virtual production is or isn’t, but I think it’s basically about using virtual tools in order to make films. So it’s really production with the best tools available, and those tools are able to enable communication between creative figures in the film making process that would normally not talk to each other.”

Sylwan also presented a Weta show-reel and took numerous questions about their past projects including Rise of the Planet of the Apes, which highlighted Weta’s advances in facial performance capture, particularly the eyes.

The final key presenter was Swedish VFX studio Fido, a company that specializes in creatures. Originally four companies (four people) merged to form Fido in 2005; the founding staff remains, joined by 18 artists, with 45 workstations available for peak times. Fido was recently acquired by film production company Forestlight Group, making it a sister company to Noble Entertainment, one of the largest distributors in Scandinavia. Fido balances longer-term feature work with shorter-term commercials and uses the same teams for both; there are no department splits. Fido has a very well-developed fur and feather pipeline, as well as a system for creating realistic ocean surfaces.

Claes Dietmann, Producer at Fido, presented a show reel featuring Fido’s impressive VFX creature work on three productions: the German feature Yoko, Underworld Awakening, and Kon Tiki.

Yoko is a children’s story of a young yeti (Himalayan snowman) with magical powers.


Fido designed the creature from start to finish and built an animatronic character (a costume with an animatronic head) for many of the shots, as well as all the matching CG character shots and props.

Staffan Linder presented the werewolf transformation Fido created for Underworld Awakening, a project they started, then lost to Canada, and were later awarded again for the transformation animation sequence. They created a horrific transformation whose success lay in being unlike any werewolf transformation seen before.

Mattias Lindahl presented Fido’s creation of the larger-than-life whale shark that attacks the raft in Kon Tiki. They spent a year on the film, which will have a U.S. release without subtitles: each scene was shot twice, once in Norwegian and again in English.

Digital Storytelling and Grafill have curated an exhibit of Kon Tiki concept art that will open August 29, 2012 at Grafill gallery in Oslo. http://www.grafill.no/nyhet/storm-studios-fra-storyboard-til-kon-tiki-bilder


The incredible growth in Nordic VFX production in recent years, following the success of films such as Max Manus (made entirely with Norwegian VFX talent), Troll Hunter, and Kon Tiki, makes Digital Storytelling Seminar not only interesting and informative, but also a lot of FUN. My thanks to my hostess Angela Amoroso; without her hospitality my attendance would not have been possible.











on Monday, April 23, 2012


5D|Flux, a new series of forums that encompass design for film, animation, gaming, theater, interactive media and more, was introduced to the U.S. in March 2012 by the 5D Institute, the USC School of Cinematic Arts and Autodesk. The initial event in the series, curated by Juan DiazB, took place in February in Berlin at the Berlin Film Festival.
The program, entitled Digital Design & World Building in the Narrative Media Landscape, took place over three consecutive evenings at the Ray Stark Theater on the University of Southern California campus, with a near-capacity crowd each night (http://5dconference.com/articles/824).

Curators Alex McDowell, Francois Audouy and Peggy Weil assembled a diverse group of professional luminaries who provided fresh insight into current technology in relation to production design practice and workflow for hybrid filmmaking and animation. Presentations and panels explored three phases of the design process: Inception, Prototyping, and Manufacturing & Capture. Interesting case studies included the making of Upside Down by Alex McDowell, How to Train Your Dragon by Patrick Hanenberger, and Tintin in virtual production by Andrew Jones and Jeff Wisniewski. Also of note were excellent presentations by Autodesk Fellow Tom Wujec on the future of design and an overview of the history of virtual production by David Morin.

Key presentations can be viewed online at http://vimeo.com/channels/5dconference

   Photo by Dave Blass

The panel discussions that followed were separate from the case studies. The audience, a good mix of students, educators and industry professionals, engaged the panelists with many more provocative questions than could be answered in the time allotted. Audience interaction via a live ‘backchannel’ website, accessed through personal mobile devices, was a particular success. The questions, projected on a full-stage screen behind the panelists, were addressed in priority order as ranked by the participants.


James Iliff, a USC student in Interactive Entertainment, posted about the event in his blog.

     Photo by Dave Blass

A wine-and-cheese reception sponsored by 5thKind, held in the adjoining lobby each evening, provided ample opportunity for networking; it was a big hit with everyone, judging from the number who stayed afterward.

Future 5D|Flux design summits are being planned for this year. Director Alex McDowell explains, ”Each will focus on world building as the main track, but not be limited to filmmaking. The arcing goal for the 5D|Flux series is to ensure that those who attend all of these accumulate applicable knowledge.”

Photo by Dave Blass
on Thursday, March 1, 2012
UPDATE: The REAL STEEL article appears with photos in the latest issue of Perspective Magazine.

REAL STEEL Process + Design: a case study of working in virtual production 
by Judy Cosgrove
“…[Virtual Production] is a radical departure from the way we think about film production and the way that we will think about narrative. It is the beginning of a new conceptual and visual language."
                            Jeff Wisniewski, Virtual Art Director, Real Steel, Tintin, Avatar

I wrote a piece earlier about the success of digital previs as the new hub of efficient production planning, citing Real Steel and how, according to the producers, previs and virtual production kept the film within the parameters of its studio budget and timeframe.


Real Steel seamlessly blends the real and the virtual, and is the first film of its kind to be made in virtual production both in studio and on location.


PROCESS

Virtual production is the process of shooting a movie with real-time computer graphics in a collaborative and interactive way. The CGI can be (but is not limited to) the actor’s mo-cap performances composited in camera with a CGI environment or real-world environment and vice versa. This process is used extensively in the creation of modern video games, and was further developed for filmmaking by James Cameron on Avatar.

In Real Steel, pre-produced motion-captured virtual performances of digital robot characters were integrated (through in-camera playback) with real actors performing live action, in real time, in the real world. Virtual production as a non-linear and iterative process proved effective, beginning with virtual design and set elements (digital assets) created within the Art Department.

The Production Designer

Led by Production Designer Tom Meyer, the Art Department created not only the environments but also the hero robot characters for Real Steel; this is unique, given that animated characters are usually designed by animators, VFX artists, or sculptors in “creature shops.” DreamWorks wanted to keep the robot designs tied closely to the overall concept of the film and was looking for a candidate to lead both the environment and character design.


Pre-Production

Meyer started preproduction on September 22, 2009, heading up the team of concept artists, and began collaboration on the hero robots, taking the lead in designing the robot character Atom himself. Initial concept artwork was illustrated in 2D and then modeled in 3D, or modeled directly in 3D from the start. Meyer prefers to finalize designs in 3D rather than rely solely on traditional 2D illustration, since 3D assets can be used by VFX vendors without translation. Legacy Effects, the company that created animatronic versions of the robots, worked directly from these digital files, as did Digital Domain for the CGI versions.

Designing in 3D, with the aid of computer animation, allowed Meyer to develop the robot characters more fully through the exploration of their movement. He could show how hero characters Atom and Zeus might interact with each other under realistic lighting conditions and suggest camera angles. Texture, mood and attitude could be conveyed more concretely than in 2D, and by the first week of October he was quickly sharing these ideas with the director and the production team.

These same concepts applied to environments for virtual production. Working in a virtual paradigm first created in the making of Avatar, it was clear that digitally designed set “assets” were needed prior to the mo-cap shoot. Executive Producer Josh McLaglen had asked Meyer for the fight containment (boxing ring) to be designed by the end of January. Meyer agreed, and realized they would also need to see whatever was behind them in the virtual camera, so in fact he had all of the sets needed for virtual filming completed by the end of February.

The robot designs were well under way by the second week of October, when Meyer started scouting Detroit. The set designers began to work from LA off architectural plans with minimal dimensions and reference photographs. The set designs were developed virtually in 3D, sometimes independent of their Detroit environs, which were to be determined later; arena architecture, for example, was comped and textured in post.

Having an accurate digital model of the design for the location allowed time for creative experimentation with shots ahead of time, creating a template for the shooting location. One example is a field scouted outside of Detroit. Meyer had a virtual model of the fairgrounds design created for that field (it could have been any field) in Rhino, and animated the opening shot of the truck’s approach to the fairgrounds in Maya within the Art Department. All other 3D set designs created in the Art Department were looked at in previs in conjunction with VFX Supervisor Erik Nash (Digital Domain) to figure out what really needed to be built and what would be created as digital set extensions. The Art Department provided cross-platform .OBJ files for Previs Supervisor Casey Schatz (Giant) to break down for previs, and to work with Meyer exploring the design in the location virtually with time of day, light and color.
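Part of why .OBJ works so well as a cross-platform handoff format is that it is plain text, and nearly any tool can read it. As an illustration (my own sketch in Python, not anything from the Real Steel pipeline), a minimal reader that pulls just the vertex and face data out of an OBJ file fits in a dozen lines:

```python
def read_obj(path):
    """Minimal Wavefront OBJ reader: vertices and faces only
    (ignores normals, UVs, materials, and negative indices)."""
    verts, faces = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                verts.append(tuple(float(c) for c in parts[1:4]))
            elif parts[0] == "f":
                # Face entries look like "v", "v/vt" or "v/vt/vn";
                # OBJ indices are 1-based, so convert to 0-based.
                faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return verts, faces
```

A production pipeline would of course rely on the importers built into Maya, Rhino, or MotionBuilder rather than hand-rolled parsing, but the simplicity above is the point of the format.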

(It is interesting to note that the previs team from Giant Studios did their own 3D lidar (laser) scans to generate accurate models of the environments (built and existing) once on location, to aid VFX and post production as usual. Why not give these to the art department up front? Industry standards for virtual production are being addressed; this is discussed further in my concluding remarks.)


Supervising Art Director Seth Reed explains: “There were no Art Department scouts to the locations in Detroit, no site surveys prepared until we got there as an entire art department at the end of March. A certain amount of work was done while in LA, but it was only work that we thought we could use in open spaces where we were building from scratch.   Even then, making assumptions about what could be done was tricky.”

“As Supervising Art Director” says Reed, “I still did the same thing I always do, in managing the Art Department - drawings, models, illustrations, budget, schedule, coordination with the producers, graphics, construction, but it also includes translating into the physical world, the work of the Virtual Art Department’s mo-cap work. It's great to design virtually, to model and to previs.  Often, these things still need to be built though.  At that point, we are back to practical matters - how would this really stand up or how would this really be built?  On this shoot there was a tremendous overlap with mo-cap and the virtual art departments 3D modeled designs. There was a whole new area to figure out - what had already been shot?  What were our limitations when designing the physically built set that would actually be shot live? We had to find locations that fit and build practical sets that were interactive with virtual characters.  When a robot was hit and reached back to touch a rope or a rock or another object, this object had been shot already as a piece of foam - we had to supply the object touched, in the exact position and placement already shot, per dimensions and scale adjustments provided by VFX.  The in-camera work that you refer to is only a partial.  Much of our work had already taken place by the time the company got there to shoot.”

(Reed has some recommendations for ‘best practices’ that I’ve included at the end of this article.)

Jeff Wisniewski, Art Director for the Virtual Art Department and Mo-cap, worked within his department to “strip down” high-poly models created in the Art Department and hand them off to Giant Studios, the company doing the mo-cap shoot. ‘Stripping down’ refers to reducing the poly count (the number of facets) so a model can function in MotionBuilder, the primary software tool used for real-time rendering in virtual cinematography. Wisniewski states, “I always have one person on my team designated to stripping models for that reason.”
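To give a feel for what "stripping down" a model involves, here is a minimal sketch of vertex clustering, one of the simplest polygon-reduction techniques: vertices are snapped to a coarse grid, duplicates are merged, and triangles that collapse are dropped. This is purely illustrative; the production pipeline used Maya and MotionBuilder, not hand-rolled code, and the mesh format and cell size here are invented for the example.

```python
def strip_down(vertices, triangles, cell=1.0):
    """Reduce poly count by snapping vertices to a grid of size `cell`
    and merging vertices that land in the same grid cell."""
    cluster_of = {}   # grid cell -> index of the merged vertex
    new_vertices = []
    remap = []        # old vertex index -> new vertex index
    for x, y, z in vertices:
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in cluster_of:
            cluster_of[key] = len(new_vertices)
            new_vertices.append((key[0] * cell, key[1] * cell, key[2] * cell))
        remap.append(cluster_of[key])
    # Keep only triangles whose three corners are still distinct after merging.
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if a != b and b != c and a != c:
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles
```

A coarser `cell` merges more aggressively, trading detail for the real-time frame rates MotionBuilder needs.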


Previsualization

“What used to be a very obtuse experience is now very direct and realistic in terms of accountability and budgeting,” says Meyer. “All departments collaborate and decide on what is necessary and affordable, and the director gets what he sees in the concept phase when he arrives on location. You are not selling an abstract idea. In addition, from a budget standpoint they spent a fraction of what they might have spent in the past, before having this technology. This tool effectively collapsed a difficult design process, turning it into something that could be easily understood.”

Previs Supervisor Casey Schatz emphasizes the need in previs to respect the physics and limits of the live-action shooting environment. He acts as a “reality cop,” preventing the filmmaker from being seduced by possibilities the computer can create but that do not obey the live-action world. On Real Steel, Schatz programmed safeguard markers into the virtual camera for that purpose: indicators would flash on the monitor whenever a move made virtually would not be possible on site at the location.
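The idea behind those safeguard markers can be sketched as a simple envelope check: flag every frame where the virtual camera leaves the space a real rig could reach. The limits and path format below are hypothetical, invented for illustration; Schatz's actual implementation is not described in that detail.

```python
def flag_impossible_moves(camera_path, max_height_ft=30.0, max_reach_ft=50.0):
    """Return the frame indices where a virtual camera position falls
    outside the envelope a practical crane could cover (height, plus
    horizontal reach measured from the crane base at the origin)."""
    warnings = []
    for frame, (x, y, z) in enumerate(camera_path):
        reach = (x ** 2 + y ** 2) ** 0.5   # horizontal distance from the base
        if z > max_height_ft or reach > max_reach_ft:
            warnings.append(frame)          # would trigger the on-monitor flash
    return warnings
```

Run per frame during a virtual camera take, a check like this turns "would this move be possible on location?" from a judgment call into an immediate visual warning.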

Schatz textured and lit set design models under the direction of the Art Department. He combined models of the virtual sets with Giant’s models of the locations generated from their surveys and lidar scans along with his own virtual proxies of the camera equipment being used. In this way, Schatz could be certain of real-world parameters. His goal was to give the director an experience as close as possible to being on location in Detroit.

For the real-world build in the Metal Valley sequence, previs was used to explore how the same cliff (built set) could be shot to make Meyer’s cliff design appear twice as high: one set-up framed the set as the top of the cliff, and the reverse angle served as the bottom. In addition, this particular set-up on location required a number of large cranes for several different departments. Schatz was able to troubleshoot the crane placement in the limited space virtually.

Previs proved to be an invaluable asset again when the location changed at the last minute for the Zoo sequence. The in-camera fight work had been done in advance, so it could be adapted anywhere, but at the new location Schatz noticed obtrusive shadows cast by existing structures. He was able to reproduce the location virtually and, with additional programming, correctly project the sun for the time of day, latitude, and longitude, producing an accurate light study that predicted the fall of the shadows in advance.
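The core of such a light study is computing the sun's position from date, time, and geographic coordinates. The sketch below uses the textbook declination and hour-angle approximation to get solar elevation; it is a simplified stand-in, not the production code Schatz used, and it omits azimuth and atmospheric refraction.

```python
import math

def sun_elevation_deg(day_of_year, solar_hour, latitude_deg):
    """Approximate solar elevation in degrees for a given day of the year,
    local solar hour (12.0 = solar noon), and latitude.
    Negative values mean the sun is below the horizon."""
    # Approximate solar declination for this day of the year.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: the sun moves 15 degrees per hour away from solar noon.
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = (math.radians(v) for v in (latitude_deg, decl, hour_angle))
    sin_el = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(sin_el))
```

Given the elevation angle, a structure of height H casts a ground shadow roughly H / tan(elevation) long in the anti-sun direction, which is exactly the kind of prediction a shadow study needs before the company commits to a set-up.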

Motion-Capture Shoot

Art Director Jeff Wisniewski supervised the six-week motion-capture portion of Real Steel that began in early February 2010 on stage at Giant Studios. He was already an expert in the process, having been Art Director on Avatar and Tintin, two productions using motion-capture, virtual camera, and simulcam. This mo-cap shoot, now part of virtual production, was not for previs; it produced real footage that would be played back in simulcam during the live-action shoot on location.

Wisniewski oversaw mapping the pre-designed virtual environment to the stage and providing full-scale proxy set pieces for the actors to interact with as needed. This included creating construction drawings for any props and set pieces that had motion (such as a gate) or that an actor otherwise interacted with. (They built a mock-up for the animated bull that they rolled around as part of the action for the bullfight in the first part of the film. The motion capture from this mock-up gave VFX enough information to create an incredibly realistic finished product.)

During this part of the filmmaking process the actors performed choreographed fight sequences in the full-scale proxy environment. Mo-cap cameras rigged around the volume space tracked and recorded their movement. This information was fed into a computer and the actors’ performances were targeted onto the pre-loaded digital robot characters. Using the virtual camera tool, the director shot video-game-resolution sequences of robots fighting in real time within the digital environments generated by the art department. Motion-capture for the robot fight sequences was completed by March 14, and Wisniewski went on to prep the location.

Once in Detroit, Wisniewski oversaw the construction of the main boxing rings at Cobo. He noted how, by working virtually, they were able to solve the scale-offset problem created by the 8’ robots, but the real challenge was in physically constructing them on location. “The boxing ring at Giant Studios was scaled for human actors. The boxing rings in Detroit, where they shot the giant robots using simulcam, needed to be scaled up 125% in the real world (a 6’ tall person on stage had to be an 8’ tall robot at the practical location), which is huge.” He collaborated extensively with each department to bring both the digital and physical sets in on budget. The digital files also enabled Wisniewski to contract with local car-parts manufacturers using CNC to fabricate the over-scale parts needed to build the ring.


DESIGN
The Story World

Real Steel is based on “Steel”, a short story by Richard Matheson published in 1956 and later made into an episode of the Twilight Zone in 1963. Screenwriter John Gatins adapted the original story and shifted the script to focus on the relationship between the father and son in more of an “Americana” setting. The story takes place in the near future, 2020, and revolves around a down-on-his-luck ex-boxer and his young son struggling to make their way in the new world of robot boxing as fighting-’bot owners.

The world of the future in Real Steel closely resembles the world of today, with somewhat more advanced technology such as cell phones and computers. Meyer worked closely with DreamWorks product-placement partners HP, NOKIA, and BEATS to develop and design new “concept” products that hewed to the ideals the companies were developing for their own future lines. Cadillac provided a concept car driven by Charlie Kenton.

Detroit, a classic American city grounded in manufacturing, was an ideal location for the movie according to Meyer who was interviewed by Brian Gallagher (MovieWeb.com) while on location:

Can you talk about filming here in Detroit and how that has influenced your design?

Tom Meyer:  “Michigan in general has a lot of great things…We filmed in the original Model-T factory in Highland Park, which we re-purposed as an industrial opera house...There are also beautiful rolling hillsides …and these classic small towns, with the courthouse in the center. You get that whole mesh of Americana, this cross-section of history and technology.”

Can you talk about your inspirations for the WRB?

Tom Meyer: “The physicality of boxing is about energy and energy displacement, at its core. It's two guys beating each other up, but when you think about what the ring is designed to do, you have a sprung floor that bounces with the boxers, you have ropes that absorb, it's all about energy absorption. …with huge steel cables with lots of spring and tension and all this cable coming off is just an extension of the corner post. It's kind of like a turnbuckle on steroids. It's to help see that energy, so [the robots] feel somewhat humanized… Every single nut and bolt was custom fabricated. 80% of our crew is from the industrial base of Detroit, the unions, steelworkers, and carpenters. It's a manufacturing town.”


The Robot Characters

The main challenge in designing the robot characters, according to Meyer, was in avoiding all established franchises. The robots needed to have a utilitarian look and appear able to function in reality, complete with gears and pistons, while also retaining an emotional, humanistic quality. The decision was made early on to have Legacy Effects create full-scale practical animatronic versions of the hero robots Atom, Ambush, Noisy Boy, and Axelrod (robots that Kenton owns and/or works on), in addition to the digital versions created by Digital Domain.

Concept artists within the Art Department worked in close collaboration with Meyer in giving each robot a distinct personality. Tim Flattery, the chair of Art Center’s Entertainment Design program, designed Spitfire, Albino, Axelrod, Twin Cities and Midas. “These aren’t your typical robots,” said Flattery of the characters he and the other concept artists designed (including fellow Art Center faculty member Daren Dochterman, along with Simon Murton, Andrew Leung and Victor Martinez). “They’re all very stylized and ridiculous, yet somehow, in the world of the movie they make sense.”

Digital Domain worked with the Art Department and Legacy to finalize robot designs and mechanics that could be applied to practical robots and CG models alike. “The practical Legacy ‘bots were invaluable for lighting and texture data,” according to VFX Supervisor Erik Nash, “as they provided a tangible point of [lighting] reference for digital characters that needed to be indistinguishable from the real thing.” Digital Domain used the Art Department’s 3D digital assets directly to model, texture, and rig eight unique hero robots for the fight sequences - Ambush, Noisy Boy, Midas, Atom, Metro, Blacktop, Twin Cities, and Zeus - in addition to numerous background robots that appear throughout the film. (Artofvfx.com)

Legacy Effects built three practical robots - Ambush, Noisy Boy and Atom, plus a partially destroyed Axelrod - that were used extensively throughout production for shots requiring human contact and upper-body animation. They started by printing 3D prototypes at 1/5 scale from the Maya files that originated in the Art Department. They exploded the files and broke them down into all the parts to be assembled; in this way they were able to fabricate multiples of separate parts.


CONCLUSION

Real Steel was a perfect situation for virtual production, but this film could not have been made without the technical prowess of every department involved, and I would be remiss not to mention Virtual Production Supervisor Glenn Derry, who developed the virtual camera and simulcam systems originally for Avatar. His method of tracking virtual camera data to feed Avid editorial is a tremendous aid to effective collaboration: it generated useful information for the previs team and helped the Assistant Directors plan their days by enabling them to sort shots and group like shots together. Derry’s mobile “live action engineering” truck allowed the production to edit sequences in Avid on site and send them back to Digital Domain, who could turn around final-render-quality shots within a day during principal photography.

Virtual production allows for real-time creative collaboration between all departments during production. For Virtual Production to be effective, it requires early commitment to design and directorial decisions. When you are in Virtual Production, you are not previsualizing; you are making the movie.

Wisniewski: “Because the technology works so well, I am not sure we understand the accomplishments made on Real Steel. This film is a hybrid form that will influence virtual filmmaking. It is a platform to blend the future, the past, the present and the imagination seamlessly; it is hyperviz.”


THE LEARNING CURVE

Virtual Production is a “front-loaded process” that moves away from the traditional mindset of “figuring it out in post,” where many problems are solved after the fact. Design and previs are key to early collaboration for all departments, including cinematography.

Real Steel Producer Steve Molen (DreamWorks) spoke from the audience at a Real Steel panel presentation I attended for the Virtual Production Committee*: “There are no lines between preproduction and post; it’s all one process.” He urged that previs/VFX companies “sweep proprietary software aside” and use “tools involved on the virtual side that are ingestible and compatible throughout the process. It would be great if everyone used the same bridging tools. Too many VFX and previs companies have proprietary software that they end up using as shadow programs… We should be pushing the [virtual production] process forward in the same direction.”

*The Joint Technology Subcommittee on Virtual Production (VPC), a round-table of industry professionals co-chaired by David Morin and John Scheele, is currently engaged in steering the industry toward ‘best practices.’

Supervising Art Director Seth Reed looks back on his experience on Real Steel and offers some suggestions for Art Departments working in Virtual Production going forward:

1. I would love to learn more about lidar, especially the latest developments. The information that we had on RS was useful but was not accurate enough to use for more than conceptual work.

2.  Though many parts of the virtual movie were shot prior to arrival in Detroit, there was still much remaining and we were in the curious position of finding locations and designing sets for sequences that had already been shot.

3. Having the virtual Art Director, Jeff, with us on location was essential. He participated in all that had occurred on stage, plus he had fantastic communication with the VFX people. He was the lynchpin between Art and VFX.

4. We are still looking for a more streamlined workflow between Maya (or Rhino, etc.) and MotionBuilder. Everything must be converted, which can mean drawing things twice, especially if the Maya models become very complex.

5. We learned a lot on this; it was a process. Both worlds, the practical and the virtual, affected each other. We made a virtual reality at Giant; we then had a physical reality on location that the virtual robot actually interacts with. We had to build that reality! For example, we captured the Zoo sequence on stage, in a ring, with ropes. Then we found a great location, an abandoned zoo. We re-contexted our ring footage to the practical location so that the robots are fighting around rocks and fallen trees, and the robot falls against a rock instead of a rope! This was a great blend of the VFX and Art Departments working together.

Art Department Credits
Tom Meyer – production designer
Seth Reed – supervising art director
Jeff Wisniewski – mo-cap art director / locations
Jason Baldwin-Stewart – art director
Tino Schaedler – art director
Bradley Rubin – assistant art director

Trevor Goring – storyboard artist
Joe Venti – storyboard artist
Michael Anthony Jackson – storyboard artist

Daren Dochterman – concept artist
Tim Flattery – concept artist
Andrew H. Leung – concept artist
Victor J. Martinez – concept artist
Simon Murton – concept illustrator
Jamie Raima – concept illustrator
Miles Teves – concept artist

Michael C. Biddle – set designer
Tim Croshaw – set designer
Tex Kadonaga – set designer
David Moreau – set designer
Anne Porter – set designer
Theodore Sharps – set designer
Mike Stassi – set designer

Andrea Carter – art dept. coordinator
Jourdan Henderson – art dept. coordinator
Jennifer Bash – art dept. assistant
Rick Radomski – art dept. assistant
Lisa Fiorito – art dept. researcher
Ceri Glowacki – art dept. production assistant

William Eliscu – lead graphic designer
Megan Greydanus – graphic designer
Mike Maher – graphic designer